# hyperopt

Distributed Asynchronous Hyperparameter Optimization in Python
Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.
Install hyperopt from PyPI:

```bash
pip install hyperopt
```

Then run your first example:
```python
# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a', [
    ('case 1', 1 + hp.lognormal('c1', 0, 1)),
    ('case 2', hp.uniform('c2', -10, 10))
])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
```
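The objective function in the example is ordinary Python, so it can be sanity-checked without running `fmin`; the hand-written tuples below mimic the `('case', value)` pairs that `hp.choice` samples:

```python
# the same objective function as in the example above
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# hyperopt passes one sampled point from the search space per call;
# here we supply points by hand instead of sampling them
assert objective(('case 1', 3.0)) == 3.0   # 'case 1' returns val unchanged
assert objective(('case 2', 3.0)) == 9.0   # 'case 2' returns val squared
```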
If you're a developer and wish to contribute, please follow these steps.
Setup (based on this)
Create an account on GitHub if you do not already have one.
Fork the project repository: click on the ‘Fork’ button near the top of the page. This creates a copy of the code under your GitHub user account. For more details on how to fork a repository, see this guide.
Clone your fork of the hyperopt repo from your GitHub account to your local disk:
```bash
git clone https://github.com/<github username>/hyperopt.git
cd hyperopt
```
Create an environment with:

```bash
python3 -m venv my_env
```

or

```bash
python -m venv my_env
```

or with conda:

```bash
conda create -n my_env python=3
```

Activate the environment:

```bash
source my_env/bin/activate
```

or with conda:

```bash
conda activate my_env
```

Install dependencies for extras (you'll need these to run pytest). On Linux/UNIX:

```bash
pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'
```

or on Windows:

```bash
pip install -e .[MongoTrials]
pip install -e .[SparkTrials]
pip install -e .[ATPE]
pip install -e .[dev]
```
Add the upstream remote. This saves a reference to the main hyperopt repository, which you can use to keep your repository synchronized with the latest changes:
```bash
git remote add upstream https://github.com/hyperopt/hyperopt.git
```

You should now have a working installation of hyperopt, and your git repository properly configured. The next steps describe the process of modifying code and submitting a PR:
Synchronize your master branch with the upstream master branch:
```bash
git checkout master
git pull upstream master
```
Create a feature branch to hold your development changes:
```bash
git checkout -b my_feature
```

and start making changes. Always use a feature branch. It’s good practice to never work on the master branch!
We recommend using Black to format your code before submitting a PR; it is installed automatically in step 6.
Then, when you commit, ensure that git hooks are activated (PyCharm, for example, has an option to omit them). This can be done using pre-commit, which is installed automatically in step 6, as follows:
```bash
pre-commit install
```
This will run black automatically on all modified files when you commit, failing the commit if any files still require formatting. In case black does not run, execute it manually:
```bash
black {source_file_or_directory}
```

Develop the feature on your feature branch on your computer, using Git to do the version control. When you’re done editing, add the changed files using git add and then git commit:
```bash
git add modified_files
git commit -m "my first hyperopt commit"
```

The tests for this project use PyTest and can be run by calling:

```bash
pytest
```

Record your changes in Git, then push the changes to your GitHub account with:

```bash
git push -u origin my_feature
```
Note that dev dependencies require python 3.6+.
Currently three algorithms are implemented in hyperopt:

- Random Search
- Tree of Parzen Estimators (TPE)
- Adaptive TPE
Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.
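Each of hyperopt's algorithms is a "suggest" function that proposes new points given the history of trials. Random Search, the simplest of them, can be sketched in plain Python; this is a toy illustration with assumed names (`random_search`, `sample_space`), not hyperopt's actual implementation:

```python
import random

def random_search(objective, sample_space, max_evals=100, seed=0):
    """Toy random search: sample points independently, keep the best."""
    rng = random.Random(seed)
    best_point, best_loss = None, float('inf')
    for _ in range(max_evals):
        point = sample_space(rng)      # draw one candidate configuration
        loss = objective(point)        # evaluate it
        if loss < best_loss:           # remember the best seen so far
            best_point, best_loss = point, loss
    return best_point, best_loss

# minimize x**2 over x sampled uniformly from [-10, 10]
point, loss = random_search(lambda x: x ** 2,
                            lambda rng: rng.uniform(-10, 10))
```

TPE improves on this by modeling which regions of the space have produced good losses and sampling more points from them.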
All algorithms can be parallelized in two ways, using:
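In practice these are the SparkTrials and MongoTrials backends (the extras installed above). The underlying idea, evaluating a batch of candidate points concurrently rather than one at a time, can be sketched with the standard library; this is a toy illustration, not how hyperopt actually schedules trials:

```python
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    return (x - 3) ** 2

# a batch of candidate points; in hyperopt these would come from a
# suggest algorithm such as TPE, here they are fixed for illustration
candidates = [0.0, 1.5, 2.9, 3.1, 4.0, 6.0]

# evaluate the whole batch concurrently, then keep the best result
with ThreadPoolExecutor(max_workers=4) as pool:
    losses = list(pool.map(objective, candidates))

best_loss, best_point = min(zip(losses, candidates))
```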
Hyperopt documentation can be found here, but is partly still hosted on the wiki. Here are some quick links to the most relevant pages:
See projects using hyperopt on the wiki.
If you use this software for research, please cite the paper (http://proceedings.mlr.press/v28/bergstra13.pdf) as follows:
Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. In Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.
This project has received support from
- National Science Foundation (IIS-0963668),
- Banting Postdoctoral Fellowship program,
- National Science and Engineering Research Council of Canada (NSERC),
- D-Wave Systems, Inc.