Bayesian optimization in PyTorch
BoTorch is a library for Bayesian Optimization built on PyTorch.
BoTorch is currently in beta and under active development!
BoTorch:
- Provides a modular and easily extensible interface for composing Bayesian optimization primitives, including probabilistic models, acquisition functions, and optimizers.
- Harnesses the power of PyTorch, including auto-differentiation, native support for highly parallelized modern hardware (e.g. GPUs) using device-agnostic code, and a dynamic computation graph.
- Supports Monte Carlo-based acquisition functions via the reparameterization trick, which makes it straightforward to implement new ideas without having to impose restrictive assumptions about the underlying model.
- Enables seamless integration with deep and/or convolutional architectures in PyTorch.
- Has first-class support for state-of-the-art probabilistic models in GPyTorch, including support for multi-task Gaussian Processes (GPs), deep kernel learning, deep GPs, and approximate inference.
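For intuition, the reparameterization trick underlying Monte Carlo acquisition functions can be sketched in plain PyTorch. This is a toy illustration, not BoTorch's actual implementation: posterior samples are expressed as a deterministic, differentiable transform of standard normal noise, so gradients of a Monte Carlo estimate of Expected Improvement flow back to the posterior parameters.

```python
import torch

# Toy Gaussian posterior at a candidate point, parameterized by mean and std.
mean = torch.tensor([0.3], requires_grad=True)
std = torch.tensor([0.5], requires_grad=True)
best_f = 0.4

# Reparameterization: sample = mean + std * eps, with eps ~ N(0, 1).
# The randomness lives in eps, so the transform is differentiable.
eps = torch.randn(10_000, 1)
samples = mean + std * eps

# Monte Carlo estimate of Expected Improvement: E[max(f(x) - best_f, 0)].
mc_ei = (samples - best_f).clamp_min(0.0).mean()
mc_ei.backward()

# Gradients w.r.t. the posterior parameters are now available, which is
# what lets gradient-based optimizers work on MC acquisition functions.
print(mc_ei.item(), mean.grad.item(), std.grad.item())
```

Because no restrictive analytic form is assumed, the same pattern extends to batched candidates and non-Gaussian transformations of the samples.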
The primary audience for hands-on use of BoTorch are researchers and sophisticated practitioners in Bayesian Optimization and AI. We recommend using BoTorch as a low-level API for implementing new algorithms for Ax. Ax has been designed to be an easy-to-use platform for end-users, which at the same time is flexible enough for Bayesian Optimization researchers to plug into for handling of feature transformations, (meta-)data management, storage, etc. We recommend that end-users who are not actively doing research on Bayesian Optimization simply use Ax.
Installation Requirements
- Python >= 3.10
- PyTorch >= 2.0.1
- gpytorch == 1.14
- linear_operator == 0.6
- pyro-ppl >= 1.8.4
- scipy
- multiple-dispatch
The latest release of BoTorch is easily installed via `pip`:

```bash
pip install botorch
```
Note: Make sure the `pip` being used is actually the one from the newly created Conda environment. If you're using a Unix-based OS, you can use `which pip` to check.
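As a cross-platform alternative to `which pip`, you can check from Python itself which interpreter and `pip` are active (a small sketch, not part of BoTorch):

```python
import shutil
import sys

# The interpreter and pip should live in the same environment directory,
# e.g. .../envs/my_env/bin/python and .../envs/my_env/bin/pip.
print(sys.executable)
print(shutil.which("pip"))
```

If the two paths point into different environments, `pip install` will not install into the interpreter you are running.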
BoTorch stopped publishing an official Anaconda package to the `pytorch` channel after the 0.12 release. However, users can still use the package published to the `conda-forge` channel and install BoTorch via

```bash
conda install botorch -c gpytorch -c conda-forge
```
If you would like to try our bleeding edge features (and don't mind potentially running into the occasional bug here or there), you can install the latest development version directly from GitHub. If you want to also install the current `gpytorch` and `linear_operator` development versions, you will need to ensure that the `ALLOW_LATEST_GPYTORCH_LINOP` environment variable is set:

```bash
pip install --upgrade git+https://github.com/cornellius-gp/linear_operator.git
pip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git
export ALLOW_LATEST_GPYTORCH_LINOP=true
pip install --upgrade git+https://github.com/pytorch/botorch.git
```
If you want to contribute to BoTorch, you will want to install it editably so that you can change files and have the changes reflected in your local install. If you want to install the current `gpytorch` and `linear_operator` development versions, as described above, do that before proceeding.

```bash
git clone https://github.com/pytorch/botorch.git
cd botorch
pip install -e .
```

To also install the development and tutorial dependencies:

```bash
git clone https://github.com/pytorch/botorch.git
cd botorch
export ALLOW_BOTORCH_LATEST=true
pip install -e ".[dev, tutorials]"
```
- `dev`: Specifies tools necessary for development (testing, linting, docs building; see Contributing below).
- `tutorials`: Also installs all packages necessary for running the tutorial notebooks.
- You can also install either the dev or tutorials dependencies without installing both, e.g. by changing the last command to `pip install -e ".[dev]"`.
Here's a quick rundown of the main components of a Bayesian optimization loop. For more details see our Documentation and the Tutorials.
- Fit a Gaussian Process model to data
```python
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize, Standardize
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

# Double precision is highly recommended for GPs.
# See https://github.com/pytorch/botorch/discussions/1444
train_X = torch.rand(10, 2, dtype=torch.double) * 2
Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)  # explicit output dimension
Y += 0.1 * torch.rand_like(Y)

gp = SingleTaskGP(
    train_X=train_X,
    train_Y=Y,
    input_transform=Normalize(d=2),
    outcome_transform=Standardize(m=1),
)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)
```
- Construct an acquisition function
```python
from botorch.acquisition import LogExpectedImprovement

logEI = LogExpectedImprovement(model=gp, best_f=Y.max())
```
- Optimize the acquisition function
```python
from botorch.optim import optimize_acqf

bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
candidate, acq_value = optimize_acqf(
    logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
```
If you use BoTorch, please cite the following paper:
```bibtex
@inproceedings{balandat2020botorch,
  title = {{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},
  author = {Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year = {2020},
  url = {http://arxiv.org/abs/1910.06403}
}
```
See here for an incomplete selection of peer-reviewed papers that build off of BoTorch.
See the CONTRIBUTING file for how to help out.
BoTorch is MIT licensed, as found in the LICENSE file.