Bayesian Modeling and Probabilistic Programming in Python
PyMC (formerly PyMC3) is a Python package for Bayesian statistical modeling focusing on advanced Markov chain Monte Carlo (MCMC) and variational inference (VI) algorithms. Its flexibility and extensibility make it applicable to a large suite of problems.
Check out the PyMC overview, or one of the many examples! For questions on PyMC, head on over to our PyMC Discourse forum.
- Intuitive model specification syntax, for example, `x ~ N(0,1)` translates to `x = Normal('x', 0, 1)` (see the sketch after this list).
- Powerful sampling algorithms, such as the No-U-Turn Sampler, allow fitting complex models with thousands of parameters with little specialized knowledge of fitting algorithms.
- Variational inference: ADVI for fast approximate posterior estimation as well as mini-batch ADVI for large data sets.
- Relies on PyTensor, which provides:
  - Computation optimization and dynamic C or JAX compilation
  - NumPy broadcasting and advanced indexing
  - Linear algebra operators
  - Simple extensibility
- Transparent support for missing value imputation
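As a minimal sketch of the points above (the data, prior choices, and variable names here are invented purely for illustration), the mathematical statement `x ~ N(0, 1)` becomes an explicit `pm.Normal` variable, NUTS sampling runs through `pm.sample`, and an ADVI approximation is available through `pm.fit`:

```python
import numpy as np
import pymc as pm

# Made-up observations, only so the model has something to condition on
rng = np.random.default_rng(0)
y_obs = rng.normal(loc=1.0, scale=2.0, size=50)

with pm.Model() as model:
    # "x ~ N(0, 1)" written as an explicit Normal random variable
    x = pm.Normal("x", 0, 1)
    sigma = pm.HalfNormal("sigma", 1)

    # Likelihood linking the latent mean to the observed data
    pm.Normal("y", mu=x, sigma=sigma, observed=y_obs)

    # MCMC with the No-U-Turn Sampler (the default for continuous models)
    idata = pm.sample()

    # Fast approximate posterior estimation via ADVI
    approx = pm.fit(method="advi")
```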
Plant growth can be influenced by multiple factors, and understanding these relationships is crucial for optimizing agricultural practices.
Imagine we conduct an experiment to predict the growth of a plant based on different environmental variables.
```python
import pymc as pm

# Taking draws from a normal distribution
seed = 42
x_dist = pm.Normal.dist(shape=(100, 3))
x_data = pm.draw(x_dist, random_seed=seed)

# Independent Variables:
# Sunlight Hours: Number of hours the plant is exposed to sunlight daily.
# Water Amount: Daily water amount given to the plant (in milliliters).
# Soil Nitrogen Content: Percentage of nitrogen content in the soil.

# Dependent Variable:
# Plant Growth (y): Measured as the increase in plant height (in centimeters) over a certain period.

# Define coordinate values for all dimensions of the data
coords = {
    "trial": range(100),
    "features": ["sunlight hours", "water amount", "soil nitrogen"],
}

# Define generative model
with pm.Model(coords=coords) as generative_model:
    x = pm.Data("x", x_data, dims=["trial", "features"])

    # Model parameters
    betas = pm.Normal("betas", dims="features")
    sigma = pm.HalfNormal("sigma")

    # Linear model
    mu = x @ betas

    # Likelihood
    # Assuming we measure deviation of each plant from baseline
    plant_growth = pm.Normal("plant growth", mu, sigma, dims="trial")

# Generating data from model by fixing parameters
fixed_parameters = {
    "betas": [5, 20, 2],
    "sigma": 0.5,
}
with pm.do(generative_model, fixed_parameters) as synthetic_model:
    idata = pm.sample_prior_predictive(random_seed=seed)  # Sample from prior predictive distribution.
    synthetic_y = idata.prior["plant growth"].sel(draw=0, chain=0)

# Infer parameters conditioned on observed data
with pm.observe(generative_model, {"plant growth": synthetic_y}) as inference_model:
    idata = pm.sample(random_seed=seed)

summary = pm.stats.summary(idata, var_names=["betas", "sigma"])
print(summary)
```
From the summary, we can see that the means of the inferred parameters are very close to the fixed parameters:
| Params | mean | sd | hdi_3% | hdi_97% | mcse_mean | mcse_sd | ess_bulk | ess_tail | r_hat |
|---|---|---|---|---|---|---|---|---|---|
| betas[sunlight hours] | 4.972 | 0.054 | 4.866 | 5.066 | 0.001 | 0.001 | 3003 | 1257 | 1 |
| betas[water amount] | 19.963 | 0.051 | 19.872 | 20.062 | 0.001 | 0.001 | 3112 | 1658 | 1 |
| betas[soil nitrogen] | 1.994 | 0.055 | 1.899 | 2.107 | 0.001 | 0.001 | 3221 | 1559 | 1 |
| sigma | 0.511 | 0.037 | 0.438 | 0.575 | 0.001 | 0 | 2945 | 1522 | 1 |
```python
# Simulate new data conditioned on inferred parameters
new_x_data = pm.draw(
    pm.Normal.dist(shape=(3, 3)),
    random_seed=seed,
)
new_coords = coords | {"trial": [0, 1, 2]}

with inference_model:
    pm.set_data({"x": new_x_data}, coords=new_coords)
    pm.sample_posterior_predictive(
        idata,
        predictions=True,
        extend_inferencedata=True,
        random_seed=seed,
    )

pm.stats.summary(idata.predictions, kind="stats")
```
The new data conditioned on inferred parameters would look like:
| Output | mean | sd | hdi_3% | hdi_97% |
|---|---|---|---|---|
| plant growth[0] | 14.229 | 0.515 | 13.325 | 15.272 |
| plant growth[1] | 24.418 | 0.511 | 23.428 | 25.326 |
| plant growth[2] | -6.747 | 0.511 | -7.740 | -5.797 |
```python
# Simulate new data, under a scenario where the first beta is zero
with pm.do(
    inference_model,
    {inference_model["betas"]: inference_model["betas"] * [0, 1, 1]},
) as plant_growth_model:
    new_predictions = pm.sample_posterior_predictive(
        idata,
        predictions=True,
        random_seed=seed,
    )

pm.stats.summary(new_predictions, kind="stats")
```
The new data, under the above scenario, would look like:
| Output | mean | sd | hdi_3% | hdi_97% |
|---|---|---|---|---|
| plant growth[0] | 12.149 | 0.515 | 11.193 | 13.135 |
| plant growth[1] | 29.809 | 0.508 | 28.832 | 30.717 |
| plant growth[2] | -0.131 | 0.507 | -1.121 | 0.791 |
- Bayesian Analysis with Python (third edition) by Osvaldo Martin: Great introductory book.
- Probabilistic Programming and Bayesian Methods for Hackers: Fantastic book with many applied code examples.
- PyMC port of the book "Doing Bayesian Data Analysis" by John Kruschke, as well as the first edition.
- PyMC port of the book "Statistical Rethinking: A Bayesian Course with Examples in R and Stan" by Richard McElreath.
- PyMC port of the book "Bayesian Cognitive Modeling" by Michael Lee and EJ Wagenmakers: Focused on using Bayesian statistics in cognitive modeling.
See also the section on books using PyMC on our website.
- Here is a YouTube playlist gathering several talks on PyMC.
- You can also find all the talks given at PyMCon 2020 here.
- The "Learning Bayesian Statistics" podcast helps you discover and stay up-to-date with the vast Bayesian community. Bonus: it's hosted by Alex Andorra, one of the PyMC core devs!
To install PyMC on your system, follow the instructions on the installation guide.
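In practice this usually amounts to `conda install -c conda-forge pymc` in a conda environment or `pip install pymc` from PyPI, but the installation guide remains the authoritative, up-to-date reference.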
To cite PyMC, please choose from the following:
PyMC: A Modern and Comprehensive Probabilistic Programming Framework in Python, Abril-Pla O, Andreani V, Carroll C, Dong L, Fonnesbeck CJ, Kochurov M, Kumar R, Lao J, Luhmann CC, Martin OA, Osthege M, Vieira R, Wiecki T, Zinkov R. (2023)
BibTeX version:

```bibtex
@article{pymc2023,
  title   = {{PyMC}: A Modern and Comprehensive Probabilistic Programming Framework in {P}ython},
  author  = {Oriol Abril-Pla and Virgile Andreani and Colin Carroll and Larry Dong and Christopher J. Fonnesbeck and Maxim Kochurov and Ravin Kumar and Junpeng Lao and Christian C. Luhmann and Osvaldo A. Martin and Michael Osthege and Ricardo Vieira and Thomas Wiecki and Robert Zinkov},
  journal = {{PeerJ} Computer Science},
  volume  = {9},
  number  = {e1516},
  doi     = {10.7717/peerj-cs.1516},
  year    = {2023}
}
```
DOIs for specific versions are shown on Zenodo and under Releases.
We are using discourse.pymc.io as our main communication channel.
To ask a question about modeling or usage of PyMC, we encourage posting to our Discourse forum under the "Questions" category. You can also suggest features in the "Development" category. Requests for non-technical information about the project are also welcome on Discourse; we also use it internally for general announcements and governance-related processes.
You can also follow us on these social media platforms for updates and other announcements:
To report an issue with PyMC, please use the issue tracker.
- Bambi: BAyesian Model-Building Interface (BAMBI) in Python.
- calibr8: A toolbox for constructing detailed observation models to be used as likelihoods in PyMC.
- gumbi: A high-level interface for building GP models.
- SunODE: Fast ODE solver, much faster than the one that comes with PyMC.
- pymc-learn: Custom PyMC models built on top of pymc3_models/scikit-learn API
- Exoplanet: a toolkit for modeling of transit and/or radial velocity observations of exoplanets and other astronomical time series.
- beat: Bayesian Earthquake Analysis Tool.
- CausalPy: A package focusing on causal inference in quasi-experimental settings.
- PyMC-Marketing: Bayesian marketing toolbox for marketing mix modeling, customer lifetime value, and more.
See also the ecosystem page on our website. Please contact us if your software is not listed here.
See Google Scholar here and here for a continuously updated list.
The GitHub contributor page shows the people who have added content to this repo, which includes a large portion of contributors to the PyMC project but not all of them. Other contributors have added content to other repos of the pymc-devs GitHub organization or have contributed through other project spaces outside of GitHub, like our Discourse forum.
If you are interested in contributing yourself, read our Code of Conduct and Contributing guide.
PyMC is a non-profit project under the NumFOCUS umbrella. If you want to support PyMC financially, you can donate here.
You can get professional consulting support from PyMC Labs.