
Turing.jl logo

Bayesian inference with probabilistic programming

Tutorials · API docs · Tests · Code Coverage · ColPrac: Contributor's Guide on Collaborative Practices for Community Packages

Get started

Install Julia (see the official Julia website; you will need at least Julia 1.10.8 for the latest version of Turing.jl). Then, launch a Julia REPL and run:

julia> using Pkg; Pkg.add("Turing")

You can define models using the @model macro, and then perform Markov chain Monte Carlo sampling using the sample function:

julia> using Turing

julia> @model function linear_regression(x)
           # Priors
           α ~ Normal(0, 1)
           β ~ Normal(0, 1)
           σ² ~ truncated(Cauchy(0, 3); lower=0)
           # Likelihood
           μ = α .+ β .* x
           y ~ MvNormal(μ, σ² * I)
       end

julia> x, y = rand(10), rand(10)

julia> posterior = linear_regression(x) | (; y = y)

julia> chain = sample(posterior, NUTS(), 1000)
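Once sampling finishes, the returned chain can be inspected directly in the REPL. The lines below are a minimal sketch assuming the example above has already been run; the summary and indexing methods come from MCMCChains.jl (a dependency of Turing.jl), and plotting additionally requires StatsPlots.jl, which is not installed by Turing.jl itself:

julia> mean(chain)       # posterior means of α, β, and σ²

julia> chain[:α]         # all posterior draws for a single parameter

julia> plot(chain)       # trace and density plots (needs `using StatsPlots`)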

You can find the main TuringLang documentation at https://turinglang.org, which contains general information about Turing.jl's features, as well as a variety of tutorials with examples of Turing.jl models.

API documentation for Turing.jl specifically is available at https://turinglang.org/Turing.jl/stable.

Contributing

Issues

If you find any bugs or unintuitive behaviour when using Turing.jl, please do open an issue! Please don't worry about finding the correct repository for the issue; we can migrate the issue to the appropriate repository if we need to.

Pull requests

We are of course also very happy to receive pull requests. If you are unsure about whether a particular feature would be welcome, you can open an issue for discussion first.

When opening a PR, non-breaking releases (patch versions) should target the main branch. Breaking releases (minor versions) should target the breaking branch.

If you have not received any feedback on an issue or PR for a while, please feel free to ping @TuringLang/maintainers in a comment.

Other channels

The Turing.jl userbase tends to be most active on the #turing channel of Julia Slack. If you do not have an invitation to Julia's Slack, you can get one from the official Julia website.

There are also often threads on Julia Discourse (you can search using, e.g., the turing tag).

What's changed recently?

We publish a fortnightly newsletter summarising recent updates in the TuringLang ecosystem, which you can view on our website, GitHub, or Julia Slack.

For Turing.jl specifically, you can see a full changelog in HISTORY.md or our GitHub releases.

Where does Turing.jl sit in the TuringLang ecosystem?

Turing.jl is the main entry point for users, and seeks to provide a unified, convenient interface to all of the functionality in the TuringLang (and broader Julia) ecosystem.

In particular, it takes the ability to specify probabilistic models with DynamicPPL.jl and combines it with a number of inference algorithms, including Hamiltonian Monte Carlo (and its NUTS variant), Metropolis–Hastings, and particle-based methods such as sequential Monte Carlo and particle Gibbs.
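Because model specification is decoupled from inference, the same conditioned model can be handed to different samplers through the sample function. The lines below are a minimal sketch reusing the posterior object from the example above; MH, HMC, PG, and MCMCThreads are all exported by Turing.jl:

julia> chain_mh  = sample(posterior, MH(), 2000)           # random-walk Metropolis-Hastings

julia> chain_hmc = sample(posterior, HMC(0.05, 10), 1000)  # HMC with step size 0.05 and 10 leapfrog steps

julia> chain_pg  = sample(posterior, PG(20), 1000)         # particle Gibbs with 20 particles

julia> chains    = sample(posterior, NUTS(), MCMCThreads(), 1000, 4)  # 4 chains in parallel (start Julia with multiple threads)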

Citing Turing.jl

If you have used Turing.jl in your work, we would be very grateful if you could cite the following:

Turing.jl: a general-purpose probabilistic programming language
Tor Erlend Fjelde, Kai Xu, David Widmann, Mohamed Tarek, Cameron Pfiffer, Martin Trapp, Seth D. Axen, Xianda Sun, Markus Hauru, Penelope Yong, Will Tebbutt, Zoubin Ghahramani, Hong Ge
ACM Transactions on Probabilistic Machine Learning, 2025 (Just Accepted)

Turing: A Language for Flexible Probabilistic Inference
Hong Ge, Kai Xu, Zoubin Ghahramani
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1682-1690, 2018.

BibTeX:

@article{10.1145/3711897,
  author    = {Fjelde, Tor Erlend and Xu, Kai and Widmann, David and Tarek, Mohamed and Pfiffer, Cameron and Trapp, Martin and Axen, Seth D. and Sun, Xianda and Hauru, Markus and Yong, Penelope and Tebbutt, Will and Ghahramani, Zoubin and Ge, Hong},
  title     = {Turing.jl: a general-purpose probabilistic programming language},
  year      = {2025},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3711897},
  doi       = {10.1145/3711897},
  note      = {Just Accepted},
  journal   = {ACM Trans. Probab. Mach. Learn.},
  month     = feb,
}

@InProceedings{pmlr-v84-ge18b,
  title     = {Turing: A Language for Flexible Probabilistic Inference},
  author    = {Ge, Hong and Xu, Kai and Ghahramani, Zoubin},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1682--1690},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/ge18b/ge18b.pdf},
  url       = {https://proceedings.mlr.press/v84/ge18b.html},
}
