# JBris/model-calibration-evaluation
Evaluating model calibration methods for sensitivity analysis, uncertainty analysis, optimisation, and Bayesian inference
See `config.yaml` for the ground-truth simulation parameters.
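The exact schema of `config.yaml` is not reproduced on this page; a ground-truth parameter block for this kind of calibration benchmark might look something like the following (all keys and values here are hypothetical, for illustration only):

```yaml
# Hypothetical example only -- see the repository's config.yaml for the real schema
observed_data:
  n_samples: 200      # number of simulated observations
  noise_scale: 0.5    # observation noise standard deviation
parameters:
  mu: 1.5             # ground-truth location parameter to recover
  sigma: 0.5          # ground-truth scale parameter to recover
```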
The following model calibration methods have been evaluated.
- Approximate Bayesian Computation - Sequential Monte Carlo
- Bayesian Optimisation
- Bayesian Optimisation for Likelihood-Free Inference
- Differential Evolution Adaptive Metropolis
- Experimental Design via Gaussian Process Emulation
- Flow Matching Posterior Estimation
- Genetic Algorithm
- Tree-structured Parzen Estimator
- Polynomial Chaos Expansion
- Polynomial Chaos Kriging
- Sparse Axis-Aligned Subspace Bayesian Optimization
- Shuffled Complex Evolution Algorithm Uncertainty Analysis
- Sequential Neural Posterior Estimation
- Sobol Sensitivity Analysis
- Truncated Marginal Neural Ratio Estimation
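To give a flavour of the first family above, here is a minimal sketch of Approximate Bayesian Computation in its simplest form, rejection ABC. Note this is an illustration of the underlying idea, not the Sequential Monte Carlo variant evaluated in the repository, and the model, prior bounds, and tolerance below are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ground truth: observations from Normal(mu=1.5, sigma=0.5)
true_mu = 1.5
observed = rng.normal(true_mu, 0.5, size=200)

def simulate(mu, rng, n=200):
    """Forward model: simulate a dataset for a candidate parameter."""
    return rng.normal(mu, 0.5, size=n)

def distance(a, b):
    """Summary-statistic distance: compare sample means."""
    return abs(a.mean() - b.mean())

# Rejection ABC: draw candidates from the prior and keep those whose
# simulated data fall within epsilon of the observed data.
prior_draws = rng.uniform(0.0, 3.0, size=5000)
epsilon = 0.05
accepted = [mu for mu in prior_draws
            if distance(simulate(mu, rng), observed) < epsilon]

posterior_mean = float(np.mean(accepted))
```

The accepted draws approximate the posterior over `mu`; ABC-SMC improves on this by shrinking `epsilon` over a sequence of weighted particle populations rather than rejecting against a single fixed tolerance.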
Topics: deep-learning, pymc, global-optimization, sensitivity-analysis, bayesian-optimization, bayesian-statistics, kriging, uncertainty-analysis, approximate-bayesian-computation, polynomial-chaos-expansion, surrogate-models, generative-neural-network, optuna, likelihood-free-inference, simulation-based-inference, sobol-indices, shuffled-complex-evolution, differential-evolution-mcmc, polynomial-chaos, bolfi