bayesSSM: Bayesian Methods for State Space Models

[Badges: Codecov test coverage | R-CMD-check | CRAN status | CRAN monthly downloads | CRAN total downloads]

bayesSSM is an R package offering a set of tools for performing Bayesian inference in state-space models (SSMs). Its main function, pmmh, implements the Particle Marginal Metropolis-Hastings (PMMH) algorithm for Bayesian inference in SSMs.

Why bayesSSM?

While there are several alternative packages available for performing Particle MCMC, bayesSSM is designed to be simple and easy to use. It was developed alongside my Master's thesis on Particle MCMC, since I was implementing everything from scratch anyway.

Why PMCMC?

In a state-space model, the joint posterior density of the parameters $\theta$ and the latent states $x_{1:T}$ given observations $y_{1:T}$ is

$$p(\theta, x_{1:T} \mid y_{1:T}) \propto p(\theta)\, p(x_1 \mid \theta) \prod_{t=2}^{T} p(x_t \mid x_{t-1}, \theta) \prod_{t=1}^{T} p(y_t \mid x_t, \theta).$$

Since the dimension of the latent states grows with the number of observations, standard MCMC methods such as HMC can struggle to explore this high-dimensional, correlated space efficiently. They can also suffer from degeneracy when attempting to jointly sample entire latent trajectories.

Particle Markov Chain Monte Carlo (PMCMC) methods, such as the Particle Marginal Metropolis-Hastings (PMMH) implemented in this package, are designed to handle these situations. By using particle filters to unbiasedly estimate the marginal likelihood, they enable efficient sampling from the correct joint posterior while avoiding the need to explicitly explore the full latent state space.
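As a reminder, the intractable quantity that the particle filter estimates unbiasedly is the marginal likelihood, obtained by integrating the latent states out of the joint density:

$$p(y_{1:T} \mid \theta) = \int p(x_{1:T} \mid \theta)\, p(y_{1:T} \mid x_{1:T}, \theta)\, \mathrm{d}x_{1:T}.$$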

State-space Models

A state-space model (SSM) has the structure given below, where we omit potential time dependency in the transition and observation densities for simplicity:

$$x_1 \sim p(x_1 \mid \theta), \qquad x_t \mid x_{t-1} \sim p(x_t \mid x_{t-1}, \theta), \qquad y_t \mid x_t \sim p(y_t \mid x_t, \theta),$$

with the observations $y_t$ conditionally independent given the latent states.

The core function, pmmh, implements Particle Marginal Metropolis-Hastings, an algorithm that first generates a set of $N$ particles to approximate the intractable marginal likelihood and then uses this approximation in the acceptance probability. The implementation automatically tunes the number of particles and the proposal distribution for the parameters.
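In standard PMMH terms, a proposal $\theta' \sim q(\cdot \mid \theta)$ is accepted with probability

$$\alpha = \min\left(1, \frac{\hat{p}(y_{1:T} \mid \theta')\, p(\theta')\, q(\theta \mid \theta')}{\hat{p}(y_{1:T} \mid \theta)\, p(\theta)\, q(\theta' \mid \theta)}\right),$$

where $\hat{p}(y_{1:T} \mid \theta)$ is the particle-filter estimate of the marginal likelihood and $p(\theta)$ is the prior.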

Installation

You can install the latest stable version of bayesSSM from CRAN with:

install.packages("bayesSSM")

or the development version from GitHub with:

# install.packages("pak")
pak::pak("BjarkeHautop/bayesSSM")

Example

To illustrate how to use the pmmh function, we will simulate a simple state-space model (SSM) and perform Bayesian inference on it. Note: While this example uses pmmh, the model is simple enough that standard MCMC methods could also be applied. For a more complicated example where standard MCMC methods would struggle more, see the Stochastic SIR Model article here.

We will simulate a state-space model with the following structure:

$$x_0 \sim N(0, 1), \qquad x_t = \phi x_{t-1} + \sin(x_{t-1}) + \epsilon_t, \quad \epsilon_t \sim N(0, \sigma_x^2), \qquad y_t = x_t + \eta_t, \quad \eta_t \sim N(0, \sigma_y^2),$$

for $t = 1, \dots, T$.

Let's first simulate 20 data points from this model with $\phi = 0.8$, $\sigma_x = 1$, and $\sigma_y = 0.5$.

set.seed(1405)

t_val <- 20
phi <- 0.8
sigma_x <- 1
sigma_y <- 0.5

# Simulate the latent states x and observations y
init_state <- rnorm(1, mean = 0, sd = 1)
x <- numeric(t_val)
y <- numeric(t_val)
x[1] <- phi * init_state + sin(init_state) +
  rnorm(1, mean = 0, sd = sigma_x)
y[1] <- x[1] + rnorm(1, mean = 0, sd = sigma_y)
for (t in 2:t_val) {
  x[t] <- phi * x[t - 1] + sin(x[t - 1]) + rnorm(1, mean = 0, sd = sigma_x)
  y[t] <- x[t] + rnorm(1, mean = 0, sd = sigma_y)
}

# Prepend the initial state so x has length t_val + 1
x <- c(init_state, x)
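As a quick, optional sanity check (not part of the original example), you can plot the simulated observations with base R before fitting anything:

# Visual check of the simulated series
plot(y, type = "l", xlab = "t", ylab = "y",
     main = "Simulated observations")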

We define the priors for our model as follows:

$$\phi \sim \text{Uniform}(0, 1), \qquad \sigma_x \sim \text{Exp}(1), \qquad \sigma_y \sim \text{Exp}(1).$$

We can use pmmh to perform Bayesian inference on this model. To use pmmh we need to define the functions for the SSM and the priors.

The functions init_fn and transition_fn should be functions that simulate the latent states. init_fn must contain the argument num_particles for initializing the particles, and transition_fn must contain the argument particles, which is a vector of particles, and can contain any other arguments for model-specific parameters.

The function log_likelihood_fn should be a function that calculates the log-likelihood of the observed data given the latent state variables. It must contain the arguments y for the data and particles. Time dependency can be implemented by adding a t argument to transition_fn and log_likelihood_fn, as sketched below.
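As a purely illustrative sketch (these hypothetical functions are not used in this example), adding time dependency only changes the signatures: the current time index t becomes an extra argument that the function bodies are free to use.

# Hypothetical time-dependent variants (illustration only)
transition_fn_t <- function(particles, t, phi, sigma_x) {
  # Example: a deterministic drift that changes with t
  phi * particles + sin(particles) + 0.1 * t +
    rnorm(length(particles), mean = 0, sd = sigma_x)
}

log_likelihood_fn_t <- function(y, particles, t, sigma_y) {
  # Example: observation noise that grows with t
  dnorm(y, mean = particles, sd = sigma_y * sqrt(t), log = TRUE)
}

Returning to our time-homogeneous example, the model functions are: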

init_fn <- function(num_particles) {
  rnorm(num_particles, mean = 0, sd = 1)
}

transition_fn <- function(particles, phi, sigma_x) {
  phi * particles + sin(particles) +
    rnorm(length(particles), mean = 0, sd = sigma_x)
}

log_likelihood_fn <- function(y, particles, sigma_y) {
  dnorm(y, mean = particles, sd = sigma_y, log = TRUE)
}

The priors for the parameters must be defined as log-prior functions. Every parameter from init_fn, transition_fn, and log_likelihood_fn must have a corresponding log-prior function.

log_prior_phi <- function(phi) {
  dunif(phi, min = 0, max = 1, log = TRUE)
}

log_prior_sigma_x <- function(sigma) {
  dexp(sigma, rate = 1, log = TRUE)
}

log_prior_sigma_y <- function(sigma) {
  dexp(sigma, rate = 1, log = TRUE)
}

log_priors <- list(
  phi = log_prior_phi,
  sigma_x = log_prior_sigma_x,
  sigma_y = log_prior_sigma_y
)

Now we can run the PMMH algorithm using the pmmh function. For this README we use a lower number of samples and a smaller burn-in period, and also modify the pilot chains to only use 200 samples. This is done to make the example run faster.

library(bayesSSM)

result <- pmmh(
  pf_wrapper = bootstrap_filter, # use bootstrap particle filter
  y = y,
  m = 500, # number of MCMC samples
  init_fn = init_fn,
  transition_fn = transition_fn,
  log_likelihood_fn = log_likelihood_fn,
  log_priors = log_priors,
  pilot_init_params = list(
    c(phi = 0.4, sigma_x = 0.4, sigma_y = 0.4),
    c(phi = 0.8, sigma_x = 0.8, sigma_y = 0.8)
  ),
  burn_in = 50,
  num_chains = 2,
  seed = 1405,
  tune_control = default_tune_control(pilot_m = 200, pilot_burn_in = 10)
)
#> Running chain 1...
#> Running pilot chain for tuning...
#> Using 50 particles for PMMH:
#> Running Particle MCMC chain with tuned settings...
#> Running chain 2...
#> Running pilot chain for tuning...
#> Using 50 particles for PMMH:
#> Running Particle MCMC chain with tuned settings...
#> PMMH Results Summary:
#>  Parameter Mean   SD Median 2.5% 97.5% ESS  Rhat
#>        phi 0.76 0.12   0.75 0.55  0.97   8 1.478
#>    sigma_x 0.78 0.56   0.74 0.01  1.85  15 1.093
#>    sigma_y 0.89 0.36   0.94 0.22  1.45  36 1.051
#> Warning in pmmh(pf_wrapper = bootstrap_filter, y = y, m = 500, init_fn =
#> init_fn, : Some ESS values are below 400, indicating poor mixing. Consider
#> running the chains for more iterations.
#> Warning in pmmh(pf_wrapper = bootstrap_filter, y = y, m = 500, init_fn = init_fn, :
#> Some Rhat values are above 1.01, indicating that the chains have not converged.
#> Consider running the chains for more iterations and/or increase burn_in.

We get convergence warnings as expected due to the small number of iterations.
