Sequential Monte Carlo Methods & Particle Filters Resources

by Arnaud Doucet


This vintage webpage presents a list of references, code and video lectures available for SMC/particle filters.
It is by no means exhaustive and obviously biased towards my work and the work of my close colleagues.
A complementary site for SMC and Particle filters resources by Pierre Del Moral can be found here.

I keep on adding stuff from time to time, although not as often as I should.



G. Kitagawa & W. Gersch, Smoothness Priors Analysis of Time Series, Lecture Notes in Statistics, Springer, 1996.
- The first book I am aware of discussing particle filters, with applications to spectral analysis, changepoints, etc.; it is also a very nice introduction to state-space modelling.

* P. Del Moral & L. Miclo, Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae with Applications to Non-linear Filtering, Seminaire de Probabilites, Lecture Notes in Mathematics, Springer-Verlag Berlin, vol. 1729, pp. 1-145, 2000. Ps
- The first review of the theoretical results behind SMC. The books by Del Moral (2004, 2013; see below) are much more complete, but these lecture notes are free (although the books are also free on some unauthorized websites).

* A.D., N. De Freitas & N.J. Gordon (editors), SMC Methods in Practice, Springer-Verlag, 2001.
- Large collection of chapters on the subject, a bit outdated now but good to start with.

* J.S. Liu, Monte Carlo Strategies for Scientific Computing, Springer-Verlag, 2001.
- Includes a couple of chapters on SMC with applications to non-parametric Bayes, contingency tables and self-avoiding random walks.

* P. Del Moral, Feynman-Kac Formulae: Genealogical and Interacting Particle Approximations, Springer-Verlag, 2004.
- Everything you want to know about the theory behind SMC; also includes nice non-standard applications. The notation can appear a little overwhelming at the beginning, but anyone who has worked on the subject eventually learns to appreciate how powerful it is.

* P. Del Moral, Mean Field Simulation for Monte Carlo Integration, Chapman & Hall - CRC, 2013.
- A significantly updated and expanded version of his previous 2004 monograph. For beginners, better to start with the 2004 monograph.

* O. Cappe, E. Moulines & T. Ryden, Inference in Hidden Markov Models, Springer-Verlag, 2005.
- A comprehensive treatment of hidden Markov models which includes a few chapters on SMC methods.

* S. Särkkä, Bayesian Filtering and Smoothing, CUP, 2013. See dedicated site and online
- An introduction to advanced nonlinear filtering methods including SMC supported by Matlab examples.

* N. Chopin & O. Papaspiliopoulos, An Introduction to Sequential Monte Carlo, Springer-Verlag, 2020.
- The best textbook introduction to the field; covers most of the latest developments in this area.

Basic Introduction to SMC for state-space models

* A.D., N. De Freitas and N.J. Gordon, An introduction to Sequential Monte Carlo Methods, in SMC in Practice, 2001 Pdf
- Simple introduction to basic SMC methods for state-space models.

Early papers

* L. Stewart & P. McCarty, The use of Bayesian belief networks to fuse continuous and discrete information for target recognition, tracking, and situation assessment, in Proc. SPIE Signal Processing, Sensor Fusion and Target Recognition, vol. 1699, pp. 177-185, 1992. Pdf (see pages 171-176 in acrobat).
- The first published paper I am aware of introducing what is now known as the bootstrap filter; this paper has only been cited 23 times since 1992! (at least I cited it 3 times). Outside the filtering context, e.g. in physics, similar ideas had already been proposed: see, e.g., Hetherington, Phys. Rev. A, 1984.

* N.J. Gordon, D. Salmond and A.F.M. Smith, Novel approach to nonlinear/non-Gaussian Bayesian state estimation, IEE Proc. F, 1993 (submitted April 1992) Pdf
- The seminal paper introducing SMC for filtering.
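The algorithm from this paper can be sketched in a few lines. Below is a minimal bootstrap filter for a toy linear-Gaussian model; the model, constants and variable names are my own choices for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(y, n_particles=500):
    """Bootstrap filter for the toy model
    x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + w_t,  v_t, w_t ~ N(0, 1)."""
    x = rng.normal(size=n_particles)                 # draw from the initial distribution
    means = []
    for obs in y:
        x = 0.9 * x + rng.normal(size=n_particles)   # propagate through the prior dynamics
        logw = -0.5 * (obs - x) ** 2                 # weight by the likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                  # filtering mean estimate
        x = x[rng.choice(n_particles, size=n_particles, p=w)]  # multinomial resampling
    return np.array(means)

# Simulate T observations from the same model, then filter them.
T = 50
x_true = np.empty(T)
y = np.empty(T)
xc = rng.normal()
for t in range(T):
    xc = 0.9 * xc + rng.normal()
    x_true[t] = xc
    y[t] = xc + rng.normal()

est = bootstrap_filter(y)
```

The three steps (propagate, weight, resample) are the whole algorithm; everything that follows on this page refines one of them.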

* G. Kitagawa, Monte Carlo filter and smoother for non-Gaussian nonlinear state-space models, JCGS, 1996
- Journal version of A Monte Carlo Filtering and Smoothing Method for Non-Gaussian Nonlinear State Space Models published in 1993 in the Proceedings of the 2nd U.S.-Japan Joint Seminar on Statistical Time Series Analysis, pp. 110-131. This 1993 paper introduced particle filters at the same time as Gordon, Salmond & Smith but it has been unfairly forgotten. First introduction of stratified resampling.
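Stratified resampling itself is only a few lines: draw one uniform per stratum [i/N, (i+1)/N) and invert the empirical CDF of the normalized weights. A sketch (assuming numpy; the round-off guard is mine):

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: one uniform per stratum [i/N, (i+1)/N),
    then invert the empirical CDF of the normalized weights."""
    n = len(weights)
    u = (np.arange(n) + rng.uniform(size=n)) / n      # one ordered uniform per stratum
    idx = np.searchsorted(np.cumsum(weights), u)
    return np.minimum(idx, n - 1)                     # guard against round-off in the cumsum

rng = np.random.default_rng(1)
w = np.array([0.1, 0.2, 0.3, 0.4])
idx = stratified_resample(w, rng)
counts = np.bincount(idx, minlength=4)
```

Compared to multinomial resampling, the number of offspring of each particle can deviate from its expected value N w_i by at most a small amount, which reduces the resampling variance.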

* M. Hurzeler and H. Kunsch, Monte Carlo approximations for general state-space models, JCGS, 1998.
- Proposes to address the filtering and smoothing problems using Monte Carlo: not quite SMC as implemented nowadays, since rejection sampling is used to sample from a mixture of distributions.

* C. Berzuini, N. Best, W.R. Gilks and C. Larizza, Dynamic conditional independence models and MCMC methods, JASA, 1997 Pdf
- Uses MCMC at each time step to sample from the mixture of distributions that appears, rather than sampling from it exactly; this is also known nowadays as sequential MCMC.

* J.S. Liu and R. Chen, Sequential Monte Carlo methods for dynamic systems, JASA, 1998 Pdf
- This paper shows that SMC goes far beyond state-space models and is applicable to any sequence of distributions of increasing dimension.

* M.K. Pitt and N. Shephard, Filtering via Simulation: Auxiliary Particle Filter, JASA, 1999 Pdf
- This paper introduces the popular auxiliary particle filter and perfect adaptation.

* J. Carpenter, P. Clifford and P. Fearnhead, An Improved Particle Filter for Non-linear Problems, IEE Proc. F, 1999 Pdf  
- This paper presents an improved version of auxiliary particle filters and stratified resampling.

* A.D., S.J. Godsill and C. Andrieu, On Sequential Monte Carlo sampling methods for Bayesian filtering, Stat. Comp., 2000 Pdf  
- This paper presents the "optimal" importance distribution, ways to approximate it, smoothing and Rao-Blackwellization.
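To see why the "optimal" importance distribution matters, consider a toy model x_t = 0.9 x_{t-1} + v_t, y_t = x_t + w_t with unit noise variances, for which p(x_t | x_{t-1}, y_t) is Gaussian in closed form. The sketch below (toy model and names are mine, not from the paper) compares the spread of the log incremental weights under the prior proposal and under the optimal one for a single step:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: x_t = 0.9 x_{t-1} + v_t, y_t = x_t + w_t, v_t, w_t ~ N(0, 1).
# For this model the "optimal" proposal p(x_t | x_{t-1}, y_t) is Gaussian with
# mean (0.9 x_{t-1} + y_t) / 2 and variance 1/2, and the incremental weight
# p(y_t | x_{t-1}) = N(y_t; 0.9 x_{t-1}, 2) does not depend on the new sample.
n = 10_000
x_prev = rng.normal(size=n)
y_t = 3.0                                    # a fairly "surprising" observation

# Prior (bootstrap) proposal: sample from the dynamics, weight by the likelihood.
x_prior = 0.9 * x_prev + rng.normal(size=n)
logw_prior = -0.5 * (y_t - x_prior) ** 2

# Optimal proposal: sample conditionally on y_t, weight by p(y_t | x_{t-1}).
x_opt = (0.9 * x_prev + y_t) / 2 + np.sqrt(0.5) * rng.normal(size=n)
logw_opt = -0.25 * (y_t - 0.9 * x_prev) ** 2

var_prior = logw_prior.var()
var_opt = logw_opt.var()
```

The optimal-proposal weights only depend on the previous state, so their variance is much smaller, especially for informative or outlying observations.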

Tutorial papers

H. Kunsch, State space and hidden Markov models, Chapter 3 of Complex Stochastic Systems, O. E. Barndorff-Nielsen, D. R. Cox and C. Klüppelberg, eds., CRC Press, 109--173, 2001.
- Presents a nice survey of HMM, state-space and Monte Carlo approximations as of 2001. Also discusses the connections with the reference measure approach favoured by Elliott et al.

*  S. Arulampalam, S. Maskell, N.J. Gordon & T. Clapp, A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking, IEEE Trans. Sig. Proc., 2002 Pdf
- Popular tutorial but a bit outdated now, e.g. does not include resample-move, smoothing or Rao-Blackwellisation. Could be a good start to understand SMC in a state-space model context though.

* O. Cappe, S.J. Godsill & E. Moulines, An Overview of Existing Methods and Recent Advances in SMC, Proc. IEEE, 2007
- Discusses resample move, some smoothing techniques and Rao-Blackwellisation.

* D. Creal, A Survey of Sequential Monte Carlo Methods for Economics and Finance, Econometric Reviews Pdf
- Tutorial introducing SMC and its applications in an economics and finance context.

* P. Del Moral, F. Patras, & S. Rubenthaler, A Mean Field Theory of Nonlinear Filtering, in the Oxford Handbook of Nonlinear Filtering, Oxford University Press, 2011 Pdf
- A more theoretically oriented tutorial that presents some of the main results in the area.

* P. Del Moral & A.D., Particle Methods: An Introduction with Applications, ESAIM Proceedings, 2014 Pdf 
- Discusses particle methods in a general (not necessarily filtering) context and presents proofs of simple theoretical results.

*  A.D. & A.M. Johansen - A Tutorial on Particle Filtering and Smoothing: 15 Years Later, in the Oxford Handbook of Nonlinear Filtering, Oxford University Press, 2011 Pdf  
- Shows that most SMC algorithms including auxiliary particle filters, resample-move, block sampling etc. can be reinterpreted within a simple unified framework.

*  N. Kantas, A.D., S.S. Singh, J.M. Maciejowski and N. Chopin, On Particle Methods for Parameter Estimation in State-Space Models. Statistical Science, 2015. Pdf Matlab code
- Details the pros and cons of existing particle methods for static parameter estimation. Updated version of An overview of sequential Monte Carlo methods for parameter estimation in general state-space models, in Proceedings IFAC System Identification (SySid) Meeting, 2009.

* A.D. & A. Lee, Sequential Monte Carlo Methods, Handbook of Graphical Models, 2018.
- Most recent tutorial covering recent methodological progress in the SMC area, including alpha-resampling, twisted algorithms, particle MCMC, etc. Pdf.

Fighting degeneracy: Using MCMC steps & look-ahead strategies

A well-known problem with SMC approximations is that they suffer from the degeneracy problem: as time increases, the approximations of the "earlier" marginal distributions collapse. You can try to mitigate (but not eliminate) this problem either by using some MCMC moves, as suggested by Gilks & Berzuini, or by using lookahead strategies, of which my favourite is block sampling.
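The path degeneracy is easy to exhibit numerically: track each particle's time-0 ancestor in a bootstrap filter and watch the genealogy coalesce. The toy random-walk model below is my own, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Run a bootstrap filter on a random-walk model while tracking each
# particle's ancestor at time 0. Resampling makes the genealogy coalesce,
# so the number of distinct time-0 ancestors shrinks as time increases.
n, T = 200, 100
x = rng.normal(size=n)
ancestor = np.arange(n)                    # which time-0 particle each path descends from
y = np.cumsum(rng.normal(size=T))          # arbitrary observation sequence

for obs in y:
    x = x + rng.normal(size=n)             # random-walk dynamics
    logw = -0.5 * (obs - x) ** 2           # Gaussian likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)       # multinomial resampling
    x, ancestor = x[idx], ancestor[idx]

n_distinct = len(np.unique(ancestor))      # typically far smaller than n
```

After enough resampling steps, all surviving paths typically descend from a handful of time-0 particles, which is exactly the collapse of the early marginals described above.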

*  W. Gilks & C. Berzuini, Following a moving target: Monte Carlo inference for dynamic Bayesian models, JRSS B, 2001
- Proposes to move the particles using MCMC moves with the appropriate invariant distribution so as to introduce diversity among particles.

* A.D., M. Briers & S. Senecal, Efficient Block Sampling Strategies for SMC, JCGS, 2006 Pdf 
- Shows how it is possible to potentially drastically reduce the number of resampling steps, hence the degeneracy, by sampling blocks of state variables. Can be thought of as a block version of auxiliary particle filters.

* M. Lin, R. Chen & J.S. Liu, Lookahead Strategies for SMC, Statistical Science 2012 Pdf
- Reviews various lookahead strategies to mitigate degeneracy.

Reducing the Variance using Rao-Blackwellization

Whenever you can compute an integral analytically, do it and avoid Monte Carlo. An obvious principle that one can put into practice for a wide range of state-space models of interest.
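The principle in its simplest form, outside any filtering context (a toy illustration of my own, not from the papers below): to estimate E[Z^2] with X ~ N(0,1) and Z | X ~ N(X,1), averaging the exact conditional expectation E[Z^2 | X] = X^2 + 1 beats averaging Z^2 directly.

```python
import numpy as np

rng = np.random.default_rng(4)

# Rao-Blackwellization in miniature: integrate Z out analytically and
# average the conditional expectation instead of the raw samples.
n = 10_000
x = rng.normal(size=n)
z = x + rng.normal(size=n)

naive = z ** 2          # plain Monte Carlo estimator of E[Z^2] = 2
rb = x ** 2 + 1         # conditional expectation E[Z^2 | X] computed exactly

# Both are unbiased, but the Rao-Blackwellized version has smaller variance.
var_naive = naive.var()
var_rb = rb.var()
```

The particle filters below apply exactly this idea, with a Kalman filter or a discrete forward recursion playing the role of the exact conditional expectation.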

* C. Andrieu & A.D., Particle Filtering for Partially Observed Gaussian State Space Models, JRSS B, 2002. Pdf
- Uses the Kalman filter to compute the prior of a latent process, particularly useful for censored data/dynamic probit and tobit models.

* R. Chen & J. Liu, Mixture Kalman filters, JRSS B, 2000. Pdf
- Shows that for conditionally linear Gaussian models, which include switching state-space models, it is possible to devise a particle filter which is a mixture of Kalman filters.

A.D., S.J. Godsill & C. Andrieu, On Sequential Monte Carlo sampling methods for Bayesian filtering, (section IV) Stat. Comp., 2000 Pdf
- Section IV proposed independently the same material as Chen & Liu (2000).

* K.P. Murphy, A.D., N. De Freitas & S. Russell, Rao-Blackwellised Particle Filtering for Dynamic Bayesian Networks, in Proc. Uncertainty in Artificial Intelligence, 2000. Pdf  
- Uses Rao-Blackwellization techniques to do efficient inference in high dimensional dynamic Bayes nets.

* P. Fearnhead & P. Clifford, On-line inference for hidden Markov models via particle filters, JRSS B, 2003.
- Shows that for discrete-valued latent processes, standard particle filters are inefficient and proposes an alternative approach that bypasses the sampling step. Demonstrates its use for switching state-space models.

* T. Schon, F. Gustafsson & P. Nordlund. Marginalized particle filters for mixed linear/nonlinear state-space models. IEEE Trans. Sig. Proc., 2005. Pdf
- Proposes some generalizations of the Rao-Blackwellized particle filters.

Reducing the Variance using Quasi Monte Carlo

M. Gerber and N. Chopin, Sequential Quasi Monte Carlo, JRSS B (with discussion), 2015. Pdf
- Shows that you can use QMC within SMC to reduce substantially the variance of your estimates for low-dimensional models. Applying QMC to SMC is not trivial at all and the paper achieves this by the use of a Hilbert space-filling curve. Very neat but a bit difficult to implement.

Parallel implementation

A. Lee et al., On the utility of graphics cards to perform massively parallel implementation of advanced Monte Carlo, JCGS, 2010. Website with CUDA code  Pdf
- Discusses the use of GPU for the implementation of SMC methods.

* A. Lee & N.Whitely, Forest resampling for distributed SMC, Statistical Analysis and Data Mining, 2015 Pdf
- SMC methods are not so easy to parallelize, as the resampling operation is a bottleneck. This paper provides practical and principled ways to perform resampling on a distributed computing architecture. Very neat!

SMC Methods on Trees, Partially Ordered Sets, Combinatorial Spaces

* L. Wang, A. Bouchard-Cote & A.D., Bayesian phylogenetic inference using a combinatorial SMC, JASA, 2015 Pdf
- Suppose you want to do SMC not for state-space models but, say, for non-clock trees or more generally on some combinatorial space. This paper presents a method to deal with this, relying on the existence of a partially ordered set (poset) structure.

SMC Smoothing

In the context of state-space models, it is often of interest to perform smoothing. Standard SMC approximations do provide an estimate of the joint smoothing distributions, but it is poor because of the aforementioned degeneracy problem. Specific smoothing procedures have been proposed to address this.

* A.D., S.J. Godsill & C. Andrieu, On Sequential Monte Carlo sampling methods for Bayesian filtering, Stat. Comp., 2000 Pdf  
- This paper describes an SMC implementation of forward filtering-backward smoothing to compute marginal smoothing distributions.

*  G. Kitagawa & S. Sato, Monte Carlo Smoothing and Self-organising State-Space Model. In SMC in Practice, 2001.
- This paper proposes to approximate fixed-interval marginal smoothing distributions by fixed-lag marginal smoothing distributions to reduce drastically the degeneracy problem.

* S.J. Godsill, A.D. & M. West, Monte Carlo Smoothing for Nonlinear Time Series, JASA, 2004 Pdf
- This paper describes SMC implementation of the forward filtering-backward sampling procedure to obtain samples approximately from the joint smoothing distribution, cost O(NT) per path for N particles and T data.
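A minimal sketch of this forward filtering-backward sampling recursion for a toy linear-Gaussian model (the model, constants and function name are illustrative; the paper's algorithm is more general):

```python
import numpy as np

rng = np.random.default_rng(5)

def ffbs_path(y, n=300):
    """One (approximate) draw from the joint smoothing distribution of
    x_t = 0.9 x_{t-1} + v_t, y_t = x_t + w_t, v_t, w_t ~ N(0, 1)."""
    particles, weights = [], []
    x = rng.normal(size=n)
    for obs in y:                                    # forward pass: bootstrap filter
        x = 0.9 * x + rng.normal(size=n)
        logw = -0.5 * (obs - x) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        particles.append(x.copy())
        weights.append(w)
        x = x[rng.choice(n, size=n, p=w)]
    T = len(y)
    path = np.empty(T)
    path[T - 1] = particles[-1][rng.choice(n, p=weights[-1])]
    for t in range(T - 2, -1, -1):                   # backward pass: reweight by the transition density
        logb = np.log(weights[t]) - 0.5 * (path[t + 1] - 0.9 * particles[t]) ** 2
        b = np.exp(logb - logb.max())
        b /= b.sum()
        path[t] = particles[t][rng.choice(n, p=b)]
    return path                                      # cost O(NT) per sampled path

y = rng.normal(size=20)
path = ffbs_path(y)
```

Each backward step is O(N), hence the O(NT) cost per path quoted above; sampling many paths independently repeats only the backward pass.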

* P. Del Moral, A.D. & S.S. Singh, Forward Smoothing using SMC, Technical report Cambridge University TR 638, Sept. 2009, revised 2010 Pdf.
- This paper describes an SMC implementation of forward filtering-backward smoothing to compute expectations of additive functionals that bypasses entirely the backward pass, presents theoretical results and applies it to on-line parameter estimation using on-line gradient and on-line EM.

* M. Briers, A.D. & S. Maskell, Smoothing Algorithms for State-space Models, Ann. Instit. Stat. Math., 2010 Pdf
- This paper presents a generalized version of the two-filter smoothing formula which can be readily implemented using SMC methods to compute marginal smoothing distributions and sample approximately from the joint. Direct implementation has complexity O(N^2.T) and proposed rejection sampling method yields O(N.T) (in compact spaces...) to compute marginals, O(T) to sample approximately from the joint.

* P. Fearnhead, D. Wyncoll & J. Tawn, A Sequential Smoothing Algorithm with Linear Computational Cost, Biometrika, 2010 Pdf
- This paper describes an importance sampling procedure to reduce the computational complexity of the direct SMC implementation of the generalized two-filter formula to O(N.T) to compute marginals.

* R. Douc, A. Garivier, E. Moulines & J. Olsson, Sequential Monte Carlo smoothing for general state space hidden Markov models, Ann. Applied Proba, 2011 Pdf
- This paper describes a rejection sampling method to implement forward filtering backward sampling, cost O(T) per path and presents various theoretical results.

* P. Guarniero, A.M. Johansen and A. Lee, The iterated auxiliary particle filter, JASA, 2017.
- This paper provides an iterative procedure to learn the backward information filter, which allows one to substantially reduce the degeneracy in important scenarios.

SMC for on-line Bayesian static parameter estimation in state-space models

It is tempting to do on-line Bayesian static parameter estimation in state-space models using SMC and MCMC moves. Unfortunately, all these methods suffer from the path degeneracy problems so should be used cautiously: definitely unreliable for big datasets and/or vague priors.

* P. Fearnhead, MCMC, sufficient statistics and particle filters, JCGS, 2002 Pdf
- Journal version of a chapter of the D.Phil. thesis of Fearnhead (1998), where it is proposed for the first time to use MCMC moves on static parameters to mitigate the degeneracy problem. It is clearly acknowledged that this does not entirely solve the problem; see the discussion.

* C. Andrieu, N. De Freitas and A.D., Sequential MCMC for Bayesian Model Selection, Proc. IEEE Workshop HOS, 1999 Pdf
- Presents an SMC algorithm for on-line Bayesian parameter estimation for autoregressive parameters with unknown order using reversible jump MCMC moves. It is explicitly mentioned at the end of the paper and demonstrated experimentally that such methods are bound to suffer from the degeneracy problem.

* G. Storvik, Particle filters for state-space models with the presence of unknown static parameters, IEEE Trans. Signal Processing, 2002 Pdf
- Proposes a more sophisticated version of Fearnhead's proposal.

* H.F. Lopes & R. S. Tsay, Particle Filters and Bayesian Inference in Financial Econometrics, J. Forecasting, 2010.
- Reviews at length the so-called particle learning, i.e. auxiliary particle filter with perfect adaptation and MCMC moves for static parameters, for on-line Bayesian parameter estimation, with detailed simulation results illustrating the degeneracy problem.

A pragmatic approach consists of adding an artificial dynamic noise on the static parameter:

* J. Liu & M. West, Combined parameter and state estimation in simulation-based filtering, in SMC in Practice, 2001. Pdf
- Introduce artificial dynamic noise on the static parameter and mitigate the variance inflation using a shrinkage procedure.
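Their shrinkage step is a one-liner: shrink each parameter particle towards the cloud mean before jittering, with a^2 + h^2 = 1, so that the first two moments of the particle cloud are preserved. A sketch for a scalar parameter (the constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# Liu & West's kernel shrinkage step for a static parameter: plain jitter
# would inflate the variance of the cloud at every time step; shrinking
# towards the mean first, with a^2 + h^2 = 1, keeps mean and variance fixed.
def liu_west_step(theta, a=0.98):
    h = np.sqrt(1 - a ** 2)
    m = theta.mean()
    v = theta.var()
    shrunk = a * theta + (1 - a) * m                 # shrink towards the cloud mean
    return shrunk + h * np.sqrt(v) * rng.normal(size=theta.size)  # then jitter

theta = rng.normal(2.0, 0.5, size=50_000)
new = liu_west_step(theta)

# Pure jitter with the same h would give variance (1 + h^2) * v instead.
```

This controls the variance inflation but, as noted above, does not remove the approximation introduced by the artificial dynamics.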

SMC for on-line and batch maximum likelihood inference of static parameter estimation in state-space models

As all the SMC procedures for on-line Bayesian inference suffer from the degeneracy problem, my colleagues and I have tried for many years to develop alternative methods which bypass it. If you are willing to be non-Bayesian about the parameter, this is possible. An early approach we considered uses a pseudo-likelihood; this yields an estimate which is not statistically efficient, but you do not even need particle filters in this case. Eventually, we came up with a forward-only implementation of the forward filtering-backward smoothing procedure: it is the key to obtaining stable algorithms for on-line maximum likelihood (ML) static parameter estimation in state-space models. For off-line approaches, all the smoothing approaches described previously can be, and have been, used.

* C. Andrieu, A.D. & V.B. Tadic, Online EM for parameter estimation in nonlinear-non Gaussian state-space models, Proc. IEEE CDC, 2005 Pdf
- This paper describes a pseudo-likelihood approach originally proposed by Ryden for finite HMMs, establishes theoretical results, and applies it to on-line parameter estimation where the pseudo-likelihood is maximized using on-line EM.

* J. Olsson, O. Cappe, R. Douc & E. Moulines, SMC Smoothing with Application to Parameter Estimation in Nonlinear State-Space Models, Bernoulli, 2008 Pdf
- This paper quantifies the bias introduced by the fixed-lag approximation of Kitagawa & Sato, presents numerical results and uses it for parameter estimation using off-line EM.

* P. Del Moral, A.D. & S.S. Singh, Forward Smoothing using SMC, Technical report Cambridge University TR 638, Sept. 2009, revised 2010 Pdf.
- This paper describes an SMC implementation of the forward filtering-backward smoothing to compute expectations of additive functionals that bypasses entirely the backward pass, presents theoretical results and applied it to on-line parameter estimation using on-line gradient.

* S. Malik & M.K. Pitt, Particle filters for continuous likelihood evaluation and maximisation, J. Econometrics, 2011 (slightly revised version of M.K. Pitt, Smoother particle filters for likelihood evaluation and maximisation, Technical report, 2002).
- Even if you fix the random seed in your particle filter, the resulting simulated likelihood function is discontinuous and can vary significantly for a moderate number of particles, as the resampling step is a discontinuous operation. Evaluating the MLE by maximizing this function is thus difficult. For a one-dimensional state, M.K. Pitt proposed a simple and efficient solution to this problem in 2002: just reorder the particles on the real line and perform a piecewise linear approximation of the resulting empirical CDF. You obtain a continuous simulated likelihood function.
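That reordering idea can be sketched for a scalar state as follows (the function name and the choice of fixed uniforms are mine): sort the particles, form the piecewise-linear empirical CDF of the weights, and invert it at fixed uniforms, so the resampled values vary continuously with the parameter.

```python
import numpy as np

# Continuous resampling for a scalar state: sort the particles, build the
# piecewise-linear empirical CDF of the weights, and invert it at fixed
# uniforms. With a fixed seed, the output is a continuous function of the
# particle locations, hence of the model parameter.
def continuous_resample(x, w, u):
    order = np.argsort(x)
    x, w = x[order], w[order]
    cdf = np.cumsum(w)
    return np.interp(u, cdf, x)        # piecewise-linear inverse-CDF interpolation

rng = np.random.default_rng(7)
x = rng.normal(size=100)
w = np.ones(100) / 100
u = (np.arange(100) + 0.5) / 100       # fixed (here stratified) uniforms
xr = continuous_resample(x, w, u)
```

The resampled values are a monotone, piecewise-linear function of the uniforms, which is what makes the simulated likelihood continuous in the parameter.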

* G. Poyiadjis, A.D. & S.S. Singh, Particle Approximations of the Score and Observed Information Matrix in State-Space Models with Application to Parameter Estimation, Biometrika, 2011 Pdf
- This paper describes an original approach to compute the score vector and OIM on-line that is more robust than the standard approach, it can be interpreted as a particular case of the forward smoothing and we show how it can be used for on-line ML parameter estimation. 

* J. Olsson & J. Westerborn, Efficient particle-based online smoothing in general hidden Markov models: the PaRIS algorithm, Bernoulli, 2016. Pdf
- This paper shows that you can bypass the O(N^2) computational complexity at each time step of the forward-only procedures in (Del Moral, D. & Singh, 2009; Poyiadjis, D. & Singh, 2011) by using a further Monte Carlo approximation based on rejection sampling.

* E.L. Ionides, A. Bhadra, Y. Atchade & A. King, Iterated Filtering, Annals of Statistics, 2011. Pdf
- This paper shows how one can compute an approximation of the score vector using a perturbed state-space model and applies it to batch ML parameter estimation. It is especially useful in scenarios where one does not have access to the expression of the transition kernel of the latent process.

* E.L. Ionides, D. Nguyen, Y. Atchade, S. Stoev and A.A. King, Inference for dynamic and latent variable models via iterated, perturbed Bayes maps, PNAS, 2015.
- An elegant iterative algorithm to perform batch ML estimation.

SMC for batch Bayesian static parameter estimation in state-space models

Obviously, for batch joint Bayesian state and parameter estimation, you can use MCMC methods. However, it can be difficult to design efficient algorithms. In contexts where one can only simulate the latent process but does not have access to the transition prior, standard MCMC algorithms simply fail. SMC can come to the rescue in these scenarios.

* Lee, D.S. & Chia, N.K.K, A particle algorithm for sequential Bayesian parameter estimation and model selection, IEEE Trans. Signal Proc., 2002.
- This paper proposes to use particle filters mixed with MCMC steps. MCMC steps are used to rejuvenate the whole state sequence and parameter, so this is not an on-line algorithm, as the complexity increases over time, but it can be used as an alternative to standard MCMC.

* C. Andrieu, A.D. & R. Holenstein, Particle Markov chain Monte Carlo for Efficient Numerical Simulation, in Monte Carlo and Quasi Monte Carlo Methods 2008, Lecture Notes in Statistics, Springer, pp. 45-60, 2009. Pdf  and Particle Markov chain Monte Carlo methods (with discussion), JRSS B, 2010 Pdf
- This paper shows that it is possible to build high-dimensional proposal distributions for MCMC using SMC, it can be used to develop algorithms to sample from the joint posterior distribution of states and parameters. NB: At the 101 level, you can establish the validity of the particle marginal Metropolis-Hastings on the parameter component from the unbiasedness of the marginal likelihood estimator. As pointed out in Section 5.1. of the paper, this is somewhat restrictive and misleading. In my opinion, the nicest part of the paper is the conditional SMC/particle Gibbs sampler bit which does not follow at all from the unbiasedness of the marginal likelihood estimator.
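The particle marginal Metropolis-Hastings part is simple to sketch: run a particle filter to obtain an unbiased likelihood estimate and plug it into an otherwise standard random-walk Metropolis-Hastings sampler. Below, the toy model, the flat prior on (-1, 1) and all tuning constants are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(8)

def log_lik_hat(theta, y, n=200):
    """Bootstrap filter estimate of log p(y | theta) for
    x_t = theta * x_{t-1} + v_t, y_t = x_t + w_t, unit noise."""
    x = rng.normal(size=n)
    ll = 0.0
    for obs in y:
        x = theta * x + rng.normal(size=n)
        logw = -0.5 * (obs - x) ** 2 - 0.5 * np.log(2 * np.pi)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())      # likelihood estimate is unbiased on the natural scale
        x = x[rng.choice(n, size=n, p=w / w.sum())]
    return ll

def pmmh(y, n_iter=200, step=0.1):
    theta = 0.5
    ll = log_lik_hat(theta, y)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        ll_prop = log_lik_hat(prop, y)
        # Flat prior on (-1, 1); accept with the usual MH ratio on the estimates.
        if abs(prop) < 1 and np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)

# Data simulated with theta = 0.8.
T = 30
xs = np.zeros(T)
xp = 0.0
for t in range(T):
    xp = 0.8 * xp + rng.normal()
    xs[t] = xp
y = xs + rng.normal(size=T)
chain = pmmh(y)
```

Despite the likelihood being only estimated, the chain targets the exact posterior; this is the pseudo-marginal argument referred to above (and, as noted, only part of the story told in the paper).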

* N. Whiteley, C. Andrieu & A.D., Efficient Bayesian Inference for Switching State-Space Models using Discrete Particle Markov Chain Monte Carlo methods, Technical report no. 1004 Department of Mathematics Bristol University 2010 Pdf
- This paper shows how the discrete particle filter of Fearnhead and Clifford (2003) can be used within MCMC, also presents an original backward sampling procedure in a non-Markovian framework which is an extension of the procedure originally proposed by Whiteley in the discussion of the particle MCMC paper.

* N. Chopin, P. Jacob & O. Papaspiliopoulos. SMC^2: A SMC algorithm with particle MCMC updates. JRSS B, 2013 Pdf
- This paper substitutes an SMC algorithm for the MCMC algorithm used in the particle MCMC paper, yielding a hierarchical SMC algorithm. This is a powerful algorithm for sequential inference, although not a truly on-line one, as the complexity increases over time.

* A. Fulop & J. Li, Robust and Efficient Learning: A Marginalized Resample-Move Approach, J. Econometrics, 2013. Pdf
- The authors proposed independently the same algorithm as SMC^2.

* F. Lindsten, M.I. Jordan and T.B. Schon, Particle Gibbs with Ancestor Sampling, JMLR, 2014. Pdf
- Proposes an interesting variant of the particle Gibbs sampler where ancestors are resampled in a forward pass. 

* A.D., M.K. Pitt, G. Deligiannidis and R. Kohn, Efficient Implementation of Markov chain Monte Carlo when Using an Unbiased Likelihood, Biometrika, 2015. Pdf
- When implementing the particle Metropolis-Hastings algorithm, and more generally any pseudo-marginal algorithm, how many particles should one use to evaluate the likelihood so as to minimise the asymptotic variance of the resulting MCMC estimates for a fixed computational budget? This paper provides useful guidelines.

* N. Chopin & S.S. Singh. On the particle Gibbs sampler, Bernoulli, 2015 Pdf
- Establishes uniform ergodicity of the particle Gibbs sampler through a coupling argument and discusses some algorithmic variants.

* S.S. Singh, F. Lindsten & E. Moulines, Blocking strategies and stability of particle Gibbs samplers, Preprint 2015 Pdf
- Assume you want to sample the posterior distribution of a very long hidden state sequence. This paper shows that you can use a blocked version of particle Gibbs such that the cost per iteration only grows linearly with the number of latent states while the mixing rate of the sampler does not deteriorate. Additionally, the algorithm is easily parallelizable.

* N. Whiteley and A. Lee, Twisted particle filters, Annals of Statistics, 2014 Pdf 
- Introduces a new class of particle filters which can provide significantly lower variance estimates of the normalizing constant at the cost of a simple algorithmic modification. It relies on a change of measure on the particle system. 

SMC as an alternative/complement to MCMC

The idea of mixing SMC and MCMC to sample from a sequence of distributions all defined on the same space has appeared independently in various papers.

* G.E. Crooks, Nonequilibrium Measurements of Free Energy Differences for Microscopically Reversible Markovian Systems, J. Stat. Phys., 1998. Pdf
- This paper presents a discrete-time version of the celebrated Jarzynski's equality in statistical physics (and slight generalization of it) showing that a sequence of non-homogeneous MCMC kernels "moving" towards a target can be reweighted using a clever importance sampling trick based on some reverse Markov kernels to obtain an unbiased estimator of the normalizing constant and an approximation of the target of interest.  Not quite SMC as no resampling is used.

* W. Gilks & C. Berzuini, Following a Moving Target: Monte Carlo Inference for Dynamic Bayesian Models, JRSS B, 2001
- This paper proposes to move the particles using MCMC moves with the appropriate invariant distribution so as to introduce diversity among particles. It was discussed in the framework of state-space models with static parameters. If you don't include any state, this gives you a method for sampling the sequence of posteriors of the parameter. It also introduces independently the same construction as Crooks based on a sequence of reverse Markov kernels (this is hidden in the proof; see Lemma 1 and Corollary 1).

* R.M. Neal, Annealed Importance Sampling, Stat. Comp., 2001. Pdf
- This paper independently introduced a discrete-time version of the celebrated Jarzynski equality in statistical physics based on a sequence of reverse Markov kernels and carefully discusses some applications to Bayesian inference with a sequence of annealed target distributions. Not quite SMC as no resampling is used. In this context, increasing the number of annealed auxiliary target distributions can prevent standard weight degeneracy.

* N. Chopin, A Sequential Particle Filter Method for Static Models, Biometrika, 2002.
- This paper details a careful application of the resample move algorithm for sampling from the sequence of posterior distributions of a static parameter.

* P. Del Moral, A.D. & A. Jasra, Sequential Monte Carlo Samplers, JRSS B, 2006. Pdf
- This paper discusses a framework generalizing annealed importance sampling and resample-move, discusses the potential benefits of resampling when the sequence of targets evolves quickly, and covers scenarios where previous methods are not applicable.

* J. Heng, A. Bishop, G. Deligiannidis & A.D., Controlled Sequential Monte Carlo. Annals of Statistics 2020. Pdf
- This paper shows how one can significantly improve the performance of annealed importance sampling and SMC samplers in important scenarios by developing an iterative mechanism to correct the discrepancy between the proposal and the target. It relies on the generalized version of the Crooks identity presented in Del Moral et al. 2006. Can also be used for smoothing in state-space models.

SMC meets Machine Learning

Using the fact that the SMC estimator of the marginal likelihood is unbiased, one can obtain tighter variational lower bounds which can be used to train complex auto-encoders.

* T.A. Le et al., Auto-Encoding Sequential Monte Carlo, Pdf

* C. Maddison et al., Filtering Variational Objectives, Proc. NIPS, 2017. Pdf

* C.A. Naesseth et al., Variational Sequential Monte Carlo, Pdf
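The mechanism behind these bounds, in a toy setting of my own (model and numbers are illustrative): since an unbiased estimator Z_hat of the marginal likelihood satisfies E[log Z_hat] <= log Z by Jensen's inequality, the log of the estimator is a stochastic lower bound which tightens as more samples are used.

```python
import numpy as np

rng = np.random.default_rng(10)

# Toy model: z ~ N(0,1), y | z ~ N(z,1), so p(y) = N(y; 0, 2) is known.
# Importance sampling from the prior gives an unbiased estimate of p(y);
# averaging its log over repetitions estimates the variational lower bound.
y = 1.5
log_p_y = -0.25 * y ** 2 - 0.5 * np.log(4 * np.pi)    # exact log N(1.5; 0, 2)

def bound(K, reps=20_000):
    z = rng.normal(size=(reps, K))                    # K prior samples per estimate
    log_lik = -0.5 * (y - z) ** 2 - 0.5 * np.log(2 * np.pi)
    m = log_lik.max(axis=1, keepdims=True)            # stable log-mean-exp over K
    log_Z_hat = (m + np.log(np.exp(log_lik - m).mean(axis=1, keepdims=True))).ravel()
    return log_Z_hat.mean()                           # Monte Carlo estimate of E[log Z_hat]

b1, b16 = bound(1), bound(16)                         # bound with 1 vs 16 samples
```

Here b1 is the standard evidence lower bound and b16 is strictly tighter; replacing plain importance sampling by an SMC estimator gives the bounds used in the three papers above.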

How to make Particle Filters differentiable so as to include them in Machine Learning pipelines.

* A. Corenflos, J. Thornton, G. Deligiannidis and A. Doucet, Differentiable Particle Filtering via Entropy-Regularized Optimal Transport, Proc. ICML 2021 Pdf

* J. Lai, J. Domke and D. Sheldon. Variational Marginal Particle Filters, Proc. AISTATS 2022 Pdf

How to combine Sequential Monte Carlo samplers with Normalizing Flows for improved performance.

* M. Arbel, A.G.D.G. Matthews and A. Doucet, Annealed Flow Transport Monte Carlo, Proc. ICML 2021 Pdf

* A.G.D.G. Matthews, M. Arbel, D. Rezende and A. Doucet, Continual Repeated Annealed Flow Transport Monte Carlo, Proc. ICML 2022 Pdf

Recent Convergence Results of Interest

Besides Del Moral's books (2004, 2013), here are a few papers of interest essentially weakening the assumptions in the aforementioned books.

* N. Whiteley, Stability properties of particle filters, Annals Applied Proba, 2013. Pdf
- Weak(er) assumptions ensuring uniform convergence of particle filters, requires bounded observations.

* R. Douc, E. Moulines & J. Olsson, Long-term stability of SMC methods under verifiable conditions, Annals Applied Proba, to appear. Pdf
- Other weak(er) assumptions ensuring uniform stability of the variance for particle filters.

* J. Berard, P. Del Moral & A.D., A log-normal central limit theorem for particle approximations of normalizing constants, Electronic J. Proba, 2014 Pdf.
- Standard CLT for particle methods assume the time horizon T is fixed and the number N of particles goes to infinity. However, in practice people usually scale N linearly with T when estimating normalizing constants/marginal likelihoods. This paper establishes a CLT for the resulting estimate as T goes to infinity. The normalizing constant estimate divided by the true normalizing constant converges towards a log-normal distribution.

What about SMC in high dimensional spaces?

* R. Van Handel, Can particle filters beat the curse of dimensionality?  Pdf
- The answer is yes if you are ready to accept some bias, but the current assumptions ensuring this are extremely strong; or no bias, but then the rate of convergence is slower than the usual Monte Carlo rate.

* A. Beskos, D. Crisan & A. Jasra, On the stability of SMC in high dimensions. Annals Applied Proba. Pdf
- Looks at the stability of SMC samplers in high dimensions for i.i.d. targets. Things do not go exponentially bad, but only quadratically with the dimension, if your MCMC kernels mix well.

Slides and Videolectures

* Video of my lectures at Machine Learning Summer School 2007.

* Video lectures: Tutorial at NIPS 2009 by A.D. & N. De Freitas

* Slides of P. Del Moral for Machine Learning Summer School 2008.

* Slides of T. Schon, Linkoping 2012.

* Video of my talk at French Statistical Meeting on particle MCMC (in French!).

* Slides of my 8 hours course Statistical Mathematics summer school 2011 (to be posted). This is a much updated version of a course I gave in SAMSI in Fall 2008.

* Slides of my course at Machine Learning Summer School 2012

* Slides of P. Fearnhead for a graduate course in computational statistics 2012


Software

* LibBi by L. Murray (Oxford): powerful C++ template library + parser & compiler in Perl to perform particle filters, PMCMC and SMC^2 at lightspeed, works for multi-core CPU and GPU too.

* Probabilistic C by B. Paige & F. Wood (Oxford): compilation target for probabilistic programming languages, includes SMC and PMCMC.

* Anglican by F. Wood et al. (Oxford): open source, compiled probabilistic programming language, includes PMCMC.

* Team INRIA Alea: Biips, Bayesian Inference with Interacting Particle Systems. Great BUGS/JAGS-type software for SMC methods.

* Gen: general purpose probabilistic programming including particle methods by MIT Probabilistic Computing Project.

* T. Brown: C++ library for fast particle filtering: Library

* D. Creal: Matlab code and Ox code for all the examples in his recent tutorial paper (see above)

* N. De Freitas: Matlab code for Rao-Blackwellized particle filters and Unscented particle filter.

* P. Fearnhead: R code for particle filters and particle Gibbs sampler.

* P. Jacob: Python packages for particle Marginal Metropolis-Hastings and SMC^2.

* A. Jasra: C++ code for SMC samplers examples.

* A.M. Johansen: C++ Sequential Monte Carlo template link and an R interface to this package by D. Eddelbuettel link

* A. King, E.L. Ionides et al.: R package for statistical inference for partially observed models; includes bootstrap filter, iterated filtering and particle Marginal Metropolis-Hastings.

* A. Lee et al.: GPU code for particle filters and SMC samplers link and paper.

* F. Lindsten: Matlab code for particle Marginal Metropolis-Hastings and particle Gibbs with ancestor sampling link

* D. Rasmussen: Matlab code for particle Marginal Metropolis-Hastings implemented in this 2011 PLOS Comp. Biology paper.

* T. Schon: Matlab code for Rao-Blackwellised particle filter and EM for parameter estimation.

* D. Wilkinson: R code corresponding to the second edition of his cool book, includes particle Marginal Metropolis-Hastings (chapter 10).