Hello,
just a quick post to announce that particles now implements several of the smoothing algorithms introduced in our recent paper with Dang on the complexity of smoothing algorithms. Here is a plot that compares their running time for a given number of particles:
All these algorithms are based on FFBS (forward filtering backward smoothing). The first two are not new. O(N^2) FFBS is the classical FFBS algorithm, which has complexity O(N^2).
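To fix ideas, here is a minimal sketch (not the actual particles implementation) of one backward step of classical FFBS, for a hypothetical AR(1) Gaussian transition; the O(N^2) cost comes from performing the N density evaluations below for each of the N smoothed trajectories:

```python
import numpy as np

rng = np.random.default_rng(0)

def transition_density(x_next, x):
    # toy AR(1) transition x_next ~ N(0.9 x, 1); an illustrative choice
    return np.exp(-0.5 * (x_next - 0.9 * x) ** 2)

def ffbs_backward_step(x, w, x_next):
    """Sample the ancestor of smoothed particle x_next among the filtering
    particles x (weights w): O(N) density evaluations per trajectory."""
    bw = w * transition_density(x_next, x)  # backward weights
    bw /= bw.sum()
    return rng.choice(len(x), p=bw)

N = 100
x = rng.normal(size=N)        # filtering particles at time t
w = np.full(N, 1.0 / N)       # filtering weights
idx = ffbs_backward_step(x, w, x_next=0.5)
print(idx)  # index of the chosen ancestor
```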
FFBS-reject uses (pure) rejection to choose the ancestors in the backward step. In our paper, we explain that the running time of FFBS-reject is random, and may even have infinite variance. Notice how large the corresponding boxes are, and the large number of outliers.
To alleviate this issue, we introduced two new FFBS algorithms: FFBS-hybrid tries to use rejection, but stops after N failed attempts (and then switches to the more expensive, exact method). FFBS-MCMC simply uses a (single) MCMC step.
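The two variants can be sketched as follows; again this is only an illustrative toy (a hypothetical Gaussian transition, with known upper bound C on its density), not the code shipped in particles:

```python
import numpy as np

rng = np.random.default_rng(1)

def trans(x_next, x):
    # toy transition density N(0.9 x, 1), bounded above by C below
    return np.exp(-0.5 * (x_next - 0.9 * x) ** 2) / np.sqrt(2 * np.pi)

C = 1.0 / np.sqrt(2 * np.pi)  # sup of the transition density

def hybrid_ancestor(x, w, x_next, max_tries):
    """Rejection with a cap: after max_tries failures, fall back to the
    exact O(N) multinomial draw from the backward weights."""
    for _ in range(max_tries):
        i = rng.choice(len(x), p=w)
        if rng.uniform() < trans(x_next, x[i]) / C:
            return i
    bw = w * trans(x_next, x)
    return rng.choice(len(x), p=bw / bw.sum())

def mcmc_ancestor(x, w, x_next, i_curr):
    """Single independent-Metropolis step, started from the current
    ancestor index: deterministic O(1) cost per trajectory."""
    j = rng.choice(len(x), p=w)
    accept = rng.uniform() < trans(x_next, x[j]) / trans(x_next, x[i_curr])
    return j if accept else i_curr

N = 100
x = rng.normal(size=N)
w = np.full(N, 1.0 / N)
i1 = hybrid_ancestor(x, w, x_next=0.5, max_tries=N)
i2 = mcmc_ancestor(x, w, x_next=0.5, i_curr=0)
print(i1, i2)
```

The MCMC step keeps the overall cost deterministic because each trajectory does a fixed, constant amount of work, at the price of a small (and, as argued in the paper, asymptotically negligible) bias in the selected ancestor.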
Clearly, these two variants run faster, but FFBS-MCMC takes the cake. This has to do with the inherent difficulty of implementing rejection sampling efficiently in Python. I will blog about that point later (hopefully soon). Also, the running time of FFBS-MCMC is deterministic, and is O(N).
That’s it. If you want to know more about these algorithms (and other smoothing algorithms), have a look at the paper. The script that generated the plot above is also available in particles (in folder papers/complexity_smoothing). This experiment was carried out on essentially the same model as the example in the chapter on smoothing in the book. I should add that, in this experiment, the Monte Carlo variance of the output is essentially the same for the four algorithms (so comparing them solely in terms of CPU time is fair).