Thursday, October 30, 2025

what’s new, what’s next (your feedback most welcome) – Statisfaction


I’ve just released version 0.2 of my SMC python library, particles. I list below the main changes, and discuss some ideas for the future of the library.

A new module implements various variance estimators that can be computed from a single run of an SMC algorithm, à la Chan and Lai (2013) and Lee and Whiteley (2018). For more details, see this notebook.
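To make the construction concrete, here is a self-contained numpy sketch of a genealogy-based single-run estimator in the spirit of Chan and Lai (2013). The toy model, the function names, and every implementation detail below are illustrative assumptions, not the module’s actual API.

```python
import numpy as np

def smc_with_eves(ys, N=500, seed=1):
    # Toy bootstrap filter for X_t = 0.9 X_{t-1} + noise, Y_t = X_t + noise,
    # which also tracks each particle's "eve": the index of its time-0 ancestor.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)
    eve = np.arange(N)                 # at time 0, each particle is its own eve
    w = np.full(N, 1.0 / N)
    for t, y in enumerate(ys):
        if t > 0:
            idx = rng.choice(N, size=N, p=w)   # multinomial resampling
            x, eve = x[idx], eve[idx]          # eves travel with the particles
        x = 0.9 * x + 0.5 * rng.standard_normal(N)   # propagate
        logw = -0.5 * (y - x) ** 2                   # reweight (Gaussian lik.)
        w = np.exp(logw - logw.max())
        w /= w.sum()
    return x, w, eve

def single_run_var(phi_x, w, eve):
    # Group the weighted deviations by eve index, then sum the squared
    # per-family totals: an estimator of the variance of sum(w * phi_x).
    phi_hat = np.sum(w * phi_x)
    dev = w * (phi_x - phi_hat)
    per_family = np.bincount(eve, weights=dev, minlength=len(w))
    return np.sum(per_family ** 2)

ys = np.sin(np.arange(10) / 2.0)       # synthetic observations
x, w, eve = smc_with_eves(ys)
v = single_run_var(x, w, eve)
```

One caveat worth noting: when all surviving particles descend from a single eve (ancestral degeneracy), this kind of estimator collapses to zero, which is part of the motivation for the lag-based variants of Lee and Whiteley (2018).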

A new module, datasets, makes it easier to load the datasets included in the package. Here’s a quick example:

from particles import datasets as dts

dataset = dts.Pima()
help(dataset)  # basic info on the dataset
help(dataset.preprocess)  # how the data were pre-processed
data = dataset.data  # typically a numpy array

The library makes it possible to run several SMC algorithms in parallel, using the multiprocessing module. Hai-Dang Dau noticed a performance issue with the previous implementation (a few cores could stay idle) and fixed it.
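The pattern is the usual one for embarrassingly parallel runs: farm independent, separately seeded jobs over a pool of worker processes. The sketch below uses a stand-in Monte Carlo job, not particles’ actual parallel interface.

```python
import multiprocessing as mp
import numpy as np

def one_run(seed):
    # Stand-in for one independent SMC run: a plain Monte Carlo estimate
    # of E[X^2] for X ~ N(0, 1); each run gets its own seed.
    rng = np.random.default_rng(seed)
    return (rng.standard_normal(10_000) ** 2).mean()

# The "fork" start method keeps this snippet guard-free on Unix; with the
# default start method, wrap the Pool usage in `if __name__ == "__main__":`.
ctx = mp.get_context("fork")
seeds = [11, 22, 33, 44]                 # one distinct seed per run
with ctx.Pool(processes=2) as pool:      # cap the number of worker processes
    results = pool.map(one_run, seeds)
```

The idle-core issue mentioned above is exactly the kind of thing that depends on how jobs are chunked and dispatched to such a pool.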

While testing the new version, I noticed that the function distinct_seeds (in module utils), which, as the name suggests, generates distinct random seeds for the processes run in parallel, could be very slow in certain cases. I changed the way the seeds are generated to fix the problem (using stratified resampling). I’ll discuss this in more detail in a separate blog post.
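I don’t reproduce the library’s code here, but the stratified idea can be sketched as follows: partition the seed range into as many strata as seeds needed and draw one value per stratum, so collisions are impossible by construction and no rejection loop is needed (the function name below is hypothetical).

```python
import numpy as np

def distinct_seeds_sketch(k, max_seed=2**32, rng=None):
    # Split [0, max_seed) into k disjoint integer strata of equal width and
    # draw one seed uniformly inside each stratum. Seeds from different
    # strata can never collide, so distinctness is guaranteed up front,
    # however large k is -- no retry loop, no worst case.
    rng = np.random.default_rng() if rng is None else rng
    width = max_seed // k                    # integer stratum width
    offsets = rng.integers(0, width, size=k) # one draw per stratum
    return np.arange(k, dtype=np.int64) * width + offsets

seeds = distinct_seeds_sketch(100_000, rng=np.random.default_rng(3))
```

Compare with the naive approach (draw seeds at random and reject duplicates), whose cost blows up as the number of requested seeds grows relative to the effective range.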

Development of this library is partly driven by interactions with users. For instance, the next version will have a more general MvNormal distribution (allowing for a covariance matrix that varies across particles), because a colleague got in touch and needed that feature.
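To illustrate what “a covariance matrix that varies across particles” means in practice, here is a plain numpy sketch (not the library’s MvNormal API) that samples one d-dimensional Gaussian per particle via a batched Cholesky factorization.

```python
import numpy as np

rng = np.random.default_rng(7)
N, d = 200, 3                                    # particles, dimension

means = rng.normal(size=(N, d))                  # one mean per particle
A = rng.normal(size=(N, d, d))
covs = A @ A.transpose(0, 2, 1) + d * np.eye(d)  # N SPD covariance matrices

L = np.linalg.cholesky(covs)                 # batched: one factor per particle
z = rng.standard_normal((N, d))
x = means + np.einsum("nij,nj->ni", L, z)    # x[n] ~ N(means[n], covs[n])
```

numpy’s stacked linear algebra (cholesky over the leading axis) makes this a one-liner per operation, with no Python loop over particles.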

So don’t be shy: if you can’t see how to do something with particles, please get in touch. Our interaction will likely help me either improve the documentation or add new, useful features. Of course, I also welcome direct contributions (through pull requests)!

Otherwise, I have several ideas for future releases but, for the next one, it’s likely I’ll focus on the following two areas.

SMC samplers

My priority #1 is to implement waste-free SMC in the package, following our recent paper with Dang. (Dang has already released his own implementation, which is built on top of particles but, given that waste-free SMC seems to offer better performance than standard SMC samplers, it seems important to have it available in particles.)

Once this is done, I plan to add several important applications of SMC samplers, such as:

I also plan to document SMC samplers a bit better.

integration with PyTorch (or JAX, TensorFlow, etc.)

Python libraries such as TensorFlow, PyTorch, and JAX are all the rage in machine learning. They give access to very fancy stuff, such as auto-differentiation and computation on the GPU.

I’ve started to play a bit with PyTorch, and even have a working implementation of a particle filter that runs entirely on the GPU. The idea is to make the core components of particles completely independent of numpy; that way, one could use PyTorch tensors to store the particles and their weights. This is very much work in progress.
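To give a flavour of that design, here is a toy bootstrap-filter step written only against a generic array module `xp` (everything here is a sketch of the idea, not particles’ internals). Run with `xp=np` it works on the CPU; the point is that a module exposing the same interface over GPU tensors could, in principle, be dropped in instead.

```python
import numpy as np

def bootstrap_step(xp, x, y, z, u, rho=0.9, sigma=0.5):
    # One bootstrap-filter step for the toy model
    #   X_t = rho * X_{t-1} + sigma * eps,   Y_t = X_t + eta,
    # using only the array module `xp` (plus pre-drawn randomness z, u).
    x = rho * x + sigma * z                   # propagate
    logw = -0.5 * (y - x) ** 2                # reweight (Gaussian likelihood)
    w = xp.exp(logw - logw.max())
    w = w / w.sum()
    cdf = xp.cumsum(w)
    idx = xp.searchsorted(cdf, u)             # multinomial resampling
    idx = xp.clip(idx, 0, x.shape[0] - 1)     # guard against round-off
    return x[idx]

rng = np.random.default_rng(42)
N = 1000
x = rng.standard_normal(N)
for y in [0.5, -0.2, 1.1]:
    x = bootstrap_step(np, x, y, rng.standard_normal(N), rng.random(N))
```

Passing the randomness in as arrays (z, u) keeps the step itself free of any numpy-specific RNG calls, which is the crux of making the core array-library agnostic.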
