particles

Sequential Monte Carlo in python

https://github.com/nchopin/particles

Science Score: 46.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: springer.com
  • Committers with academic emails
    1 of 9 committers (11.1%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.3%) to scientific vocabulary

Keywords

bayesian-inference kalman-filter particle-filter pmcmc quasi-monte-carlo sequential-monte-carlo smc2
Last synced: 6 months ago

Repository

Sequential Monte Carlo in python

Basic Info
  • Host: GitHub
  • Owner: nchopin
  • License: MIT
  • Language: Python
  • Default Branch: master
  • Size: 5.17 MB
Statistics
  • Stars: 465
  • Watchers: 14
  • Forks: 79
  • Open Issues: 12
  • Releases: 4
Topics
bayesian-inference kalman-filter particle-filter pmcmc quasi-monte-carlo sequential-monte-carlo smc2
Created over 7 years ago · Last pushed 6 months ago
Metadata Files
Readme Changelog Contributing License

README.md


particles

Sequential Monte Carlo in python.

Motivation

This package was developed to complement the following book:

An introduction to Sequential Monte Carlo

by Nicolas Chopin and Omiros Papaspiliopoulos.

It now also implements algorithms and methods introduced after the book was published; see below.

Features

  • particle filtering: bootstrap filter, guided filter, APF.

  • resampling: multinomial, residual, stratified, systematic and SSP.

  • possibility to define state-space models using some (basic) form of probabilistic programming; see below for an example.

  • SQMC (sequential quasi-Monte Carlo), including routines for computing the Hilbert curve and for generating RQMC sequences.

  • FFBS (forward filtering backward sampling): standard, O(N^2) variant, and faster variants based on either MCMC, pure rejection, or the hybrid scheme; see Dau & Chopin (2022) for a discussion. The QMC version of Gerber and Chopin (2017, Bernoulli) is also implemented.

  • other smoothing algorithms: fixed-lag smoothing, on-line smoothing, two-filter smoothing (O(N) and O(N^2) variants).

  • Exact filtering/smoothing algorithms: Kalman (for linear Gaussian models) and forward-backward recursions (for finite hidden Markov models).

  • Standard and waste-free SMC samplers: SMC tempering, IBIS (a.k.a. data tempering). SMC samplers for binary words (Schäfer and Chopin, 2014), with application to variable selection.

  • Bayesian parameter inference for state-space models: PMCMC (PMMH, Particle Gibbs) and SMC^2.

  • Basic support for parallel computation (i.e. running multiple SMC algorithms on different CPU cores).

  • Variance estimators (Chan and Lai, 2013; Lee and Whiteley, 2018; Olsson and Douc, 2019).

  • nested sampling: both the vanilla version and the SMC sampler of Salomone et al. (2018).
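To give a concrete feel for one of the resampling schemes listed above, here is a minimal pure-Python sketch of systematic resampling (a didactic illustration under the standard definition of the scheme, not the library's vectorised implementation; the function name is ours):

```python
import random

def systematic_resample(weights):
    """Systematic resampling: one uniform draw, N stratified points.

    A single uniform u is spread over N evenly spaced points in [0, 1);
    particle j is selected once for every point falling in its
    cumulative-weight interval.
    """
    n = len(weights)
    total = sum(weights)
    u = random.random()
    positions = [(u + i) / n for i in range(n)]  # stratified points in [0, 1)
    # Cumulative sums of the normalised weights
    cumsum, acc = [], 0.0
    for w in weights:
        acc += w / total
        cumsum.append(acc)
    # Walk the points and the cumulative weights together
    indices, j = [], 0
    for pos in positions:
        while j < n - 1 and cumsum[j] < pos:
            j += 1
        indices.append(j)
    return indices
```

Compared with multinomial resampling, this uses a single random draw for all N ancestors, which typically lowers the resampling variance; the returned indices are nondecreasing by construction.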

Example

Here is how you may define a parametric state-space model:

```python
import particles
import particles.state_space_models as ssm
import particles.distributions as dists

class ToySSM(ssm.StateSpaceModel):
    def PX0(self):  # Distribution of X_0
        return dists.Normal()  # X_0 ~ N(0, 1)

    def PX(self, t, xp):  # Distribution of X_t given X_{t-1}
        return dists.Normal(loc=xp)  # X_t ~ N(X_{t-1}, 1)

    def PY(self, t, xp, x):  # Distribution of Y_t given X_t (and X_{t-1})
        return dists.Normal(loc=x, scale=self.sigma)  # Y_t ~ N(X_t, sigma^2)
```

You may now choose a particular model within this class, and simulate data from it:

```python
my_model = ToySSM(sigma=0.2)
x, y = my_model.simulate(200)  # sample size is 200
```

To run a bootstrap particle filter for this model and data y, simply do:

```python
alg = particles.SMC(fk=ssm.Bootstrap(ssm=my_model, data=y), N=200)
alg.run()
```
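For intuition about what the bootstrap filter computes, the algorithm amounts to a weight/resample/propagate loop over the observations. The sketch below is a self-contained plain-Python illustration of that loop for this toy model (the function name is ours, and it uses multinomial resampling via `random.choices` for brevity; it is not the library's implementation):

```python
import math
import random

def normal_logpdf(y, loc, scale):
    """Log-density of N(loc, scale^2) at y."""
    return -0.5 * math.log(2 * math.pi * scale**2) - (y - loc)**2 / (2 * scale**2)

def bootstrap_filter(data, n=200, sigma=0.2):
    """Bootstrap filter for X_0 ~ N(0,1), X_t ~ N(X_{t-1},1), Y_t ~ N(X_t, sigma^2).

    Returns an estimate of the log-likelihood of the data.
    """
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]  # X_0 ~ N(0, 1)
    log_lik = 0.0
    for y in data:
        # Weight each particle by the observation density (log-sum-exp trick)
        logw = [normal_logpdf(y, x, sigma) for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        log_lik += m + math.log(sum(w) / n)  # incremental log-likelihood term
        # Resample proportionally to the weights (multinomial, for brevity)
        xs = random.choices(xs, weights=w, k=n)
        # Propagate through the state dynamics: X_t ~ N(X_{t-1}, 1)
        xs = [random.gauss(x, 1.0) for x in xs]
    return log_lik
```

The log-likelihood accumulation is why particle filters double as estimators of the marginal likelihood, which is what PMCMC and SMC^2 (listed in the features above) build on.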

That's it! Head to the documentation for more examples, explanations, and installation instructions. (It is strongly recommended to start with the tutorials, to get a general idea of what you can do, and how.)

Who do I talk to?

Nicolas Chopin (nicolas.chopin@ensae.fr) is the main author, contributor, and person to blame if things do not work as expected.

Bug reports, feature requests, questions, rants, etc. are welcome, preferably on the GitHub page.

Owner

  • Name: Nicolas Chopin
  • Login: nchopin
  • Kind: user
  • Location: Paris
  • Company: ENSAE, Institut Polytechnique de Paris

Professor of Data Sciences at ENSAE, Paris

GitHub Events

Total
  • Issues event: 9
  • Watch event: 55
  • Issue comment event: 10
  • Push event: 5
  • Pull request event: 10
  • Fork event: 5
Last Year
  • Issues event: 9
  • Watch event: 55
  • Issue comment event: 10
  • Push event: 5
  • Pull request event: 10
  • Fork event: 5

Committers

Last synced: over 2 years ago

All Time
  • Total Commits: 317
  • Total Committers: 9
  • Avg Commits per committer: 35.222
  • Development Distribution Score (DDS): 0.139
Past Year
  • Commits: 96
  • Committers: 2
  • Avg Commits per committer: 48.0
  • Development Distribution Score (DDS): 0.24
Top Committers
Name Email Commits
Nicolas Chopin n****n@e****r 273
Virgile Andreani a****a@u****r 23
AdrienCorenflos a****s@g****m 7
CHOPIN Nicolas c****n@l****r 7
Dau Hai Dang h****u@p****u 2
Max Cohen m****i@p****e 2
Finn Catling f****g@g****m 1
Jon Wiersma g****n@j****g 1
Jacob Louis Hoover p****m 1

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 65
  • Total pull requests: 37
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 15 days
  • Total issue authors: 32
  • Total pull request authors: 13
  • Average comments per issue: 2.57
  • Average comments per pull request: 1.14
  • Merged pull requests: 20
  • Bot issues: 0
  • Bot pull requests: 1
Past Year
  • Issues: 8
  • Pull requests: 8
  • Average time to close issues: 11 days
  • Average time to close pull requests: about 1 month
  • Issue authors: 7
  • Pull request authors: 3
  • Average comments per issue: 2.38
  • Average comments per pull request: 0.38
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • nchopin (12)
  • hai-dang-dau (6)
  • AdrienCorenflos (6)
  • MaurusGubser (5)
  • MercuryBench (3)
  • vwiela (3)
  • jorgemcgomes (3)
  • diolegend (2)
  • CaesarMordred (1)
  • williamzhao1999 (1)
  • draktr (1)
  • glandfried (1)
  • finncatling (1)
  • lionfish0 (1)
  • SchroederAdrian (1)
Pull Request Authors
  • MercuryBench (14)
  • AdrienCorenflos (12)
  • Armavica (5)
  • G-Kossi (4)
  • SchroederAdrian (2)
  • finncatling (2)
  • afcarzero1 (1)
  • hai-dang-dau (1)
  • maxjcohen (1)
  • jonjonw (1)
  • sakira (1)
  • dependabot[bot] (1)
  • postylem (1)
Top Labels
Issue Labels
bug (7) enhancement (4)
Pull Request Labels
dependencies (1)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 1,688 last month
  • Total dependent packages: 1
  • Total dependent repositories: 5
  • Total versions: 4
  • Total maintainers: 1
pypi.org: particles

Sequential Monte Carlo in Python

  • Versions: 4
  • Dependent Packages: 1
  • Dependent Repositories: 5
  • Downloads: 1,688 Last month
Rankings
Dependent packages count: 4.8%
Dependent repos count: 6.6%
Average: 7.9%
Downloads: 12.4%
Maintainers (1)
Last synced: 6 months ago

Dependencies

pyproject.toml pypi
  • joblib *
  • numba *
  • numpy >= 1.18, < 2
  • scikit-learn *
  • scipy >= 1.7