{"id":13699460,"url":"https://github.com/nchopin/particles","last_synced_at":"2025-05-04T16:34:20.239Z","repository":{"id":37818108,"uuid":"157890438","full_name":"nchopin/particles","owner":"nchopin","description":"Sequential Monte Carlo in python","archived":false,"fork":false,"pushed_at":"2024-10-14T14:21:02.000Z","size":5446,"stargazers_count":409,"open_issues_count":10,"forks_count":75,"subscribers_count":15,"default_branch":"master","last_synced_at":"2024-10-29T22:54:01.253Z","etag":null,"topics":["bayesian-inference","kalman-filter","particle-filter","pmcmc","quasi-monte-carlo","sequential-monte-carlo","smc2"],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/nchopin.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-11-16T16:03:47.000Z","updated_at":"2024-10-29T18:31:15.000Z","dependencies_parsed_at":"2023-10-01T16:30:07.454Z","dependency_job_id":"dd61d9c5-c1f7-4e1b-bcee-95b197370b06","html_url":"https://github.com/nchopin/particles","commit_stats":{"total_commits":317,"total_committers":9,"mean_commits":35.22222222222222,"dds":"0.13880126182965302","last_synced_commit":"a760322e3e0f5e0888334373d784f52aec917b12"},"previous_names":[],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nchopin%2Fparticles","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nchopin%2Fparticles/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/re
positories/nchopin%2Fparticles/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nchopin%2Fparticles/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/nchopin","download_url":"https://codeload.github.com/nchopin/particles/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":224398825,"owners_count":17304661,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bayesian-inference","kalman-filter","particle-filter","pmcmc","quasi-monte-carlo","sequential-monte-carlo","smc2"],"created_at":"2024-08-02T20:00:33.789Z","updated_at":"2024-11-13T05:31:07.309Z","avatar_url":"https://github.com/nchopin.png","language":"Python","readme":"![logo](logo.png)\n\n# particles #\n\nSequential Monte Carlo in python. \n\n## Motivation ##\n\nThis package was developed to complement the following book:\n\n[An introduction to Sequential Monte Carlo](https://www.springer.com/gp/book/9783030478445)\n\nby Nicolas Chopin and Omiros Papaspiliopoulos. \n\nIt now also implements algorithms and methods introduced after the book was\npublished, see below. \n\n## Features ##\n\n* **particle filtering**: bootstrap filter, guided filter, APF.\n\n* **resampling**: multinomial, residual, stratified, systematic and SSP. \n\n* possibility to define **state-space models** using some (basic) form of \n  probabilistic programming; see below for an example. \n\n* **SQMC** (Sequential quasi Monte Carlo);  routines for computing the Hilbert curve, \n  and generating RQMC sequences. 
\n\n* **FFBS (forward filtering backward sampling)**: standard, O(N^2) variant, and\n  faster variants based on either MCMC, pure rejection, or the hybrid scheme;\n  see Dau \u0026 Chopin (2022) for a discussion. The QMC version of Gerber and\n  Chopin (2017, Bernoulli) is also implemented.\n\n* **other smoothing algorithms**: fixed-lag smoothing, on-line smoothing,\n  two-filter smoothing (O(N) and O(N^2) variants).  \n\n* Exact filtering/smoothing algorithms: **Kalman** (for linear Gaussian models) \n  and **forward-backward recursions** (for finite hidden Markov models).\n\n* **Standard and waste-free SMC samplers**: SMC tempering, IBIS (a.k.a. data\n  tempering). SMC samplers for binary words (Schäfer and Chopin, 2014), with\n  application to **variable selection**.\n\n* Bayesian parameter inference for state-space models: **PMCMC** (PMMH, Particle Gibbs) \n  and **SMC^2**. \n\n* Basic support for **parallel computation** (i.e. running multiple SMC algorithms \n  on different CPU cores). 
\n\n* **Variance estimators** (Chan and Lai, 2013; Lee and Whiteley, 2018; Olsson\n  and Douc, 2019).\n\n* **nested sampling**: both the vanilla version and the SMC sampler of Salomone\n  et al. (2018).\n\n## Example ##\n\nHere is how you may define a parametric state-space model: \n\n```python\nimport particles\nimport particles.state_space_models as ssm\nimport particles.distributions as dists\n\nclass ToySSM(ssm.StateSpaceModel):\n    def PX0(self):  # Distribution of X_0 \n        return dists.Normal()  # X_0 ~ N(0, 1)\n    def PX(self, t, xp):  # Distribution of X_t given X_{t-1}\n        return dists.Normal(loc=xp)  # X_t ~ N( X_{t-1}, 1)\n    def PY(self, t, xp, x):  # Distribution of Y_t given X_t (and X_{t-1}) \n        return dists.Normal(loc=x, scale=self.sigma)  # Y_t ~ N(X_t, sigma^2)\n```\n\nYou may now choose a particular model within this class, and simulate data from it:\n\n```python\nmy_model = ToySSM(sigma=0.2)\nx, y = my_model.simulate(200)  # sample size is 200\n```\n\nTo run a bootstrap particle filter for this model and data `y`, simply do:\n\n```python\nalg = particles.SMC(fk=ssm.Bootstrap(ssm=my_model, data=y), N=200)\nalg.run()\n```\n\nThat's it! Head to the\n[documentation](https://particles-sequential-monte-carlo-in-python.readthedocs.io/en/latest/)\nfor more examples, explanations, and installation instructions. (It is strongly\nrecommended to start with the\n[tutorials](https://particles-sequential-monte-carlo-in-python.readthedocs.io/en/latest/tutorials.html),\nto get a general idea of what you can do, and how.)\n\n\n## Who do I talk to? ##\n\nNicolas Chopin (nicolas.chopin@ensae.fr) is the main author, contributor, and \nperson to blame if things do not work as expected. \n\nBug reports, feature requests, questions, rants, etc. are welcome, preferably \non the GitHub page. \n","funding_links":[],"categories":["\u003cspan id=\"head30\"\u003e3.4. 
Bayesian Inference\u003c/span\u003e"],"sub_categories":["\u003cspan id=\"head33\"\u003e3.4.3. Data Assimilation (SMC, particles filter)\u003c/span\u003e"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnchopin%2Fparticles","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnchopin%2Fparticles","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnchopin%2Fparticles/lists"}