{"id":31856737,"url":"https://github.com/acerbilab/svbmc","last_synced_at":"2026-01-20T16:32:15.735Z","repository":{"id":318334488,"uuid":"1015303462","full_name":"acerbilab/svbmc","owner":"acerbilab","description":"Stacking Variational Bayesian Monte Carlo (S-VBMC) algorithm for combining Variational Bayesian Monte Carlo (VBMC) posteriors to boost inference performance.","archived":false,"fork":false,"pushed_at":"2025-10-07T10:50:29.000Z","size":10115,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2025-10-07T11:40:26.717Z","etag":null,"topics":["bayesian-inference","data-analysis","machine-learning","model-fitting","python","stacking","variational-inference"],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/2504.05004","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/acerbilab.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2025-07-07T09:53:35.000Z","updated_at":"2025-10-07T10:50:33.000Z","dependencies_parsed_at":"2025-10-07T11:50:56.011Z","dependency_job_id":null,"html_url":"https://github.com/acerbilab/svbmc","commit_stats":null,"previous_names":["acerbilab/s-vbmc","acerbilab/svbmc"],"tags_count":null,"template":false,"template_full_name":null,"purl":"pkg:github/acerbilab/svbmc","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fsvbmc","tags_url":"https://repos.ecosyste.ms/api/
v1/hosts/GitHub/repositories/acerbilab%2Fsvbmc/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fsvbmc/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fsvbmc/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/acerbilab","download_url":"https://codeload.github.com/acerbilab/svbmc/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fsvbmc/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279011614,"owners_count":26084967,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-12T02:00:06.719Z","response_time":53,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bayesian-inference","data-analysis","machine-learning","model-fitting","python","stacking","variational-inference"],"created_at":"2025-10-12T14:54:41.769Z","updated_at":"2026-01-20T16:32:15.729Z","avatar_url":"https://github.com/acerbilab.png","language":"Python","readme":"# Stacking Variational Bayesian Monte Carlo (S-VBMC)\n\n\u003e **tl;dr**: S-VBMC improves posterior inference via [VBMC](https://github.com/acerbilab/pyvbmc) by combining multiple independent runs into a **single optimized posterior**. 
It requires no additional model evaluations and no communication between runs: just run VBMC in parallel with different initializations, then stack the results at the end. Start [here](https://github.com/acerbilab/S-VBMC/blob/main/examples/svbmc_example_1_basic_usage.ipynb) for a quick usage demo.\n\n## Overview\nStacking Variational Bayesian Monte Carlo (S-VBMC)[[1](#references-and-citation)] is a fast post-processing step for [Variational Bayesian Monte Carlo (VBMC)](https://github.com/acerbilab/pyvbmc). VBMC is an approximate Bayesian inference technique that produces a variational posterior in the form of a Gaussian mixture (see the relevant papers [[2-4](#references-and-citation)] for more details). S-VBMC improves upon this by combining (\"stacking\") the Gaussian mixture components from several independent VBMC runs into a single, larger mixture, which we call \"stacked posterior\". It then re-optimizes the weights of this combined mixture to maximize the combined Evidence Lower BOund (ELBO, a lower bound on log [model evidence](https://en.wikipedia.org/wiki/Marginal_likelihood)). \n\nA key advantage of S-VBMC is its efficiency: **the original model is never re-evaluated**, making it an inexpensive way to boost inference performance. 
Furthermore, **no communication is needed among VBMC runs**, making it possible to run them in parallel before applying S-VBMC as a post-processing step with negligible computational overhead.\n\nRefer to the S-VBMC paper for further details [[1](#references-and-citation)].\n\n## When to use S-VBMC\n\nS-VBMC works as a post-processing step for VBMC, so it shares its use cases (described [here](https://github.com/acerbilab/pyvbmc/tree/main?tab=readme-ov-file#when-should-i-use-pyvbmc)).\n\nPerforming several VBMC inference runs with different initialization points [is already recommended](https://github.com/acerbilab/pyvbmc/blob/main/examples/pyvbmc_example_4_validation.ipynb) for robustness and convergence diagnostics; therefore, S-VBMC naturally fits into VBMC's best practices. Because S-VBMC is inexpensive and effective, we recommend using it whenever you first perform inference with VBMC. It is especially useful when separate VBMC runs yield noticeably different variational posteriors, which might happen when the target distribution has a particularly complex shape (see [this notebook](https://github.com/acerbilab/S-VBMC/blob/main/examples/svbmc_example_1_basic_usage.ipynb) for two examples of this).\n\n-----\n\n## How to use S-VBMC\n\n### 1. Installation\n\nCreate a new environment in `conda` and activate it:\n   ```bash\n   conda create -n svbmc-env python=3.11\n   conda activate svbmc-env\n   ```\nInstall `svbmc` with `pip`:\n   ```bash\n   pip install svbmc \n   ```\n\n### 2. Running S-VBMC\n\nYou should have already run VBMC multiple times on the same problem and saved the resulting `VariationalPosterior` objects as `.pkl` files. Refer to [these notebooks](https://github.com/acerbilab/pyvbmc/tree/main/examples) for VBMC usage examples.\n\nFirst, load these objects into a single list. 
For example, if you have your files in a folder named `vbmc_runs/`:\n\n```python\nimport pickle\nimport glob\n\nvp_files = glob.glob(\"vbmc_runs/*.pkl\")\nvp_list = []\nfor file in vp_files:\n    with open(file, \"rb\") as f:\n        vp_list.append(pickle.load(f))\n```\n\nNext, initialize the `SVBMC` object with this list and run the optimization.\n\n```python\nfrom svbmc.svbmc import SVBMC\n\n# Initialize the SVBMC object and optimize the weights\nvp_stacked = SVBMC(vp_list=vp_list)\nvp_stacked.optimize()\n\n# The SVBMC object now contains the optimized weights and ELBO estimates\nprint(f\"Stacked ELBO: {vp_stacked.elbo['estimated']}\")\n```\n\n**Note**: For compatibility with VBMC, this implementation of S-VBMC stores results in `NumPy` arrays. However, it uses `PyTorch` under the hood to run the ELBO optimization.\n\n### 3. Tutorials\n\nWe include two detailed walkthroughs:\n\n1. [**Basic Usage**](https://github.com/acerbilab/S-VBMC/blob/main/examples/svbmc_example_1_basic_usage.ipynb): This notebook shows how to run S-VBMC, with an optional guide on how to run VBMC multiple times.\n2. 
[**Noisy Log-Density Evaluations**](https://github.com/acerbilab/S-VBMC/blob/main/examples/svbmc_example_2_noisy_likelihoods.ipynb): This notebook addresses scenarios where the target log-density evaluations are noisy, and includes a discussion of ELBO debiasing.\n   \n-----\n\n## How to use the final posterior\n\nIf you want to compute estimates from (or visualize) the final stacked posterior, you can draw samples from it using `.sample()`:\n\n```python\n# Draw 10,000 samples from the final, stacked posterior\nsamples = vp_stacked.sample(n_samples=10000)\n```\n\nYou can also extract the ELBO estimates for model comparison (see [here](https://github.com/acerbilab/svbmc/blob/main/examples/svbmc_example_1_basic_usage.ipynb); and [here](https://github.com/acerbilab/svbmc/blob/main/examples/svbmc_example_2_noisy_likelihoods.ipynb) if your log-density evaluations are noisy).\n\n### For advanced users\n\nDo **not** \"open the box\" to get the S-VBMC stacked posterior's individual components' means and covariance matrices. This is because each VBMC run may use different internal parameter transformations. Consequently, the component means and covariance matrices from different VBMC posteriors exist in **incompatible parameter spaces**. Combining them creates a mixture whose individual Gaussian components are not directly meaningful. **Always use samples from the final stacked posterior**, which are correctly transformed back into the original parameter space. These are available via the `.sample()` method.\n\n## References and citation\n\n1. Silvestrin, F., Li, C., \u0026 Acerbi, L. (2025). Stacking Variational Bayesian Monte Carlo. In *Transactions on Machine Learning Research*. ([paper on arXiv](https://arxiv.org/abs/2504.05004), [TMLR](https://openreview.net/forum?id=M2ilYAJdPe)). \n2. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In *Advances in Neural Information Processing Systems 31*: 8222-8232. 
([paper + supplement on arXiv](https://arxiv.org/abs/1810.05558), [NeurIPS Proceedings](https://papers.nips.cc/paper/8043-variational-bayesian-monte-carlo))\n3. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In *Advances in Neural Information Processing Systems 33*: 8211-8222 ([paper + supplement on arXiv](https://arxiv.org/abs/2006.08655), [NeurIPS Proceedings](https://papers.nips.cc/paper/2020/hash/5d40954183d62a82257835477ccad3d2-Abstract.html)).\n4. Huggins, B., Li, C., Tobaben, M., Aarnos, M., \u0026 Acerbi, L. (2023). [PyVBMC: Efficient Bayesian inference in Python](https://joss.theoj.org/papers/10.21105/joss.05428). *Journal of Open Source Software* 8(86), 5428, https://doi.org/10.21105/joss.05428.\n\nPlease cite all four references if you use S-VBMC in your work.\n\n## Additional references\n\n5. Acerbi, L. (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. In *Proc. Machine Learning Research* 96: 1-10. 1st Symposium on Advances in Approximate Bayesian Inference, Montréal, Canada. 
([paper in PMLR](http://proceedings.mlr.press/v96/acerbi19a.html))\n\n## BibTeX\n\n```BibTeX\n@article{\n  silvestrin2025stacking,\n  title={Stacking Variational Bayesian Monte Carlo},\n  author={Francesco Silvestrin and Chengkun LI and Luigi Acerbi},\n  journal={Transactions on Machine Learning Research},\n  issn={2835-8856},\n  year={2025},\n  url={https://openreview.net/forum?id=M2ilYAJdPe},\n  note={}\n}\n\n@article{acerbi2018variational,\n  title={{V}ariational {B}ayesian {M}onte {C}arlo},\n  author={Acerbi, Luigi},\n  journal={Advances in Neural Information Processing Systems},\n  volume={31},\n  pages={8222--8232},\n  year={2018}\n}\n\n@article{acerbi2020variational,\n  title={{V}ariational {B}ayesian {M}onte {C}arlo with noisy likelihoods},\n  author={Acerbi, Luigi},\n  journal={Advances in Neural Information Processing Systems},\n  volume={33},\n  pages={8211--8222},\n  year={2020}\n}\n\n@article{huggins2023pyvbmc,\n    title = {PyVBMC: Efficient Bayesian inference in Python},\n    author = {Bobby Huggins and Chengkun Li and Marlon Tobaben and Mikko J. Aarnos and Luigi Acerbi},\n    publisher = {The Open Journal},\n    journal = {Journal of Open Source Software},\n    url = {https://doi.org/10.21105/joss.05428},\n    doi = {10.21105/joss.05428},\n    year = {2023},\n    volume = {8},\n    number = {86},\n    pages = {5428}\n  }\n\n@article{acerbi2019exploration,\n  title={An Exploration of Acquisition and Mean Functions in {V}ariational {B}ayesian {M}onte {C}arlo},\n  author={Acerbi, Luigi},\n  journal={PMLR},\n  volume={96},\n  pages={1--10},\n  year={2019}\n}\n```\n\n## License\n\nS-VBMC is released under the terms of the [BSD 3-Clause License](LICENSE.txt).\n\n## Acknowledgments\n\nS-VBMC was developed by [members](https://www.helsinki.fi/en/researchgroups/machine-and-human-intelligence/people) of the [Machine and Human Intelligence Lab](https://www.helsinki.fi/en/researchgroups/machine-and-human-intelligence/) at the University of Helsinki. 
This work was supported by [Research Council of Finland](https://www.aka.fi/en/) (grants 358980 and 356498).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Facerbilab%2Fsvbmc","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Facerbilab%2Fsvbmc","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Facerbilab%2Fsvbmc/lists"}