{"id":21409276,"url":"https://github.com/acerbilab/ibs","last_synced_at":"2026-03-01T22:01:58.383Z","repository":{"id":50525336,"uuid":"225665691","full_name":"acerbilab/ibs","owner":"acerbilab","description":"Inverse binomial sampling for efficient log-likelihood estimation of simulator models in MATLAB","archived":false,"fork":false,"pushed_at":"2023-06-25T16:47:25.000Z","size":80,"stargazers_count":33,"open_issues_count":0,"forks_count":3,"subscribers_count":4,"default_branch":"master","last_synced_at":"2026-01-12T19:23:42.568Z","etag":null,"topics":["likelihood-free","log-likelihood","matlab","sampling","simulation-model"],"latest_commit_sha":null,"homepage":"","language":"MATLAB","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/acerbilab.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2019-12-03T16:25:50.000Z","updated_at":"2025-12-25T03:17:46.000Z","dependencies_parsed_at":"2023-01-20T11:31:17.102Z","dependency_job_id":"af934ee3-f48a-468a-b73a-2ccc0b33a221","html_url":"https://github.com/acerbilab/ibs","commit_stats":{"total_commits":63,"total_committers":2,"mean_commits":31.5,"dds":0.4920634920634921,"last_synced_commit":"2229c00c4a19eb9f236f9f257100dab9e87b6f92"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/acerbilab/ibs","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fibs","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fibs/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fibs/releases","manifests_url":"https://repos.ecosyste.ms/
api/v1/hosts/GitHub/repositories/acerbilab%2Fibs/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/acerbilab","download_url":"https://codeload.github.com/acerbilab/ibs/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/acerbilab%2Fibs/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":29986241,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-01T21:06:37.093Z","status":"ssl_error","status_checked_at":"2026-03-01T21:05:45.052Z","response_time":124,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["likelihood-free","log-likelihood","matlab","sampling","simulation-model"],"created_at":"2024-11-22T17:24:26.165Z","updated_at":"2026-03-01T22:01:58.358Z","avatar_url":"https://github.com/acerbilab.png","language":"MATLAB","readme":"# Inverse binomial sampling (IBS)\n\nThis repository contains MATLAB implementations and examples of IBS. For Python implementations, see [PyIBS](https://github.com/acerbilab/pyibs).\n\n## What is it?\n\nInverse binomial sampling (IBS) is a technique to obtain unbiased, efficient estimates of the log-likelihood of a model by simulation. 
[[1](#references)]\n\nThe typical scenario is the case in which you have a *simulator*, that is, a model from which you can randomly draw synthetic observations (for a given parameter vector), but whose log-likelihood you cannot evaluate analytically or numerically. In other words, IBS affords likelihood-based inference for models without explicit likelihood functions (also known as implicit models).\n\n## Quickstart\n\nThe main function is `ibslike.m`, which computes the IBS estimate of the negative log-likelihood for a given simulator model and dataset.\nWe recommend starting with the tutorial in `ibs_example.m`, which contains a full walkthrough with working examples of IBS usage.\n\nIBS is commonly used as part of an algorithm for maximum-likelihood estimation or Bayesian inference:\n- For maximum-likelihood (or maximum-a-posteriori) estimation, we recommend using IBS combined with [Bayesian Adaptive Direct Search (BADS)](https://github.com/acerbilab/bads). BADS is required to run the first part of the tutorial.\n- For Bayesian inference of posterior distributions and model evidence, we recommend using IBS with [Variational Bayesian Monte Carlo (VBMC)](https://github.com/acerbilab/vbmc). VBMC is required to run the second part of the tutorial.\n\nFor practical recommendations and any other questions, check out the FAQ on the [IBS wiki](https://github.com/acerbilab/ibs/wiki).\n\n## Code\n\nWe describe below the files in this repository:\n\n- `ibs_basic.m` is a bare-bones implementation of IBS for didactic purposes.\n- `ibslike.m` is a vectorized implementation of IBS that supports several advanced features (still a work in progress as we add more features). 
Please read the documentation in the file for usage information, and refer to the [dedicated section of the FAQ](https://github.com/acerbilab/ibs/wiki#matlab-implementation-ibslike) for specific questions about this MATLAB implementation.\n  - Note that `ibslike` returns the *negative* log-likelihood, as it is meant to be used with an optimization method such as BADS.\n  - Run `ibslike('test')` for a battery of unit tests.\n  - If you want to run `ibslike` with [Variational Bayesian Monte Carlo](https://github.com/acerbilab/vbmc) (VBMC), note that you need to set the IBS options as follows:\n    - `OPTIONS.ReturnPositive = true` to return the *positive* log-likelihood;\n    - `OPTIONS.ReturnStd = true` to return, as second output, the standard deviation (SD) of the estimate (as opposed to the variance).\n- `ibs_example.m` is a tutorial with usage examples for IBS.\n- `psycho_gen.m` and `psycho_nll.m` are functions implementing, respectively, the generative model (simulator) and the negative log-likelihood function for the orientation discrimination model used as an example in the tutorial (see also Section 5.2 in the IBS paper).\n\nThe code used to produce the results in the paper [[1](#references)] is available in the development repository [here](https://github.com/basvanopheusden/ibs-development).\n\n## References\n\n1. van Opheusden\\*, B., Acerbi\\*, L. \u0026 Ma, W.J. (2020). Unbiased and efficient log-likelihood estimation with inverse binomial sampling. *PLoS Computational Biology* 16(12): e1008483. (\\* equal contribution) ([link](https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1008483))\n\nYou can cite IBS in your work with something along the lines of:\n\n\u003e We estimated the log-likelihood using inverse binomial sampling (IBS; van Opheusden, Acerbi \u0026 Ma, 2020), a technique that produces unbiased and efficient estimates of the log-likelihood via simulation. 
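\n\nFor intuition, the core of IBS is simple: for each trial, keep drawing from the simulator until a draw matches the observed response; if this takes K draws, then -(1 + 1/2 + ... + 1/(K-1)) is an unbiased estimate of that trial's log-likelihood. A minimal didactic sketch in MATLAB (a simplified stand-in for `ibs_basic.m`, assuming `fun(params,dmat)` returns one scalar simulated response per design row):\n\n```matlab\nfunction nlogl = ibs_sketch(fun,params,respmat,designmat)\n%IBS_SKETCH Didactic IBS estimate of the negative log-likelihood.\nnlogl = 0;\nfor i = 1:size(designmat,1)\n    K = 1;  % number of draws until the first match\n    while fun(params,designmat(i,:)) ~= respmat(i)\n        K = K + 1;\n    end\n    % Unbiased estimate of -log p(response): sum_{k=1}^{K-1} 1/k\n    % (empty sum, i.e. zero, if the first draw matches)\n    nlogl = nlogl + sum(1./(1:K-1));\nend\nend\n```\n\nIn practice, use the full implementation, called along the lines of `nlogl = ibslike(fun,params,respmat,designmat)` (see `ibs_example.m` for complete working calls).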
\n\nIf you use IBS in conjunction with [Bayesian Adaptive Direct Search](https://github.com/acerbilab/bads), as recommended in the paper, you could add\n\n\u003e We obtained maximum-likelihood estimates of the model parameters via Bayesian Adaptive Direct Search (BADS; Acerbi \u0026 Ma, 2017), a hybrid Bayesian optimization algorithm which affords stochastic objective evaluations.\n\nand cite the appropriate paper:\n\n2. Acerbi, L. \u0026 Ma, W. J. (2017). Practical Bayesian optimization for model fitting with Bayesian Adaptive Direct Search. In *Advances in Neural Information Processing Systems* 30:1834-1844.\n\nSimilarly, if you use IBS in combination with [Variational Bayesian Monte Carlo](https://github.com/acerbilab/vbmc), you should cite these papers:\n\n3. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In *Advances in Neural Information Processing Systems* 31: 8222-8232.\n4. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In *Advances in Neural Information Processing Systems* 33: 8211-8222.\n\nBesides formal citations, you can demonstrate your appreciation for our work in the following ways:\n\n- *Star* the IBS repository on GitHub;\n- Follow us on Twitter ([Luigi](https://twitter.com/AcerbiLuigi), [Bas](https://twitter.com/basvanopheusden)) for updates about IBS and other projects we are involved with;\n- Tell us about your model-fitting problem and your experience with IBS (positive or negative) in the [Discussions forum](https://github.com/orgs/acerbilab/discussions).\n\n### BibTex\n\n```\n@article{vanOpheusden2020unbiased,\n  title = {Unbiased and Efficient Log-Likelihood Estimation with Inverse Binomial Sampling},\n  author = {van Opheusden, Bas and Acerbi, Luigi and Ma, Wei Ji},\n  year = {2020},\n  journal = {PLOS Computational Biology},\n  volume = {16},\n  number = {12},\n  pages = {e1008483},\n  publisher = {{Public Library of Science}},\n  issn = {1553-7358},\n  doi = {10.1371/journal.pcbi.1008483},\n}\n```\n\n### 
License\n\nThe IBS code is released under the terms of the [MIT License](https://github.com/acerbilab/ibs/blob/master/LICENSE.txt).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Facerbilab%2Fibs","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Facerbilab%2Fibs","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Facerbilab%2Fibs/lists"}