Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/andymiller/vboost
code supplement for variational boosting (https://arxiv.org/abs/1611.06585)
- Host: GitHub
- URL: https://github.com/andymiller/vboost
- Owner: andymiller
- License: MIT
- Created: 2017-06-06T17:45:54.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2017-07-24T17:03:53.000Z (over 7 years ago)
- Last Synced: 2024-08-02T03:02:33.667Z (4 months ago)
- Language: Python
- Size: 11.3 MB
- Stars: 11
- Watchers: 6
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-gradient-boosting-papers
README
# vboost
code for [Variational Boosting: Iteratively Refining Posterior Approximations](https://arxiv.org/abs/1611.06585)
### Abstract
> We propose a black-box variational inference method to approximate
> intractable distributions with an increasingly rich approximating class.
> Our method, termed variational boosting, iteratively refines an existing
> variational approximation by solving a sequence of optimization problems,
> allowing the practitioner to trade computation time for accuracy.
> We show how to expand the variational approximating class by incorporating
> additional covariance structure and by introducing new components to form a
> mixture. We apply variational boosting to synthetic and real statistical
> models, and show that resulting posterior inferences compare favorably to
> existing posterior approximation algorithms in both accuracy and efficiency.

Authors:
[Andrew Miller](http://andymiller.github.io/),
[Nick Foti](http://nfoti.github.io/), and
[Ryan Adams](http://people.seas.harvard.edu/~rpa/).

### Requires
* [`autograd`](https://github.com/HIPS/autograd) + its requirements (`numpy`, etc).
Our code is compatible with [this `autograd` commit](https://github.com/HIPS/autograd/tree/42a57226442417785efe3bd5ba543b958680b765) or later.
You can install the master version with
`pip install git+git://github.com/HIPS/autograd.git@master`.
* [`pyprind`](https://github.com/rasbt/pyprind)
* [`sampyl`](https://github.com/mcleonard/sampyl) for MCMC experiments
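
### Example (illustrative)

The snippet below is not this repository's API; it is a minimal, self-contained sketch of the recipe from the abstract, written against `autograd`: hold an existing mixture approximation fixed, add one diagonal-Gaussian component, and fit the new component's weight, mean, and scale by stochastic gradient on a Monte Carlo ELBO. The toy target, the per-component reparameterized ELBO estimator, and the plain-SGD optimizer are all assumptions made for illustration, not the paper's exact procedure.

```python
# Illustrative sketch of the variational boosting idea (assumptions
# throughout; this is not the code in this repository).
import autograd.numpy as np
import numpy.random as npr
from autograd import grad

def log_target(x):
    # Toy banana-shaped 2-D target; x has shape (n, 2).
    return -0.5 * (x[:, 0] ** 2 / 4.0 + (x[:, 1] - 0.5 * x[:, 0] ** 2) ** 2)

def diag_gauss_logpdf(x, mean, log_std):
    # log N(x; mean, diag(exp(2 * log_std))), summed over dimensions.
    return np.sum(-0.5 * ((x - mean) / np.exp(log_std)) ** 2
                  - log_std - 0.5 * np.log(2 * np.pi), axis=-1)

def mixture_logpdf(x, weights, means, log_stds):
    # log sum_c w_c N(x; m_c, s_c), via a manual logsumexp for stability.
    comps = [np.log(w) + diag_gauss_logpdf(x, m, s)
             for w, m, s in zip(weights, means, log_stds)]
    stacked = np.concatenate([c[None, :] for c in comps], axis=0)
    a = np.max(stacked, axis=0)
    return a + np.log(np.sum(np.exp(stacked - a), axis=0))

def add_component(rs, weights, means, log_stds,
                  n_samples=200, n_iters=500, step=0.02):
    # One boosting step: freeze the existing components and fit a new
    # component's weight (rho), mean, and log-std by stochastic gradient
    # descent on a negative Monte Carlo ELBO.
    D = means[0].shape[0]
    params = np.concatenate([np.zeros(1),        # logit of new weight rho
                             2.0 * rs.randn(D),  # new component mean
                             np.zeros(D)])       # new component log-std

    def neg_elbo(params):
        rho = 1.0 / (1.0 + np.exp(-params[0]))   # sigmoid keeps rho in (0, 1)
        all_w = [w * (1.0 - rho) for w in weights] + [rho]
        all_m = means + [params[1:1 + D]]
        all_s = log_stds + [params[1 + D:]]
        # Per-component reparameterized estimate of
        #   ELBO = sum_c w_c E_{x ~ q_c}[log p(x) - log q(x)].
        total = 0.0
        for w, m, s in zip(all_w, all_m, all_s):
            x = m + np.exp(s) * rs.randn(n_samples, D)
            total = total + w * np.mean(
                log_target(x) - mixture_logpdf(x, all_w, all_m, all_s))
        return -total

    g = grad(neg_elbo)
    for _ in range(n_iters):
        params = params - step * g(params)       # plain SGD, for brevity
    rho = 1.0 / (1.0 + np.exp(-params[0]))
    return ([w * (1.0 - rho) for w in weights] + [rho],
            means + [params[1:1 + D]],
            log_stds + [params[1 + D:]])

# Start from a single standard-normal component and boost three times.
rs = npr.RandomState(0)
weights, means, log_stds = [1.0], [np.zeros(2)], [np.zeros(2)]
for _ in range(3):
    weights, means, log_stds = add_component(rs, weights, means, log_stds)
print("mixture weights:", weights)
```

Each call to `add_component` solves one of the sequence of optimization problems the abstract describes: previous components stay fixed, so only the new component's parameters and the mixing weight `rho` are optimized, which is what lets the practitioner trade additional computation for a richer approximation.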