Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Approximate variational inference in Julia
https://github.com/ngiann/GaussianVariationalInference.jl
- Host: GitHub
- URL: https://github.com/ngiann/GaussianVariationalInference.jl
- Owner: ngiann
- License: mit
- Created: 2020-10-21T08:14:31.000Z (about 4 years ago)
- Default Branch: master
- Last Pushed: 2023-07-25T00:10:32.000Z (over 1 year ago)
- Last Synced: 2024-05-22T06:37:52.472Z (6 months ago)
- Topics: bayesian-inference, bayesian-methods, julia-language, machine-learning, posterior-distributions, variational-inference
- Language: Julia
- Homepage:
- Size: 873 KB
- Stars: 4
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-sciml - ngiann/ApproximateVI.jl: Approximate variational inference in Julia
README
GaussianVariationalInference.jl
*Deterministic variational inference in Julia.*
[![Project Status: Active – The project has reached a stable, usable state and is being actively developed.](https://www.repostatus.org/badges/latest/active.svg)](https://www.repostatus.org/#active)
[![Documentation](https://img.shields.io/badge/docs-master-blue.svg)](https://ngiann.github.io/GaussianVariationalInference.jl)
![GitHub](https://img.shields.io/github/license/ngiann/GaussianVariationalInference.jl)

# What is this?
A Julia package for approximating a posterior distribution with a full-covariance Gaussian by optimising a variational lower bound[^1]. A mean-field approximation is planned for the near future. We recommend this package for problems with a relatively small number of parameters (roughly 2 to 20). The main focus of the package is to provide a method for approximating a target posterior with a Gaussian that requires no tuning of learning rates (step sizes) and converges reliably.
## Basic usage
To install this package, switch the Julia REPL into package mode (press `]`) and run `add GaussianVariationalInference`.
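Equivalently, assuming the package is registered in the General registry, it can be installed programmatically via the `Pkg` standard library:

```julia
using Pkg

# Equivalent to `add GaussianVariationalInference` in package mode.
Pkg.add("GaussianVariationalInference")
```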
The package is fairly easy to use. Currently, the only function of interest to the user is `VI`. At a minimum, the user needs to provide a function that evaluates the joint log-likelihood.
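As an illustrative sketch (not taken from the package documentation), a minimal call might look as follows, where `logp` is a user-defined log-density, here an unnormalised 2D standard Gaussian, and the keyword arguments mirror those used in the example below:

```julia
using GaussianVariationalInference

# User-supplied joint log-density: an unnormalised 2D standard Gaussian.
logp(x) = -sum(abs2, x) / 2

# Approximate the target with a full-covariance Gaussian q,
# starting from a random initial mean.
q, logev = VI(logp, randn(2), S = 100, iterations = 1_000)
```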
Consider approximating the following target density:
```julia
using GaussianVariationalInference

logp = exampleproblem1() # target log-posterior density to approximate
x₀ = randn(2)            # random initial mean for approximating Gaussian

q, logev = VI(logp, x₀, S = 100, iterations = 10_000, show_every = 50)

# Plot target posterior, not log-posterior!
using Plots # must be independently installed.

x = -3:0.02:3
contour(x, x, map(x -> exp(logp(collect(x))), Iterators.product(x, x))', fill=true, c=:blues)

# Plot Gaussian approximation on top using red colour
contour!(x, x, map(x -> pdf(q, collect(x)), Iterators.product(x, x))', color="red", alpha=0.2)
```

A plot similar to the one below should appear. The blue filled contours correspond to the exponentiated `logp`, and the red contours correspond to the produced Gaussian approximation `q`.
![image](docs/src/exampleproblem1.png)
For further information, please consult the documentation.
Should you use this software for academic purposes, please consider citing the relevant paper[^1].
## Related packages
- [AdvancedVI.jl](https://github.com/TuringLang/AdvancedVI.jl): A library for variational Bayesian inference in Julia.
- [DynamicHMC.jl](https://github.com/tpapp/DynamicHMC.jl): Implementation of robust dynamic Hamiltonian Monte Carlo methods in Julia.

[^1]: [Approximate Variational Inference Based on a Finite Sample of Gaussian Latent Variables](https://doi.org/10.1007/s10044-015-0496-9), [[Arxiv]](https://arxiv.org/pdf/1906.04507.pdf).