
# GaussianVariationalInference.jl

*Deterministic variational inference in Julia.*

[![Project Status: Active – The project has reached a stable, usable state and is being actively developed.](https://www.repostatus.org/badges/latest/active.svg)](https://www.repostatus.org/#active)
[![Documentation](https://img.shields.io/badge/docs-master-blue.svg)](https://ngiann.github.io/GaussianVariationalInference.jl)
![GitHub](https://img.shields.io/github/license/ngiann/GaussianVariationalInference.jl)

# What is this?

A Julia package for approximating a posterior distribution with a full-covariance Gaussian by optimising a variational lower bound[^1]. A mean-field approximation is planned for the near future. We recommend the package for problems with a relatively small number of parameters, perhaps 2 to 20. Its main focus is to provide a method for approximating a target posterior with a Gaussian that requires no tuning of learning rates (step sizes) and converges reliably.
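
In symbols: writing the joint density as $p(x,\theta)$ and the approximation as $q(\theta)=\mathcal{N}(\theta;\mu,\Sigma)$, the quantity being maximised is the standard evidence lower bound sketched below (generic ELBO notation for orientation, not taken from the package documentation):

```math
\mathcal{L}(\mu,\Sigma) \;=\; \mathbb{E}_{q(\theta)}\big[\log p(x,\theta)\big] \;+\; \mathcal{H}[q] \;\le\; \log p(x),
```

where $\mathcal{H}[q]$ is the entropy of the Gaussian, available in closed form.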

## Basic usage

To install the package, switch the Julia REPL into package mode by pressing `]` and run `add GaussianVariationalInference`.
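
Equivalently, the installation can be scripted via Julia's standard Pkg API (generic Julia tooling, shown here for convenience):

```julia
using Pkg
Pkg.add("GaussianVariationalInference")  # same effect as `add GaussianVariationalInference` in package mode
```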

The package is fairly easy to use. Currently, the only function of interest to the user is `VI`. At the very minimum, the user needs to supply a function that implements the joint log-likelihood, i.e. the unnormalised log-posterior.
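
Before the full example below, here is a minimal sketch of the call pattern; the log-density is a made-up 2D standard Gaussian, not something shipped with the package:

```julia
using GaussianVariationalInference

# Hypothetical target: unnormalised log-density of a 2D standard Gaussian.
logdensity(θ) = -0.5 * sum(abs2, θ)

# Fit a full-covariance Gaussian approximation; the second argument is the
# initial mean, and the keyword options mirror those in the example below.
q, logev = VI(logdensity, zeros(2), S = 100, iterations = 1_000)
```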

Consider approximating the following target density:

```julia
using GaussianVariationalInference

logp = exampleproblem1() # target log-posterior density to approximate
x₀ = randn(2)            # random initial mean for the approximating Gaussian
q, logev = VI(logp, x₀, S = 100, iterations = 10_000, show_every = 50)

# Plot the target posterior (not the log-posterior!)
using Plots # must be installed independently
x = -3:0.02:3
contour(x, x, map(t -> exp(logp(collect(t))), Iterators.product(x, x))', fill = true, c = :blues)

# Overlay the Gaussian approximation in red
contour!(x, x, map(t -> pdf(q, collect(t)), Iterators.product(x, x))', color = "red", alpha = 0.2)
```

A plot similar to the one below should appear. The blue filled contours correspond to the exponentiated `logp`, and the red contours show the Gaussian approximation `q`.

![image](docs/src/exampleproblem1.png)
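
The returned `q` supports `pdf`, as used above. Assuming it also implements the usual Distributions.jl sampling interface (an assumption worth verifying in the documentation), posterior draws can be obtained directly:

```julia
using Statistics

# Assumption: q behaves like a Distributions.jl multivariate distribution.
samples = rand(q, 1_000)           # one draw per column
μ̂ = vec(mean(samples, dims = 2))  # Monte Carlo estimate of the posterior mean
```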

For further information, please consult the documentation.

If you use this software for academic purposes, please consider citing the relevant paper[^1].

## Related packages

- [AdvancedVI.jl](https://github.com/TuringLang/AdvancedVI.jl): A library for variational Bayesian inference in Julia.
- [DynamicHMC.jl](https://github.com/tpapp/DynamicHMC.jl): Implementation of robust dynamic Hamiltonian Monte Carlo methods in Julia.

[^1]: [Approximate Variational Inference Based on a Finite Sample of Gaussian Latent Variables](https://doi.org/10.1007/s10044-015-0496-9), [[arXiv]](https://arxiv.org/pdf/1906.04507.pdf).