Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/svsamsonov/ex2mcmc_new
Last synced: 11 days ago
- Host: GitHub
- URL: https://github.com/svsamsonov/ex2mcmc_new
- Owner: svsamsonov
- License: mit
- Created: 2022-04-26T13:09:30.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2023-05-04T08:15:04.000Z (over 1 year ago)
- Last Synced: 2024-09-15T19:01:39.509Z (about 2 months ago)
- Language: Jupyter Notebook
- Size: 1.25 GB
- Stars: 6
- Watchers: 3
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
README
# Ex2MCMC: Local-Global MCMC kernels: the best of both worlds (NeurIPS 2022) [[Paper]](https://proceedings.neurips.cc/paper_files/paper/2022/hash/21c86d5b10cdc28664ccdadf0a29065a-Abstract-Conference.html)
## Implementation of Ex2MCMC, FlEx2MCMC and experiments

[![build](https://github.com/svsamsonov/ex2mcmc_new/actions/workflows/main.yml/badge.svg)](https://github.com/svsamsonov/ex2mcmc_new/actions/workflows/main.yml)
[![pypi](http://img.shields.io/pypi/v/ex2mcmc)](https://pypi.org/project/ex2mcmc/)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://github.com/svsamsonov/ex2mcmc_new/blob/master/LICENSE)
[![CodeFactor](https://www.codefactor.io/repository/github/svsamsonov/ex2mcmc_new/badge)](https://www.codefactor.io/repository/github/svsamsonov/ex2mcmc_new)
[[ArXiv]](https://arxiv.org/abs/2111.02702)

Authors: Sergey Samsonov, Evgeny Lagutin, Marylou Gabrié, Alain Durmus, Alexey Naumov, Eric Moulines.
> **Abstract:** *In the present paper we study an Explore-Exploit Markov chain Monte Carlo strategy (Ex2MCMC) that combines local and global samplers showing that it enjoys the advantages of both approaches. We prove V-uniform geometric ergodicity of Ex2MCMC without requiring a uniform adaptation of the global sampler to the target distribution. We also compute explicit bounds on the mixing rate of the Explore-Exploit strategy under realistic conditions. Moreover, we also analyze an adaptive version of the strategy (FlEx2MCMC) where a normalizing flow is trained while sampling to serve as a proposal for global moves. We illustrate the efficiency of Ex2MCMC and its adaptive version on classical sampling benchmarks as well as in sampling high-dimensional distributions defined by Generative Adversarial Networks seen as Energy Based Models.*
- [Ex2MCMC: Local-Global MCMC kernels: the best of both worlds (NeurIPS 2022) \[Paper\]](#ex2mcmc-local-global-mcmc-kernels-the-best-of-both-worlds-neurips-2022-paper)
- [Implementation of Ex2MCMC, FlEx2MCMC and experiments](#implementation-of-ex2mcmc-flex2mcmc-and-experiments)
- [Sampling from GAN as Energy-Based Model with MCMC](#sampling-from-gan-as-energy-based-model-with-mcmc)
- [Algorithms](#algorithms)
- [Quick start](#quick-start)
- [Requirements](#requirements)
- [Installation](#installation)
- [Checkpoints](#checkpoints)
- [Usage](#usage)
- [Experiments with synthetic distributions:](#experiments-with-synthetic-distributions)
- [Experiments with GANs on MNIST dataset](#experiments-with-gans-on-mnist-dataset)
- [Experiments with GANs on CIFAR10 dataset](#experiments-with-gans-on-cifar10-dataset)
- [Sampling and FID computation](#sampling-and-fid-computation)
- [Results](#results)
- [FID and Inception Score (CIFAR10)](#fid-and-inception-score-cifar10)
- [Energy landscape approximation (MNIST)](#energy-landscape-approximation-mnist)
- [Sampling trajectories (CIFAR10)](#sampling-trajectories-cifar10)
- [Citation](#citation)

### Sampling from GAN as Energy-Based Model with MCMC
Metrics:
Samples from SNGAN with FlEx2MCMC:
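The GAN-as-EBM view behind these samples can be summarized in a few lines: the latent prior and the discriminator logit together define an energy over latent codes, and the MCMC samplers target the corresponding distribution. A minimal NumPy sketch of that energy (the names `G` and `D_logit` are illustrative placeholders, not this repository's API):

```python
import numpy as np

def gan_energy(z, G, D_logit):
    """Energy of latent codes z when a GAN is read as an EBM:
    E(z) = ||z||^2 / 2 - logit(D(G(z))).
    G and D_logit are stand-in callables (generator and discriminator logit)."""
    prior = 0.5 * np.sum(z ** 2, axis=-1)  # standard Gaussian prior on z
    return prior - D_logit(G(z))
```

Lower energy corresponds to latents whose decoded images the discriminator rates as more realistic, which is what the samplers below exploit.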
### Algorithms
**Ex2MCMC:**
**FlEx2MCMC:**
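In brief: Ex2MCMC alternates a global i-SIR move (draw candidates from a fixed independent proposal, resample by importance weight) with a local gradient-based move such as MALA; FlEx2MCMC additionally trains a normalizing flow online to serve as the global proposal. A minimal sketch of one Ex2MCMC transition on a toy Gaussian target (the function names and the fixed Gaussian global proposal are ours for illustration, not the package API):

```python
import numpy as np

def log_target(x):
    # Stand-in target: standard Gaussian (replace with your own log-density)
    return -0.5 * np.sum(x ** 2)

def grad_log_target(x):
    return -x  # gradient of the Gaussian log-density above

def log_proposal(x, scale=2.0):
    # Log-density (up to a constant) of the fixed global proposal N(0, scale^2 I)
    return -0.5 * np.sum(x ** 2) / scale ** 2

def ex2mcmc_step(x, rng, n_particles=10, step=0.05, scale=2.0):
    """One Explore-Exploit transition: i-SIR global move, then a MALA local move."""
    # Explore (i-SIR): draw candidates from the global proposal, keep the current state
    cands = rng.normal(scale=scale, size=(n_particles, x.size))
    pool = np.vstack([x[None, :], cands])
    logw = np.array([log_target(c) - log_proposal(c, scale) for c in pool])
    w = np.exp(logw - logw.max())
    x = pool[rng.choice(len(pool), p=w / w.sum())]

    # Exploit (MALA): one Metropolis-adjusted Langevin step from the selected state
    mean_fwd = x + step * grad_log_target(x)
    y = mean_fwd + np.sqrt(2 * step) * rng.normal(size=x.size)
    mean_bwd = y + step * grad_log_target(y)
    log_acc = (log_target(y) - log_target(x)
               - np.sum((x - mean_bwd) ** 2) / (4 * step)
               + np.sum((y - mean_fwd) ** 2) / (4 * step))
    if np.log(rng.uniform()) < log_acc:
        x = y
    return x
```

FlEx2MCMC would replace `log_proposal` and the candidate draws with a normalizing flow whose parameters are updated from the chain's own samples while it runs.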
### Quick start
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1Cy3Ion97F1kIWMNkF6wl-XODnfP9VQ5u?usp=sharing)

```bash
pip install ex2mcmc
pip install git+https://github.com/kwotsin/mimicry.git
```

### Requirements
* Python >= 3.8
* PyTorch >= 1.8.0
* torchvision
* pyro-ppl
* Jax
* POT

### Installation
Create environment:
```bash
conda create -n ex2mcmc python=3.8
conda activate ex2mcmc
```

Install poetry (if not already installed):
```bash
curl -sSL https://install.python-poetry.org | python3 -
poetry config virtualenvs.create false
```

Install the project:
```bash
poetry install --with dev
poetry add git+https://github.com/kwotsin/mimicry.git@a7fda06c4aff1e6af8dc4c4a35ed6636e434c766
```

### Checkpoints
CIFAR10 checkpoints:
| GAN | Steps | Path, G | Path, D |
|:----------|:-------------:|:------:|:------:|
| DCGAN NS | 100k | [netG_100000_steps.pth](https://drive.google.com/file/d/1gv8_qr_xa8hJzdJpBXiKr8v922EqcE-E/view?usp=share_link) | [netD_100000_steps.pth](https://drive.google.com/file/d/1u1sPUmlvyhcbNDX2DVsR-mGOzqQ6U8sh/view?usp=share_link) |
| SNGAN, Hinge | 100k | [netG.pth](https://drive.google.com/file/d/118zC_iEkN27jGLVNmDuQpMeyw7BKOUra/view?usp=share_link) | [netD.pth](https://drive.google.com/file/d/1xU5FV59TLhAlkFubJGmJVS87HnZZ2xHT/view?usp=share_link) |

MNIST checkpoints:
| GAN | Path |
|:----------|:-------------:|
| Vanilla | [vanilla_gan.pth](https://drive.google.com/file/d/1xa1v4hPQQdU2RkhjMn5sFZCITxTJ5Dhj/view?usp=share_link) |
| WGAN CP | [wgan.pth](https://drive.google.com/file/d/17nQJnfs2_T6kyahnkW3fu8AVY54kmRmw/view?usp=share_link) |

You can also run a script to download the checkpoints:
```bash
chmod +x get_ckpts.sh
./get_ckpts.sh
```

Download statistics for FID computation on the CIFAR10 dataset:
```bash
mkdir -p stats && gdown 1jjgB_iuvmoVAXPRvVTI_hBfuIz7mQgOg -O stats/fid_stats_cifar10.npz
```

### Usage
Demonstration on SNGAN: [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1EQQ_OdwCLn5MsOzlG-GS7yNcjTBU-KMp?usp=sharing)
#### Experiments with synthetic distributions:
FlEx2MCMC vs NUTS:
| Experiment | Path | Colab |
|:----------|:-------|:-----:|
| Toyish Gaussian | ```experiments/exp_synthetic/toyish_gaussian.ipynb``` | [TBD]() |
| Ex2MCMC for Mixture of Gaussians | ```experiments/exp_synthetic/ex2mcmc_mog.ipynb``` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1xmBOZr1YhN8E7Y8GuwjgdM7hqaCgE6ik?usp=sharing) |
| FlEx2MCMC for Mixture of Gaussians | ```experiments/exp_synthetic/flex_mog.ipynb``` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1Cy3Ion97F1kIWMNkF6wl-XODnfP9VQ5u?usp=sharing) |
| FlEx2MCMC for banana-shaped distribution | ```experiments/exp_synthetic/flex_banana.ipynb``` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/16ZjJH8id__6aPPeCPFEQXO86kvFBO1wb?usp=sharing) |
| FlEx2MCMC for Neal's funnel distribution | ```experiments/exp_synthetic/flex_funnel.ipynb``` | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/15MwmA3kY7sSPmk2i3mm1TUc_Vg-AJC8g?usp=sharing) |

To reproduce the experiments on the banana-shaped and funnel distributions:
```bash
python experiments/exp_synthetic/banana_funnel_metrics.py --distribution {banana,funnel} --device cuda:0
```

#### Experiments with GANs on MNIST dataset
```experiments/exp_mnist/JSGAN_samples.ipynb``` [TBD]()

```experiments/exp_mnist/WGAN_samples.ipynb``` [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1aarHEdILfnR-vB0j7NN4uBurti08BtZk?usp=sharing)
#### Experiments with GANs on CIFAR10 dataset
```experiments/exp_cifar10_demo/DCGAN_samples.ipynb```
```experiments/exp_cifar10_demo/SNGAN_samples.ipynb```
#### Sampling and FID computation
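For reference, the FID reported here is the Fréchet distance between Gaussian fits (feature mean and covariance) of generated and reference images. A minimal NumPy/SciPy sketch of that distance (the function name is ours; the repository's own implementation may differ):

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """FID between N(mu1, sigma1) and N(mu2, sigma2):
    ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 sigma2)^{1/2})."""
    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):  # sqrtm can return tiny imaginary noise
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

With the statistics downloaded above, the reference moments would be read from `stats/fid_stats_cifar10.npz` (assuming the standard pytorch-fid `.npz` layout with `mu` and `sigma` arrays).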
```bash
python experiments/exp_cifar10_fid/run.py configs/mcmc_configs/{ula,mala,isir,ex2mcmc,flex2mcmc}.yml configs/mmc_dcgan.yml
```

To run a full experiment:
```bash
chmod +x experiments/exp_cifar10_fid/run.sh && ./experiments/exp_cifar10_fid/run.sh
```

### Results
#### FID and Inception Score (CIFAR10)
| GAN | MCMC | Steps | Inception Score | FID |
|:----|:-----|:------:|:---------------:|:----:|
|DCGAN| none | 0 | 6.3 | 28.4 |
|DCGAN| i-SIR | 1k | 6.96 | 22.7 |
|DCGAN| MALA | 1k | 6.95 | 23.4 |
|DCGAN| Ex2MCMC (our) | 1k | 7.56 | 19.0 |
|DCGAN| FlEx2MCMC (our) | 1k | **7.92** | 19.2 |
|DCGAN| FlEx2MCMC (our) | 180 | 7.62 | **17.1** |

#### Energy landscape approximation (MNIST)
Projection of GAN samples onto the energy landscape when trained on the MNIST dataset:
#### Sampling trajectories (CIFAR10)
Generation trajectories for DCGAN.

* ULA:
* MALA:
* i-SIR:
* Ex2MCMC:
* FlEx2MCMC:
### Citation
```bibtex
@article{samsonov2022local,
title={Local-Global MCMC kernels: the best of both worlds},
author={Samsonov, Sergey and Lagutin, Evgeny and Gabri{\'e}, Marylou and Durmus, Alain and Naumov, Alexey and Moulines, Eric},
journal={Advances in Neural Information Processing Systems},
volume={35},
pages={5178--5193},
year={2022}
}
```