Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/AWehenkel/Graphical-Normalizing-Flows
Combining smooth constraint for building DAG with normalizing flow in order to replace autoregressive transformations while keeping tractable Jacobian.
- Host: GitHub
- URL: https://github.com/AWehenkel/Graphical-Normalizing-Flows
- Owner: AWehenkel
- License: other
- Created: 2020-02-04T10:43:35.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2023-09-21T13:22:05.000Z (about 1 year ago)
- Last Synced: 2024-07-12T12:39:20.018Z (4 months ago)
- Language: Python
- Size: 524 KB
- Stars: 43
- Watchers: 4
- Forks: 11
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: license
Awesome Lists containing this project
- awesome-normalizing-flows - Graphical Normalizing Flows
README
# Graphical Normalizing Flows
Official code and experiments for the paper:
> Graphical Normalizing Flows, Antoine Wehenkel and Gilles Louppe. (May 2020).
> [[arxiv]](https://arxiv.org/abs/2006.02548)
# Dependencies
The list of dependencies can be found in the requirements.txt file and installed with the following command:
```bash
pip install -r requirements.txt
```
# Code architecture
This repository provides code to build diverse types of normalizing flow models in PyTorch. The core components are located in the **models** folder. The different flow models are described in **NormalizingFlow.py**, and they all follow the structure of the parent class **NormalizingFlow**.
A flow step is usually designed as the combination of a **normalizer** (such as the ones described in the **Normalizers** sub-folder) with a **conditioner** (such as the ones described in the **Conditioners** sub-folder). Following this code hierarchy makes implementing new conditioners, normalizers, or even complete flow architectures very easy.
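The conditioner/normalizer split above can be illustrated with a minimal sketch. This is not the repository's API: the function names and the toy linear conditioner are assumptions made purely for illustration (a real conditioner would be a neural network), but the structure mirrors the idea that the conditioner computes transformation parameters while the normalizer applies an invertible element-wise map with a tractable log-Jacobian determinant.

```python
import math

def coupling_conditioner(x_fixed):
    """Toy coupling conditioner: derive (log_scale, shift) for the
    transformed half from the untouched half. Illustrative only."""
    s = sum(x_fixed)
    return [0.1 * s] * len(x_fixed), [0.5 * s] * len(x_fixed)

def affine_normalizer(x, log_scale, shift):
    """Affine normalizer: z = x * exp(log_scale) + shift.
    Its log|det J| is simply the sum of the log-scales."""
    z = [xi * math.exp(ls) + b for xi, ls, b in zip(x, log_scale, shift)]
    return z, sum(log_scale)

def flow_step(x):
    """One coupling-style flow step: the first half of the input
    conditions the transformation of the second half, so the Jacobian
    stays block-triangular and its determinant stays tractable."""
    half = len(x) // 2
    x_a, x_b = x[:half], x[half:]
    log_scale, shift = coupling_conditioner(x_a)
    z_b, log_det = affine_normalizer(x_b, log_scale, shift)
    return x_a + z_b, log_det

z, log_det = flow_step([1.0, 2.0, 0.5, -0.5])
```

Swapping the conditioner (autoregressive, coupling, or the paper's graphical/DAG conditioner) or the normalizer (affine, monotonic) while keeping this interface is exactly the modularity the code hierarchy is designed for.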
# Paper's experiments
## UCI Datasets
You first have to download the datasets with the following command:
```bash
python UCIdatasets/download_dataset.py
```
Then you can run the experiment of your choice with the following command:
```bash
python UCIExperiments.py -load_config <config-name>
```
where `<config-name>` selects the experimental configuration loaded from the *UCIExperimentsConfigurations.yml* file, e.g. *power-mono-DAG*.
See also UCIExperiments.py for other optional arguments.
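As a rough illustration of how such a named configuration maps to experiment settings, here is a hedged sketch; the dictionary keys and fields below are assumptions for illustration only, not the actual contents of *UCIExperimentsConfigurations.yml*.

```python
# Hypothetical lookup of a named experiment configuration such as
# "power-mono-DAG" (dataset "power", Monotonic normalizer, DAG conditioner).
# Field names are illustrative assumptions, not the real YAML schema.
configurations = {
    "power-mono-DAG": {
        "dataset": "power",
        "normalizer": "Monotonic",
        "conditioner": "DAG",
    },
}

def load_config(name):
    """Return the settings for a named configuration, failing loudly
    on an unknown name, as a CLI entry point typically would."""
    try:
        return configurations[name]
    except KeyError:
        raise ValueError(f"Unknown configuration: {name}")

cfg = load_config("power-mono-DAG")
```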
## MNIST
### Affine Normalizers
##### Graphical Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Affine -conditioner DAG -nb_flow 1 -nb_steps_dual 10 -l1 0. -prior_A_kernel 2
```
##### Autoregressive Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Affine -conditioner Autoregressive -nb_flow 1 -emb_net 1024 1024 1024 2
```
##### Coupling Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Affine -conditioner Coupling -nb_flow 1 -emb_net 1024 1024 1024 2
```
### Monotonic Normalizers
##### Graphical Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Monotonic -conditioner DAG -nb_flow 1 -nb_steps_dual 10 -l1 0. -prior_A_kernel 2
```
##### Autoregressive Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Monotonic -conditioner Autoregressive -nb_flow 1 -emb_net 1024 1024 1024 30
```
##### Coupling Conditioner
```bash
python ImageExperiments.py -dataset MNIST -b_size 100 -normalizer Monotonic -conditioner Coupling -nb_flow 1 -emb_net 1024 1024 1024 30
```