https://github.com/louis-alexandre-laguet/realnvp
This repository implements REAL NVP (Real-valued Non-Volume Preserving), a normalizing flow model that uses coupling layers to transform complex data into a simpler distribution, like a Gaussian. It efficiently reconstructs realistic data from a normal distribution without iterative steps, making inference fast.
- Host: GitHub
- URL: https://github.com/louis-alexandre-laguet/realnvp
- Owner: louis-alexandre-laguet
- Created: 2025-03-22T09:21:29.000Z (about 2 months ago)
- Default Branch: main
- Last Pushed: 2025-03-22T10:29:49.000Z (about 2 months ago)
- Last Synced: 2025-03-22T11:27:13.186Z (about 2 months ago)
- Topics: ai, deep-learning, generative-models, normalizing-flow, real-nvp
- Language: Jupyter Notebook
- Homepage:
- Size: 2.14 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# REAL NVP (Real-valued Non-Volume Preserving)
REAL NVP is a **normalizing flow** model published in 2017 by **Laurent Dinh et al.**
at Google Brain. It belongs to the family of **normalizing flow models**, in which a complex
random variable is transformed bijectively into a simpler distribution, usually a Gaussian.

For inference, a sample is first drawn from a standard normal distribution. By applying the
inverse transformations learned by the model, a realistic data point is then reconstructed in
the original space. This process is fast and does not require iterative steps, unlike
traditional generative models.
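As a sketch of this sampling step (standard normalizing-flow notation, not taken from the repository's notebook), writing f for the learned forward map from data space to latent space, generation is a single inverse pass:

```latex
% Draw from the Gaussian base, then invert the learned transformation once:
z \sim \mathcal{N}(0, I), \qquad x = f_\theta^{-1}(z)
% Each layer is invertible in closed form, so f_theta^{-1} costs one pass,
% with no iterative refinement or Markov chain.
```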
The learning of this transformation relies on **coupling layers**, which keep inversion simple
while maintaining great flexibility in modeling the distribution.
Each coupling layer divides the data into two parts:
- One part is left unchanged
- One part is transformed parametrically based on the other (see the sketch after this list)
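To make the coupling mechanism concrete, here is a minimal affine coupling layer in PyTorch. It is a generic sketch of the technique, not the repository's notebook code: the framework choice, the split into equal halves, and the names (`AffineCoupling`, `scale_net`, `shift_net`) are all assumptions.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling layer: keeps x1 unchanged, transforms x2 conditioned on x1."""

    def __init__(self, dim, hidden=128):
        super().__init__()
        self.d = dim // 2  # size of the unchanged part
        # Small MLPs that predict a log-scale and a shift from the unchanged half.
        self.scale_net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, dim - self.d), nn.Tanh(),
        )
        self.shift_net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, dim - self.d),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s = self.scale_net(x1)          # log-scale, depends only on x1
        t = self.shift_net(x1)          # shift, depends only on x1
        y2 = x2 * torch.exp(s) + t      # transform the second half
        y = torch.cat([x1, y2], dim=1)  # first half passes through unchanged
        log_det = s.sum(dim=1)          # log|det J| is just the sum of log-scales
        return y, log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s = self.scale_net(y1)
        t = self.shift_net(y1)
        x2 = (y2 - t) * torch.exp(-s)   # exact closed-form inverse, no iteration
        return torch.cat([y1, x2], dim=1)
```

In the original paper the two parts are selected with binary masks (checkerboard or channel-wise); a plain split into halves is used here only for brevity.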
This alternating transformation ensures that, over the layers, **all dimensions of the data**
are gradually modified. It also allows an efficient calculation of the Jacobian, avoiding
the costly determinant computation usually associated with explicit density models.
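In formulas (standard change-of-variables notation, not taken from the repository):

```latex
% Change of variables used to evaluate the density of x under the flow f_theta:
\log p_X(x) = \log p_Z\big(f_\theta(x)\big)
            + \log\left|\det \frac{\partial f_\theta(x)}{\partial x^{\top}}\right|

% One affine coupling layer (d = size of the unchanged part):
y_{1:d} = x_{1:d}, \qquad
y_{d+1:D} = x_{d+1:D} \odot \exp\big(s(x_{1:d})\big) + t(x_{1:d})

% Its Jacobian is triangular, so the log-determinant reduces to a sum:
\log\left|\det \frac{\partial y}{\partial x^{\top}}\right| = \sum_{j} s(x_{1:d})_j
```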
The training of REAL NVP is based on **maximizing the likelihood** of the data,
enabling stable learning without the need for adversarial techniques.
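As an illustration, here is a minimal training and sampling sketch that reuses the hypothetical `AffineCoupling` layer above; the layer count, learning rate, and toy data are placeholders, not values from the repository:

```python
import torch
import torch.nn as nn

class RealNVP(nn.Module):
    """A stack of affine coupling layers with a standard normal base distribution."""

    def __init__(self, dim, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList([AffineCoupling(dim) for _ in range(n_layers)])

    def forward(self, x):
        log_det = torch.zeros(x.shape[0], device=x.device)
        for i, layer in enumerate(self.layers):
            x = torch.flip(x, dims=[1]) if i % 2 else x  # alternate which half is transformed
            x, ld = layer(x)
            log_det = log_det + ld
        return x, log_det

    def inverse(self, z):
        for i, layer in reversed(list(enumerate(self.layers))):
            z = layer.inverse(z)
            z = torch.flip(z, dims=[1]) if i % 2 else z  # undo the alternation
        return z

    def log_prob(self, x):
        z, log_det = self(x)
        base = torch.distributions.Normal(0.0, 1.0).log_prob(z).sum(dim=1)
        return base + log_det

# Training: minimize the negative log-likelihood (no discriminator, no adversarial loss).
model = RealNVP(dim=2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    # Toy data standing in for the real dataset: a shifted, scaled Gaussian.
    x = torch.randn(256, 2) * torch.tensor([2.0, 0.5]) + torch.tensor([1.0, -1.0])
    loss = -model.log_prob(x).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Sampling: draw from the Gaussian base and apply the inverse flow in a single pass.
with torch.no_grad():
    samples = model.inverse(torch.randn(64, 2))
```

The loss is exactly the negative log-likelihood given by the change-of-variables formula above, and sampling requires only one inverse pass through the stack, which is what makes inference fast.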