https://github.com/mathisgerdes/bijx
Bijections & normalizing flows with JAX/NNX
- Host: GitHub
- URL: https://github.com/mathisgerdes/bijx
- Owner: mathisgerdes
- License: apache-2.0
- Created: 2025-07-31T15:18:22.000Z (6 months ago)
- Default Branch: master
- Last Pushed: 2025-12-08T09:14:32.000Z (about 2 months ago)
- Last Synced: 2025-12-10T06:26:21.478Z (about 2 months ago)
- Topics: bijections, flax, jax, lattice-field-theory, lie-groups, nnx, normalizing-flows, sampling
- Language: Python
- Homepage: https://www.mathisgerdes.com/bijx/
- Size: 2.52 MB
- Stars: 11
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Citation: CITATION.cff
README
bijection + jax = /ˈbaɪdʒæks/
# Bijections & normalizing flows with JAX/NNX
[GitHub](https://github.com/mathisgerdes/bijx)
[Documentation](https://mathisgerdes.github.io/bijx/)
[Tests](https://github.com/mathisgerdes/bijx/actions/workflows/tests.yml)
[Coverage](https://codecov.io/github/mathisgerdes/bijx)
[PyPI](https://pypi.org/project/bijx/)
[DOI](https://doi.org/10.21105/joss.09521)
This library provides flexible tools for building normalizing flows and bijections with a tractable change of density, focusing on research and applications in physics. It aims to provide reusable building blocks rather than a simplified interface for common use cases.
The library is built around two fundamental mathematical objects:
- **Bijections**: Invertible transformations that track their effect on probability densities.
- **Distributions**: Probability distributions with methods for sampling and density evaluation.
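Schematically, the two interfaces fit together as follows. This is a rough orientation sketch, not the actual bijx base classes; the method names are taken from the quickstart below.

```python
# Orientation sketch only -- not bijx's actual class definitions.
class Bijection:
    def forward(self, x, log_density):
        """Map x to y and update the log-density by the change of variables."""
        ...

    def reverse(self, y, log_density):
        """Apply the inverse map and undo the log-density update."""
        ...

class Distribution:
    def sample(self, batch_shape, rng):
        """Draw samples together with their log-densities."""
        ...
```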
## Key Features
Core design goals:
- **Modular Building Blocks**: Composable components rather than monolithic frameworks
- **Research-Focused**: Prioritizes flexibility and expressiveness over convenience
- **Flax NNX Integration**: Modern state management for neural network components
Physics & advanced methods:
- **Continuous Normalizing Flows**: CNFs as core primitive with flexible ODE solver backends
- **Matrix Lie Group Operations**: Automatic differentiation on matrix groups
- **Structure-Preserving Integration**: Crouch-Grossmann solvers for ODEs on matrix groups
- **Symmetry-Aware Architectures**: Equivariant layers and transformations (e.g., lattice-symmetric CNFs)
- **Fourier-Space Operations**: Tools for momentum space transformations and complex field decomposition
## Quickstart
Here is a minimal example of building and sampling from a simple normalizing flow. We transform samples from a base distribution (a standard normal) using a chain of bijections to produce samples from a new, transformed distribution.
```python
import jax
import jax.numpy as jnp
import bijx
from flax import nnx

# Define a simple base distribution
prior = bijx.IndependentNormal(event_shape=(2,))

# Choose a base bijection for the coupling layer
base_bijection = bijx.ModuleReconstructor(
    # for RealNVP-style coupling, use an affine-linear base bijection
    bijx.AffineLinear(rngs=nnx.Rngs(0))
)

# Build a simple coupling layer
coupling_layer = bijx.GeneralCouplingLayer(
    # replace with a more realistic NN (active -> params)
    embedding_net=lambda x: jnp.zeros(x.shape + (base_bijection.params_total_size,)),
    mask=bijx.BinaryMask.from_boolean_mask(jnp.array([True, False])),
    bijection_reconstructor=base_bijection,
)

# Build a very simple continuous flow
# note: automatic divergence computation is expensive in high dimensions
cnf = bijx.ContFlowDiffrax(
    # vf must return the divergence too; AutoJacVF computes it for us
    vf=bijx.AutoJacVF(
        lambda t, x: -x,
    )
)

# Compose bijections
flow = bijx.Chain(
    bijx.Shift(jnp.array([5.0, -2.0])),
    bijx.Scaling(jnp.array([2.0, 0.5])),
    # add more coupling layers for expressivity
    coupling_layer,
    # final continuous flow
    cnf,
)

# Sample from the base distribution
rng = jax.random.PRNGKey(42)
x, lp_x = prior.sample(batch_shape=(5,), rng=rng)

# Transform the samples with the flow
y, lp_y = flow.forward(x, lp_x)

# The same samples can be obtained by combining prior and flow into a new distribution
dist = bijx.Transformed(prior, flow)
y2, lp_y2 = dist.sample(batch_shape=(5,), rng=rng)

# exactly the same, since we used the same random key
assert jnp.allclose(y, y2)

# Check bijectivity
x_rec, lp_x_rec = flow.reverse(y, lp_y)
assert jnp.allclose(x, x_rec, atol=1e-6)
assert jnp.allclose(lp_x, lp_x_rec, atol=1e-6)
```
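Because the components above are Flax NNX modules, a flow like this can be optimized with standard JAX tooling. The following is a hedged sketch of a reverse-KL training objective, continuing from the quickstart; `target_log_prob` is a hypothetical user-supplied unnormalized log-density, and the optimizer loop (e.g. with Optax) is left out:

```python
def target_log_prob(y):
    # hypothetical unnormalized target: a unit-width Gaussian centered at 1
    return -0.5 * jnp.sum((y - 1.0) ** 2, axis=-1)

def reverse_kl_loss(flow, rng, batch_size=128):
    # model density q: the prior pushed through the (trainable) flow
    dist = bijx.Transformed(prior, flow)
    y, lp_y = dist.sample(batch_shape=(batch_size,), rng=rng)
    # Monte Carlo estimate of E_q[log q(y) - log p(y)], up to p's normalization
    return jnp.mean(lp_y - target_log_prob(y))
```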
## Design Principles
- **Modularity**: Different parts of the library should be usable on their own. Expose as many expressive building blocks as possible.
- **Flexibility**: Prioritize flexibility, sometimes at the cost of adding more ways to break things.
- **Runtime Shape Inference**: Use a `batch + space + channels` convention and automatic vectorization for flexible data shape handling.
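The shape convention can be illustrated with the objects from the quickstart above. This is a hedged sketch: the exact behavior with multi-axis batch shapes is an assumption based on the stated `batch + space + channels` convention.

```python
# batch axes lead, the event axis (size 2 here) trails
x, lp_x = prior.sample(batch_shape=(3, 5), rng=jax.random.PRNGKey(0))
# expected under the convention: x.shape == (3, 5, 2) and lp_x.shape == (3, 5)

# bijections are expected to vectorize over the leading batch axes
y, lp_y = flow.forward(x, lp_x)
```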
## Installation
This package can be installed directly from PyPI:
```bash
pip install bijx
```
Alternatively, to install locally from source:
```bash
pip install -e .
```
For development, install as an editable package with all dependencies (optionally without `docs` if documentation building is not needed):
```bash
pip install -e ".[dev,docs]"
```
To keep the codebase tidy, please install pre-commit (`pip install pre-commit`) and run `pre-commit install` before committing changes.
## Documentation
To build the documentation and serve it locally, run `make livehtml` in the `docs/` directory.
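For example (assuming the `docs` extra from the installation section is installed):

```bash
pip install -e ".[dev,docs]"   # once, to get the documentation dependencies
cd docs/
make livehtml                  # build the docs and serve them locally
```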
## Testing
There are two types of tests: the unit tests in `tests/` and the docstring examples in the source code under `src/bijx/`.
```bash
# Run tests
pytest tests/
# Run doctests
pytest src/bijx/ --doctest-modules
# Run all tests
pytest tests/ src/bijx/ --doctest-modules
# Run tests in parallel using xdist
pytest -n auto
```
## Citation
If you use bijx in your research, please cite our JOSS paper:
```bibtex
@article{Gerdes2025Bijx,
  doi       = {10.21105/joss.09521},
  url       = {https://doi.org/10.21105/joss.09521},
  year      = {2025},
  publisher = {The Open Journal},
  volume    = {10},
  number    = {116},
  pages     = {9521},
  author    = {Gerdes, Mathis and Cheng, Miranda C. N.},
  title     = {Bijx: Bijections and normalizing flows with JAX/NNX},
  journal   = {Journal of Open Source Software}
}
```
### Related Publications
This library extends research software developed for the following publications, which you may find useful and consider citing if you use the relevant components:
- **[Learning Lattice Quantum Field Theories with Equivariant Continuous Flows](https://arxiv.org/abs/2207.00283)**
- **[Non-Perturbative Trivializing Flows for Lattice Gauge Theories](https://arxiv.org/abs/2410.13161)**
- **[Analytic Bijections for Smooth and Interpretable Normalizing Flows](https://arxiv.org/abs/2601.10774)**
## Module Layout
The library is organized into core mathematical tools, machine learning components, and specific applications.
```
bijx/
├── __init__.py        # Main package exports
├── utils.py           # General utilities
├── distributions.py   # Core Distribution classes
├── samplers.py        # Sampling helpers
├── solvers.py         # ODE solver implementations
│
├── bijections/        # All bijection-related code
│   ├── base.py        # Core Bijection, Chain, etc.
│   ├── continuous.py  # Continuous flow wrappers
│   ├── conv_cnf.py    # Convolutional CNF architecture
│   ├── coupling.py    # Coupling layers
│   └── ...
│
├── nn/                # Neural network components (NNX)
│   ├── conv.py        # Symmetric convolutions
│   ├── embeddings.py  # Time and positional embeddings
│   └── features.py    # Reusable feature-mappers
│
├── fourier.py         # Tools for Fourier-space operations
├── lie.py             # General-purpose Lie group operations
├── cg.py              # Crouch-Grossmann ODE integrators
│
└── lattice/           # Application: Lattice Field Theory
    ├── scalar.py      # Action and observables for phi^4 theory
    └── gauge.py       # Gauge field symmetry operations
```