A Julia framework for invertible neural networks
- Host: GitHub
- URL: https://github.com/slimgroup/InvertibleNetworks.jl
- Owner: slimgroup
- License: mit
- Created: 2020-02-07T20:38:09.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2024-10-28T20:09:34.000Z (11 days ago)
- Last Synced: 2024-10-30T05:56:50.863Z (9 days ago)
- Topics: bayesian-inference, deep-invertible-networks, deep-learning, invertible-1x1-convolutions, invertible-neural-networks, julia, julia-language, machine-learning, normalizing-flow, normalizing-flows
- Language: Julia
- Homepage:
- Size: 2.19 MB
- Stars: 154
- Watchers: 12
- Forks: 23
- Open Issues: 15
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Citation: CITATION.bib
Awesome Lists containing this project
- awesome-sciml - slimgroup/InvertibleNetworks.jl: A Julia framework for invertible neural networks
- awesome-normalizing-flows - InvertibleNetworks.jl
README
# InvertibleNetworks.jl
| **Documentation** | **Build Status** | **JOSS paper** |
|:-----------------:|:-----------------:|:----------------:|
|[![](https://img.shields.io/badge/docs-stable-blue.svg)](https://slimgroup.github.io/InvertibleNetworks.jl/stable/) [![](https://img.shields.io/badge/docs-dev-blue.svg)](https://slimgroup.github.io/InvertibleNetworks.jl/dev/)| [![CI](https://github.com/slimgroup/InvertibleNetworks.jl/actions/workflows/runtests.yml/badge.svg)](https://github.com/slimgroup/InvertibleNetworks.jl/actions/workflows/runtests.yml)| [![DOI](https://joss.theoj.org/papers/10.21105/joss.06554/status.svg)](https://doi.org/10.21105/joss.06554)|

Building blocks for invertible neural networks in the [Julia] programming language.
- Memory efficient building blocks for invertible neural networks
- Hand-derived gradients, Jacobians $J$, and $\log |J|$
- [Flux] integration
- Support for [Zygote] and [ChainRules]
- GPU support
- Includes various examples of invertible neural networks, normalizing flows, variational inference, and uncertainty quantification

## Installation
InvertibleNetworks.jl is registered and can be added like any standard Julia package from the Pkg REPL mode:
```
] add InvertibleNetworks
```
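Equivalently, outside the Pkg REPL you can install it through the Pkg API:

```
using Pkg
Pkg.add("InvertibleNetworks")
```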
## Uncertainty-aware image reconstruction

Due to its favorable memory scaling, InvertibleNetworks.jl has been particularly successful at Bayesian posterior sampling with simulation-based inference. To get started with this application, see the simple example ([Conditional sampling for MNIST inpainting](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/conditional_sampling/amortized_glow_mnist_inpainting.jl)), feel free to modify the script for your own application, and please reach out to us if you need help.
![mnist_sampling_cond](docs/src/figures/mnist_sampling_cond.png)
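As a rough, hypothetical illustration of this amortized workflow, the sketch below trains a conditional flow on pairs of images `X` and observations `Y` and then draws posterior samples for a fixed observation. The constructor `NetworkConditionalGlow`, the `forward`/`backward`/`inverse` signatures, and the helpers `get_params` and `clear_grad!` are assumptions about the API made for illustration only; the linked MNIST inpainting script is the authoritative example.

```
# Hypothetical sketch only: names and signatures below are assumptions,
# not a verbatim copy of the package API.
using InvertibleNetworks, Flux

nx, ny, n_in, n_cond, batchsize = 28, 28, 1, 1, 8
n_hidden, L, K = 32, 2, 4

X = randn(Float32, nx, ny, n_in, batchsize)    # stand-in for target images
Y = randn(Float32, nx, ny, n_cond, batchsize)  # stand-in for observed data

G   = NetworkConditionalGlow(n_in, n_cond, n_hidden, L, K)  # assumed constructor
opt = Flux.ADAM(1f-3)

# One maximum-likelihood step: map (X, Y) to latent space and backpropagate
ZX, ZY, logdet = G.forward(X, Y)
G.backward(ZX / batchsize, ZX, ZY)   # gradient of 0.5*norm(ZX)^2/batchsize - logdet

for p in get_params(G)
    Flux.update!(opt, p.data, p.grad)
end
clear_grad!(G)

# After training: draw posterior samples for the observation Y
ZX_prior = randn(Float32, nx, ny, n_in, batchsize)
X_post   = G.inverse(ZX_prior, ZY)
```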
## Building blocks
- 1x1 Convolutions using Householder transformations ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_convolution_1x1.jl))
- Residual block ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_residual_block.jl))
- Invertible coupling layer from Dinh et al. (2017) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_glow.jl))
- Invertible hyperbolic layer from Lensink et al. (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hyperbolic.jl))
- Invertible coupling layer from Putzky and Welling (2019) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_irim.jl))
- Invertible recursive coupling layer HINT from Kruse et al. (2020) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_coupling_hint.jl))
- Activation normalization (Kingma and Dhariwal, 2018) ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/layers/layer_actnorm.jl); see also the interface sketch after this list)
- Various activation functions (Sigmoid, ReLU, leaky ReLU, GaLU)
- Objective and misfit functions (mean squared error, log-likelihood)
- Dimensionality manipulation: squeeze/unsqueeze (column, patch, checkerboard), split/cat
- Squeeze/unsqueeze using the wavelet transform
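All of the layers above follow a common pattern: `forward` returns the output (and the log-determinant when `logdet=true`), `inverse` reconstructs the input from the output, and a hand-derived `backward` propagates gradients while recomputing the input from the output instead of storing it. The sketch below illustrates this pattern with the activation-normalization layer; the `inverse` and `backward` signatures are assumed here, so consult the linked layer examples for the exact interface.

```
using InvertibleNetworks

k = 10
X = randn(Float32, 64, 64, k, 4)

AN = ActNorm(k; logdet=true)

# Forward pass: output plus the log-determinant term
Y, logdet = AN.forward(X)

# Invertibility: the input is recovered from the output
X_rec = AN.inverse(Y)
@assert isapprox(X, X_rec; rtol=1f-3)

# Backward pass: propagate a data-space gradient ΔY through the layer;
# X is recomputed from Y rather than stored during the forward pass
ΔY = randn(Float32, size(Y)...)
ΔX, X_ = AN.backward(ΔY, Y)
```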
## Examples
- Invertible recurrent inference machines (Putzky and Welling, 2019) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_irim.jl))
- Generative models with maximum likelihood via the change of variable formula ([example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/applications/application_glow_banana_dist.jl))
- Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) ([generic example](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/examples/networks/network_glow.jl), [source](https://github.com/slimgroup/InvertibleNetworks.jl/tree/master/src/networks/invertible_network_glow.jl); a minimal training sketch follows this list)
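These generative examples train by minimizing the change-of-variables negative log-likelihood, which for a standard normal latent is $\frac{1}{2}\|z\|^2 - \log|\det J|$ per sample, up to an additive constant. Below is a minimal, hypothetical training step with the Glow network; the `NetworkGlow(n_in, n_hidden, L, K)` constructor arguments and the `get_params`/`clear_grad!` helpers are assumptions, so see the linked generic example for the exact interface.

```
using InvertibleNetworks, Flux, LinearAlgebra

nx, ny, n_in, batchsize = 16, 16, 1, 16
n_hidden, L, K = 32, 2, 4

G   = NetworkGlow(n_in, n_hidden, L, K)   # assumed constructor signature
opt = Flux.ADAM(1f-3)

# Toy image batch: nx x ny x n_in x batchsize
X = randn(Float32, nx, ny, n_in, batchsize)

# Forward pass to latent space; logdet accumulates log|det J|
Z, logdet = G.forward(X)

# Negative log-likelihood (up to a constant) under a standard normal latent
nll = 0.5f0 * norm(Z)^2 / batchsize - logdet

# Hand-derived backward pass populates the parameter gradients
G.backward(Z / batchsize, Z)

for p in get_params(G)
    Flux.update!(opt, p.data, p.grad)
end
clear_grad!(G)
```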
## GPU support
GPU support is provided via Flux/CuArray. To use the GPU, move the input and the network layer to the GPU via `|> gpu`:
```
using InvertibleNetworks, Flux

# Input
nx = 64
ny = 64
k = 10
batchsize = 4

# Input image: nx x ny x k x batchsize
X = randn(Float32, nx, ny, k, batchsize) |> gpu

# Activation normalization
AN = ActNorm(k; logdet=true) |> gpu

# Test invertibility
Y_, logdet = AN.forward(X)
```
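The same `|> gpu` pattern applies to entire networks, not just individual layers. A small sketch, again assuming the `NetworkGlow(n_in, n_hidden, L, K)` constructor used for illustration above:

```
using InvertibleNetworks, Flux

# Move a whole network and a batch of inputs to the GPU
G = NetworkGlow(1, 32, 2, 4) |> gpu   # assumed constructor signature
X = randn(Float32, 16, 16, 1, 4) |> gpu
Z, logdet = G.forward(X)
```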
## Reference

If you use InvertibleNetworks.jl in your research, we would be grateful if you cited us with the following BibTeX entry:
```
@article{Orozco2024,
  doi       = {10.21105/joss.06554},
  url       = {https://doi.org/10.21105/joss.06554},
  year      = {2024},
  publisher = {The Open Journal},
  volume    = {9},
  number    = {99},
  pages     = {6554},
  author    = {Rafael Orozco and Philipp Witte and Mathias Louboutin and Ali Siahkoohi and Gabrio Rizzuti and Bas Peters and Felix J. Herrmann},
  title     = {InvertibleNetworks.jl: A Julia package for scalable normalizing flows},
  journal   = {Journal of Open Source Software}
}
```

## Papers
The following publications use [InvertibleNetworks.jl]:
- **["Reliable amortized variational inference with physics-based latent distribution correction"]**
- paper: [https://arxiv.org/abs/2207.11640](https://arxiv.org/abs/2207.11640)
- [presentation](https://slim.gatech.edu/Publications/Public/Submitted/2022/siahkoohi2022ravi/slides.pdf)
- code: [ReliableAVI.jl]- **["Learning by example: fast reliability-aware seismic imaging with normalizing flows"]**
- paper: [https://arxiv.org/abs/2104.06255](https://arxiv.org/abs/2104.06255)
- [presentation](https://slim.gatech.edu/Publications/Public/Conferences/KAUST/2021/siahkoohi2021EarthMLfar/siahkoohi2021EarthMLfar.pdf)
- code: [ReliabilityAwareImaging.jl]- **["Enabling uncertainty quantification for seismic data pre-processing using normalizing flows (NF)—an interpolation example"]**
- [paper](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2021/kumar2021SEGeuq/kumar2021SEGeuq.pdf)
- code: [WavefieldRecoveryUQ.jl]- **["Preconditioned training of normalizing flows for variational inference in inverse problems"]**
- paper: [https://arxiv.org/abs/2101.03709](https://arxiv.org/abs/2101.03709)
- [presentation](https://slim.gatech.edu/Publications/Public/Conferences/AABI/2021/siahkoohi2021AABIpto/siahkoohi2021AABIpto_pres.pdf)
- code: [FastApproximateInference.jl]- **["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]**
- paper: [https://arxiv.org/abs/2004.07871](https://arxiv.org/abs/2004.07871)
- [presentation](https://slim.gatech.edu/Publications/Public/Conferences/SEG/2020/rizzuti2020SEGuqavp/rizzuti2020SEGuqavp_pres.pdf)
- code: [https://github.com/slimgroup/Software.SEG2020](https://github.com/slimgroup/Software.SEG2020)- **["Generalized Minkowski sets for the regularization of inverse problems"]**
- paper: [http://arxiv.org/abs/1903.03942](http://arxiv.org/abs/1903.03942)
- code: [SetIntersectionProjection.jl]## Contributing
We welcome contributions and bug reports!
Please see [CONTRIBUTING.md](https://github.com/slimgroup/InvertibleNetworks.jl/blob/master/CONTRIBUTING.md) for guidance.

InvertibleNetworks.jl development subscribes to the [Julia Community Standards](https://julialang.org/community/standards/).
## Authors
- Rafael Orozco, Georgia Institute of Technology [[email protected]]
- Philipp Witte, Georgia Institute of Technology (now Microsoft)
- Gabrio Rizzuti, Utrecht University
- Mathias Louboutin, Georgia Institute of Technology
- Ali Siahkoohi, Georgia Institute of Technology
## Acknowledgments
This package uses functions from [NNlib.jl](https://github.com/FluxML/NNlib.jl), [Flux.jl](https://github.com/FluxML/Flux.jl), and [Wavelets.jl](https://github.com/JuliaDSP/Wavelets.jl).
[Flux]:https://fluxml.ai
[Julia]:https://julialang.org
[Zygote]:https://github.com/FluxML/Zygote.jl
[ChainRules]:https://github.com/JuliaDiff/ChainRules.jl
[InvertibleNetworks.jl]:https://github.com/slimgroup/InvertibleNetworks.jl
["Learning by example: fast reliability-aware seismic imaging with normalizing flows"]:https://slim.gatech.edu/content/learning-example-fast-reliability-aware-seismic-imaging-normalizing-flows
["Enabling uncertainty quantification for seismic data pre-processing using normalizing flows (NF)—an interpolation example"]:https://slim.gatech.edu/content/ultra-low-memory-seismic-inversion-randomized-trace-estimation-0
["Preconditioned training of normalizing flows for variational inference in inverse problems"]:https://slim.gatech.edu/content/preconditioned-training-normalizing-flows-variational-inference-inverse-problems
[ReliabilityAwareImaging.jl]:https://github.com/slimgroup/Software.SEG2021/tree/main/ReliabilityAwareImaging.jl
[WavefieldRecoveryUQ.jl]:https://github.com/slimgroup/Software.SEG2021/tree/main/WavefieldRecoveryUQ.jl
[FastApproximateInference.jl]:https://github.com/slimgroup/Software.siahkoohi2021AABIpto
["Generalized Minkowski sets for the regularization of inverse problems"]:https://slim.gatech.edu/content/generalized-minkowski-sets-regularization-inverse-problems-1
[SetIntersectionProjection.jl]:https://github.com/slimgroup/SetIntersectionProjection.jl
["Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization"]:https://slim.gatech.edu/content/parameterizing-uncertainty-deep-invertible-networks-application-reservoir-characterization
["Reliable amortized variational inference with physics-based latent distribution correction"]:https://slim.gatech.edu/content/reliable-amortized-variational-inference-physics-based-latent-distribution-correction
[ReliableAVI.jl]:https://github.com/slimgroup/ReliableAVI.jl