# ![PhiFlow](docs/figures/Logo_DallE2_3_layout.png)

![Build Status](https://github.com/tum-pbs/PhiFlow/actions/workflows/unit-tests.yml/badge.svg)
[![PyPI pyversions](https://img.shields.io/pypi/pyversions/phiflow.svg)](https://pypi.org/project/phiflow/)
[![PyPI license](https://img.shields.io/pypi/l/phiflow.svg)](https://pypi.org/project/phiflow/)
[![Code Coverage](https://codecov.io/gh/tum-pbs/PhiFlow/branch/develop/graph/badge.svg)](https://codecov.io/gh/tum-pbs/PhiFlow/branch/develop/)
[![Google Collab Book](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/tum-pbs/PhiFlow/blob/develop/docs/Fluids_Tutorial.ipynb)

ΦFlow is an open-source simulation toolkit built for optimization and machine learning applications.
It is written mostly in Python and can be used with
[NumPy](https://numpy.org/),
[PyTorch](https://pytorch.org/),
[Jax](https://github.com/google/jax)
or [TensorFlow](https://www.tensorflow.org/).
The close integration with these machine learning frameworks allows it to leverage their automatic differentiation functionality,
making it easy to build end-to-end differentiable functions involving both learning models and physics simulations.
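The snippet below is a minimal sketch of the simulation side of such a setup: a buoyant smoke plume advected on a staggered velocity grid with a pressure projection. It assumes the `phi.flow` API of recent ΦFlow releases (`CenteredGrid`, `StaggeredGrid`, `advect`, `fluid.make_incompressible`); exact signatures may differ between versions.

```python
from phi.flow import *  # NumPy backend; use phi.torch.flow, phi.jax.flow or phi.tf.flow for ML backends

# Staggered velocity and cell-centered smoke density on a 64x64 grid
velocity = StaggeredGrid(0, extrapolation.ZERO, x=64, y=64, bounds=Box(x=100, y=100))
smoke = CenteredGrid(0, extrapolation.BOUNDARY, x=64, y=64, bounds=Box(x=100, y=100))
inflow = 0.2 * CenteredGrid(Sphere(x=50, y=10, radius=5), extrapolation.BOUNDARY, x=64, y=64, bounds=Box(x=100, y=100))

def step(velocity, smoke, dt=1.0):
    smoke = advect.mac_cormack(smoke, velocity, dt) + inflow   # transport smoke and add inflow
    buoyancy = (smoke * (0.0, 0.1)) @ velocity                 # resample buoyancy force to the staggered grid
    velocity = advect.semi_lagrangian(velocity, velocity, dt) + dt * buoyancy
    velocity, _ = fluid.make_incompressible(velocity)          # pressure projection via an implicit linear solve
    return velocity, smoke

for _ in range(100):
    velocity, smoke = step(velocity, smoke)
```

Because every operation runs through the chosen backend, a step function like this can be differentiated and combined with a neural network when PyTorch, Jax or TensorFlow is selected.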

## Examples

### Grids

Fluid logo • Wake flow • Lid-driven cavity • Taylor-Green • Smoke plume • Variable boundaries • Parallel simulations • Moving obstacles • Rotating bar • Multi-grid fluid • Higher-order Kolmogorov • Heat flow • Burgers' equation • Reaction-diffusion • Waves • Julia Set

### Mesh

Backward facing step • Heat flow • Mesh construction • Wake flow

### Particles

SPH • FLIP • Streamlines • Terrain • Gravity • Billiards • Ropes

### Optimization & Networks

Gradient Descent • Optimize throw • Learning to throw • PIV • Close packing • Learning Φ(x,y) • Differentiable pressure

## Installation

Installation with [pip](https://pypi.org/project/pip/) on [Python 3.6](https://www.python.org/downloads/) and above:
``` bash
$ pip install phiflow
```
Install [PyTorch](https://pytorch.org/), [TensorFlow](https://www.tensorflow.org/install) or [Jax](https://github.com/google/jax#installation) in addition to ΦFlow to enable machine learning capabilities and GPU execution.
To enable the web UI, also install [Dash](https://pypi.org/project/dash/).
For optimal GPU performance, you may compile the custom CUDA operators, see the [detailed installation instructions](https://tum-pbs.github.io/PhiFlow/Installation_Instructions.html).

You can verify your installation by running
```bash
$ python3 -c "import phi; phi.verify()"
```
This will check for compatible PyTorch, Jax and TensorFlow installations as well.

## Features

* Tight integration with PyTorch, Jax and TensorFlow for straightforward neural network training with fully differentiable simulations that can [run on the GPU](https://tum-pbs.github.io/PhiFlow/GPU_Execution.html#enabling-gpu-execution).
* Built-in PDE operations with a focus on fluid phenomena, allowing for concise formulation of simulations.
* Flexible, easy-to-use [web interface](https://tum-pbs.github.io/PhiFlow/Web_Interface.html) featuring live visualizations and interactive controls that can affect simulations or network training on the fly.
* Object-oriented, vectorized design for expressive code, ease of use, flexibility and extensibility.
* Reusable simulation code, independent of backend and dimensionality, i.e. the exact same code can run a 2D fluid sim using NumPy and a 3D fluid sim on the GPU using TensorFlow or PyTorch (see the sketch after this list).
* High-level linear equation solver with automated sparse matrix generation.
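As a sketch of the backend and dimensionality independence, the grid construction below is the same call in 2D and 3D; only the import and the resolution arguments change (the calls follow the `phi.flow` API and should be treated as an approximation for your installed version):

```python
# Choose the backend by choosing the import; the simulation code itself does not change.
# from phi.flow import *        # NumPy (CPU)
# from phi.jax.flow import *    # Jax
# from phi.tf.flow import *     # TensorFlow
from phi.torch.flow import *    # PyTorch (GPU if available)

velocity_2d = StaggeredGrid(0, extrapolation.ZERO, x=64, y=64, bounds=Box(x=100, y=100))
velocity_3d = StaggeredGrid(0, extrapolation.ZERO, x=64, y=64, z=64, bounds=Box(x=100, y=100, z=100))
```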

## 📖 Documentation and Tutorials
[**Documentation Overview**](https://tum-pbs.github.io/PhiFlow/)
  •   [**▶ YouTube Tutorials**](https://www.youtube.com/playlist?list=PLYLhRkuWBmZ5R6hYzusA2JBIUPFEE755O)
  •   [**API**](https://tum-pbs.github.io/PhiFlow/phi/)
  •   [**Demos**](https://github.com/tum-pbs/PhiFlow/tree/master/demos)
  •   [**Playground**](https://colab.research.google.com/drive/1zBlQbmNguRt-Vt332YvdTqlV4DBcus2S#offline=true&sandboxMode=true)

ΦFlow builds on the tensor functionality of [ΦML](https://github.com/tum-pbs/PhiML).
To understand how ΦFlow works, read about [named and typed dimensions](https://tum-pbs.github.io/PhiML/Introduction.html) first.
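The following is a minimal taste of named and typed dimensions, assuming ΦML's `batch`, `spatial` and `math.random_uniform` constructors as documented in the linked introduction:

```python
from phi import math
from phi.math import batch, spatial

# Dimensions carry a name and a type (batch, spatial, channel) instead of a positional index
temperature = math.random_uniform(batch(experiments=10), spatial(x=64, y=48))
print(temperature.shape)  # named, typed dimensions: experiments (batch), x and y (spatial)

# Reductions and slices address dimensions by name or type rather than by axis position
space_mean = math.mean(temperature, dim=spatial)
center_row = temperature.x[32]
```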

### Getting started

* [Installation instructions](https://tum-pbs.github.io/PhiFlow/Installation_Instructions.html)
* [Tensors](https://tum-pbs.github.io/PhiFlow/Math_Introduction.html) ([run on Colab](https://colab.research.google.com/github/tum-pbs/PhiFlow/blob/develop/docs/Math_Introduction.ipynb))
* [Fluids](https://tum-pbs.github.io/PhiFlow/Fluids_Tutorial.html) ([run on Colab](https://colab.research.google.com/github/tum-pbs/PhiFlow/blob/develop/docs/Fluids_Tutorial.ipynb))
* [Cookbook](https://tum-pbs.github.io/PhiFlow/Cookbook.html) ([run on Colab](https://colab.research.google.com/github/tum-pbs/PhiFlow/blob/develop/docs/Cookbook.ipynb))

### Physics

* [Grid-based fluids](https://tum-pbs.github.io/PhiFlow/Fluid_Simulation.html)
* [Higher-order schemes](https://tum-pbs.github.io/PhiFlow/Taylor_Green_Comparison.html)

### Fields

* [Overview](https://tum-pbs.github.io/PhiFlow/Fields.html)
* [Staggered grids](https://tum-pbs.github.io/PhiFlow/Staggered_Grids.html)
* [I/O](https://tum-pbs.github.io/PhiFlow/Reading_and_Writing_Data.html) & [scene format](https://tum-pbs.github.io/PhiFlow/Scene_Format_Specification.html)

### Geometry

* [Overview](https://tum-pbs.github.io/PhiFlow/Geometry.html)
* [Signed distance fields](https://tum-pbs.github.io/PhiFlow/SDF.html)
* [Heightmaps](https://tum-pbs.github.io/PhiFlow/Heightmaps.html)

### Tensors

* [▶️ Introduction Video](https://youtu.be/4nYwL8ZZDK8)
* [Introduction Notebook](https://tum-pbs.github.io/PhiFlow/Math_Introduction.html)
* [GPU execution](https://tum-pbs.github.io/PhiFlow/GPU_Execution.html#enabling-gpu-execution)

### Other

* [ΦFlow to Blender](https://github.com/intergalactic-mammoth/phiflow2blender)
* [What to Avoid](https://tum-pbs.github.io/PhiFlow/Known_Issues.html): How to keep your code compatible with PyTorch, TensorFlow and Jax
* [Legacy visualization](https://tum-pbs.github.io/PhiFlow/Visualization.html) & [Dash](https://tum-pbs.github.io/PhiFlow/Web_Interface.html) & [Console](https://tum-pbs.github.io/PhiFlow/ConsoleUI.html)
* [Legacy physics overview](https://tum-pbs.github.io/PhiFlow/Physics.html)

## 📄 Citation

Please use the following citation:

```
@inproceedings{holl2024phiflow,
  title={${\Phi}_{\text{Flow}}$ ({PhiFlow}): Differentiable Simulations for PyTorch, TensorFlow and Jax},
  author={Holl, Philipp and Thuerey, Nils},
  booktitle={International Conference on Machine Learning},
  year={2024},
  organization={PMLR}
}
```

## Publications

We will upload a whitepaper soon.
In the meantime, please cite the ICLR 2020 paper.

* [Learning to Control PDEs with Differentiable Physics](https://ge.in.tum.de/publications/2020-iclr-holl/), *Philipp Holl, Vladlen Koltun, Nils Thuerey*, ICLR 2020.
* [Solver-in-the-Loop: Learning from Differentiable Physics to Interact with Iterative PDE-Solvers](https://arxiv.org/abs/2007.00016), *Kiwon Um, Raymond Fei, Philipp Holl, Robert Brand, Nils Thuerey*, NeurIPS 2020.
* [ΦFlow: A Differentiable PDE Solving Framework for Deep Learning via Physical Simulations](https://montrealrobotics.ca/diffcvgp/), *Nils Thuerey, Kiwon Um, Philipp Holl*, DiffCVGP workshop at NeurIPS 2020.
* [Physics-based Deep Learning](https://physicsbaseddeeplearning.org/intro.html) (book), *Nils Thuerey, Philipp Holl, Maximilian Mueller, Patrick Schnell, Felix Trost, Kiwon Um*.
* [Half-Inverse Gradients for Physical Deep Learning](https://arxiv.org/abs/2203.10131), *Patrick Schnell, Philipp Holl, Nils Thuerey*, ICLR 2022.
* [Scale-invariant Learning by Physics Inversion](https://arxiv.org/abs/2109.15048), *Philipp Holl, Vladlen Koltun, Nils Thuerey*, NeurIPS 2022.

## Benchmarks & Data Sets

ΦFlow has been used in the creation of various public data sets, such as
[PDEBench](https://github.com/pdebench/PDEBench) and [PDEarena](https://microsoft.github.io/pdearena/).

[See more packages that use ΦFlow](https://github.com/tum-pbs/PhiFlow/network/dependents)

## 🕒 Version History

The [version history](https://github.com/tum-pbs/PhiFlow/releases) lists all major changes across releases.
The releases are also listed on [PyPI](https://pypi.org/project/phiflow/).

## 👥 Contributions

Contributions are welcome! Check out [this document](CONTRIBUTING.md) for guidelines.

## Acknowledgements

This work is supported by the ERC Starting Grant realFlow (StG-2015-637014) and the Intel Intelligent Systems Lab.