https://github.com/amazon-science/probconserv
Datasets and code for results presented in the ProbConserv paper
- Host: GitHub
- URL: https://github.com/amazon-science/probconserv
- Owner: amazon-science
- License: apache-2.0
- Created: 2023-02-08T15:49:48.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-06-14T20:47:45.000Z (12 months ago)
- Last Synced: 2024-12-05T16:22:21.218Z (6 months ago)
- Topics: conservation-laws, downstream-tasks, partial-differential-equations, porous-media-flow, shock-capturing, uncertainty-quantification
- Language: Python
- Homepage:
- Size: 34.7 MB
- Stars: 51
- Watchers: 7
- Forks: 9
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
README
# ProbConserv: Probabilistic Framework to Enforce Conservation Laws
[wemake-python-styleguide](https://github.com/wemake-services/wemake-python-styleguide)
[Derek Hansen](http://www-personal.umich.edu/~dereklh/), [Danielle C. Maddix](https://dcmaddix.github.io/), [Shima Alizadeh](https://scholar.google.com/citations?user=r3qS03kAAAAJ&hl=en), [Gaurav Gupta](http://guptagaurav.me/index.html), [Michael W. Mahoney](https://www.stat.berkeley.edu/~mmahoney/) \
**Learning Physical Models that Can Respect Conservation Laws** \
[Proceedings of the 40th International Conference on Machine Learning (ICML)](https://proceedings.mlr.press/v202/hansen23b/hansen23b.pdf), PMLR 202:12469-12510, 2023.

## Installation
This project uses [poetry](https://python-poetry.org/) to manage dependencies. From the root directory:
```
poetry install
```

Some of the plots require that certain LaTeX packages be installed. On Ubuntu, these are:
```
sudo apt install cm-super dvipng texlive-latex-extra texlive-fonts-recommended
```

You can then use `poetry run` followed by a command, or `poetry shell` to open a shell with the correct virtual environment.
To run the tests:
```
poetry run pytest
```
The code for this project is located in the `deep_pdes` folder. It consists of two libraries, `attentive_neural_process` and `datasets`, which contain the models and the datasets, respectively.
These libraries are imported by the scripts in `experiments` that configure and run the specific case studies explored in the ProbConserv paper.
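For orientation, the import structure implied by that layout is roughly as follows; this is a sketch based on the folder description above, and the concrete classes and functions inside each package are not listed here.

```python
# Rough sketch of the package layout described above (illustrative, not exhaustive).
from deep_pdes import attentive_neural_process  # model code (ANP variants and baselines)
from deep_pdes import datasets                  # synthetic PDE dataset generation
```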
## Running experiments
The experiment code in `deep_pdes/experiments` uses [Hydra](https://hydra.cc/) to manage configuration and run experiments. The different stages of the experiments are broken into distinct commands for easier reproducibility:
- `generate.py`: Generate synthetic datasets for training
- `train.py`: Train ProbConserv-ANP, ANP, and other baseline methods such as Physics-Informed Neural Networks (PINNs)
- `analyze.py`: Evaluate the trained models on test datasets and create tables/plots from the results.
- `plots.py`: Generate all plots used in the submission. Does not use the Hydra CLI but uses the compose API internally (see the sketch after this list).
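Since `plots.py` bypasses the Hydra CLI, it composes its configuration programmatically. The following is a minimal sketch of Hydra's compose API; the `config_path`, `config_name`, and override values are illustrative placeholders, not the exact ones used in this repo.

```python
# Minimal sketch of Hydra's compose API (paths and names here are illustrative).
from hydra import compose, initialize
from omegaconf import OmegaConf

# initialize() points Hydra at a config directory relative to this file;
# compose() then builds the config object, applying overrides as the CLI would.
with initialize(config_path="conf"):
    cfg = compose(config_name="config", overrides=["+experiments=2b_stefan_var_p"])
    print(OmegaConf.to_yaml(cfg))  # inspect the composed configuration
```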
Each script is run by passing an `+experiments=*` flag. The available experiments can be found in `deep_pdes/experiments/conf/experiments`. For example, to recreate the results on the Stefan GPME setting:

```
EXPERIMENT=2b_stefan_var_p
python generate.py +experiments=$EXPERIMENT
python train.py +experiments=$EXPERIMENT +train=${EXPERIMENT}_anp
python train.py +experiments=$EXPERIMENT +train=${EXPERIMENT}_pinp
python analyze.py +experiments=$EXPERIMENT
```
These commands are also available in convenience scripts; for example, the above is in `deep_pdes/experiments/2b_stefan_var_p.sh`.

**Solution Profiles and UQ for the Stefan Equation**

**Downstream Task: Shock location detection**

For the diffusion equation with constant diffusivity, see `deep_pdes/experiments/3b_heat_var_c.sh`.
**Conservation of mass** can be violated by black-box deep learning models, even when the PDE is applied as a soft constraint in the loss function, as in Physics-Informed Neural Networks (PINNs). The true mass for this diffusion equation is zero over time, since there is zero net flux through the domain boundaries and mass cannot be created or destroyed in the interior.
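As a reminder of why total mass is the quantity to check: integrating the conservation law over the spatial domain ties the change in total mass to the net boundary flux, which is zero here. In generic notation (with $u$ the solution and $F(u)$ its flux), not the paper's exact symbols:

$$
\frac{\mathrm{d}}{\mathrm{d}t}\int_{\Omega} u(x,t)\,\mathrm{d}x \;=\; -\oint_{\partial\Omega} F(u)\cdot n \,\mathrm{d}s \;=\; 0,
$$

so the total mass remains at its initial value, which is zero in this setting.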
## Sources

This repo contains modified versions of the code found in the following repos:
- `https://github.com/a1k12/characterizing-pinns-failure-modes`: For diffusion/heat equation analytical solution (MIT license)
- `https://github.com/soobinseo/Attentive-Neural-Process`: For implementation of the Attentive Neural Process (Apache 2.0 license)

## Citation
If you use this code, or our work, please cite:
```
@inproceedings{hansen2023learning,
  title={Learning Physical Models that Can Respect Conservation Laws},
  author={Hansen, Derek and Maddix, Danielle C. and Alizadeh, Shima and Gupta, Gaurav and Mahoney, Michael W.},
  booktitle={International Conference on Machine Learning},
  year={2023},
  volume={202},
  pages={12469--12510},
  organization={PMLR}
}
```