https://github.com/ml-jku/upt
Code for the paper Universal Physics Transformers
- Host: GitHub
- URL: https://github.com/ml-jku/upt
- Owner: ml-jku
- License: MIT
- Created: 2024-02-14T14:41:20.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-11T16:14:55.000Z (4 months ago)
- Last Synced: 2025-03-31T09:03:32.499Z (2 months ago)
- Topics: universal-physics-transformer
- Language: Python
- Homepage:
- Size: 1.37 MB
- Stars: 100
- Watchers: 5
- Forks: 12
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE.md
README
# Universal Physics Transformer (UPT)
[[`Project Page`](https://ml-jku.github.io/UPT)] [[`Paper`](https://arxiv.org/abs/2402.12365)] [[`Talk`](https://youtu.be/mfrmCPOn4bs)] [[`Tutorial`](https://github.com/BenediktAlkin/upt-tutorial)] [[`Codebase Demo Video`](https://youtu.be/80kc3hscTTg)] [[`BibTeX`](https://github.com/ml-jku/UPT#citation)]
# Train your own models
Instructions to set up the codebase in your own environment are provided in
[SETUP_CODE](https://github.com/ml-jku/UPT/blob/main/SETUP_CODE.md) and
[SETUP_DATA](https://github.com/ml-jku/UPT/blob/main/SETUP_DATA.md).
A video motivating the design choices of the codebase and giving an overview of it can be found [here](https://youtu.be/80kc3hscTTg).
Configurations to train models can be found [here](https://github.com/ml-jku/UPT/tree/main/src/yamls).
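As a rough illustration only, launching a run from one of these yaml configurations typically looks something like the sketch below; the script name, flags, and config filename are assumptions rather than documentation, so consult SETUP_CODE.md for the actual commands.
```bash
# hypothetical launch command -- entry point, flags, and config name are assumptions,
# not taken from this README; see SETUP_CODE.md for the real instructions
python main_train.py \
  --hp src/yamls/example_config.yaml \
  --devices 0
```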
# Citation
If you like our work, please consider giving it a star :star: and citing us:
```bibtex
@article{alkin2024upt,
  title={Universal Physics Transformers},
  author={Benedikt Alkin and Andreas Fürst and Simon Schmid and Lukas Gruber and Markus Holzleitner and Johannes Brandstetter},
  journal={arXiv preprint arXiv:2402.12365},
  year={2024}
}
```