https://github.com/zichunhao/gnn-jet-autoencoder
Graph neural network autoencoders for jets in HEP
- Host: GitHub
- URL: https://github.com/zichunhao/gnn-jet-autoencoder
- Owner: zichunhao
- License: mit
- Created: 2022-07-27T08:24:43.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2023-04-06T17:18:00.000Z (over 2 years ago)
- Last Synced: 2025-01-18T02:34:26.353Z (9 months ago)
- Topics: anomaly-detection, autoencoder, compression, deep-learning, graph-neural-network, graph-neural-networks, high-energy-physics, machine-learning, message-passing-neural-network, particle-physics, permutation-equivariant, permutation-invariance, pytorch
- Language: Python
- Homepage:
- Size: 635 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Citation: CITATION.cff
README
# Graph Neural Network Autoencoder for Jets
[DOI](https://zenodo.org/badge/latestdoi/518371541)

# Overview
A graph neural network autoencoder (GNNAE) for jets in particle physics, implemented in PyTorch and used mainly as a baseline for the [LGAE](https://github.com/zichunhao/lgn-autoencoder).
## Data
To download data:
1. Install `JetNet`:
```
pip3 install jetnet
```
2. Run `preprocess.py` (a sketch for inspecting the downloaded data follows these steps):
```
python utils/data/preprocess.py \
--jet-types g q t w z \
--save-dir "./data"
```
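As a quick sanity check, the data can also be loaded directly through the `jetnet` package, independently of `preprocess.py`. The snippet below is a minimal sketch assuming the `JetNet.getData` interface of recent `jetnet` releases; check the API of your installed version.

```
# Minimal sketch: inspect JetNet gluon jets directly via the jetnet package.
# This is independent of preprocess.py; the argument names follow recent
# jetnet releases and may differ in your installed version.
from jetnet.datasets import JetNet

particle_data, jet_data = JetNet.getData(jet_type="g", data_dir="./data")
print(particle_data.shape)  # (num_jets, num_particles, num_particle_features)
print(jet_data.shape)       # (num_jets, num_jet_features)
```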
## Training
To train the model, run `train.py`. An example is provided in `examples/train.sh`.

## Architecture
Both the encoder and decoder are built upon the `GraphNet` architecture implemented in [models/graphnet.py](models/graphnet.py), which is a fully connected message-passing neural network.
The message passing step of `GraphNet` is shown in the diagram below. Here, $d$ is any distance function, and `EdgeNet` and `NodeNet` are the edge and node functions at the $t$-th message passing step, respectively; both are MLPs with LeakyReLU activations.
*(Diagram: one message passing step of `GraphNet`.)*
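For concreteness, the snippet below sketches one such fully connected message passing step in PyTorch. It is an illustrative reconstruction from the description above, not the repository's exact implementation: the hidden sizes, the Euclidean choice for $d$, and the sum aggregation are assumptions.

```
# Illustrative sketch of one fully connected message passing step, in the
# spirit of models/graphnet.py. Layer sizes, the Euclidean distance for d,
# and sum aggregation are assumptions, not the repository's exact code.
import torch
import torch.nn as nn


class MessagePassingStep(nn.Module):
    def __init__(self, node_dim: int, edge_dim: int, hidden_dim: int = 64):
        super().__init__()
        # EdgeNet: maps (x_i, x_j, d(x_i, x_j)) -> edge message
        self.edge_net = nn.Sequential(
            nn.Linear(2 * node_dim + 1, hidden_dim),
            nn.LeakyReLU(),
            nn.Linear(hidden_dim, edge_dim),
            nn.LeakyReLU(),
        )
        # NodeNet: maps (x_i, aggregated messages) -> updated node features
        self.node_net = nn.Sequential(
            nn.Linear(node_dim + edge_dim, hidden_dim),
            nn.LeakyReLU(),
            nn.Linear(hidden_dim, node_dim),
            nn.LeakyReLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_particles, node_dim)
        batch, n, _ = x.shape
        x_i = x.unsqueeze(2).expand(-1, -1, n, -1)  # receiver features
        x_j = x.unsqueeze(1).expand(-1, n, -1, -1)  # sender features
        dist = torch.norm(x_i - x_j, dim=-1, keepdim=True)  # d(x_i, x_j)
        messages = self.edge_net(torch.cat([x_i, x_j, dist], dim=-1))
        aggregated = messages.sum(dim=2)  # sum over senders
        return self.node_net(torch.cat([x, aggregated], dim=-1))


# Example: 2 jets, 30 particles each, 3 features per particle (e.g., eta, phi, pt)
step = MessagePassingStep(node_dim=3, edge_dim=16)
updated = step(torch.randn(2, 30, 3))  # -> shape (2, 30, 3)
```

Aggregating the edge messages with a sum keeps the node update independent of the particle ordering, consistent with the permutation-invariance emphasis in the repository's topics.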