:package: Transformer-based models to flash-simulate the LHCb ECAL detector
https://github.com/mbarbetti/calotron
- Host: GitHub
- URL: https://github.com/mbarbetti/calotron
- Owner: mbarbetti
- License: gpl-3.0
- Created: 2022-11-24T10:49:17.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2024-07-02T17:40:45.000Z (4 months ago)
- Last Synced: 2024-10-09T12:32:44.377Z (30 days ago)
- Topics: calorimeter, deep-learning, flash-simulation, keras, lhcb-experiment, lhcb-lamarr, machine-learning, tensorflow, transformer, ultrafast-simulation
- Language: Python
- Homepage:
- Size: 2.49 MB
- Stars: 0
- Watchers: 2
- Forks: 1
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
Transformer-based models to flash-simulate the LHCb ECAL detector

### Transformers
| Models | Implementation | Generative ability* | Test | Design inspired by |
|:---------------------:|:--------------:|:-------------------:|:----:|:------------------:|
| `Transformer` | ✅ | ❌ | ✅ | [1](https://arxiv.org/abs/1706.03762), [4](https://arxiv.org/abs/2004.08249) |
| `OptionalTransformer` | ✅ | ❌ | ✅ | [1](https://arxiv.org/abs/1706.03762), [4](https://arxiv.org/abs/2004.08249) |
| `MaskedTransformer` | 🛠️ | ❌ | ❌ | |
| `GigaGenerator`        | ✅             | ✅                  | ✅   | [5](https://arxiv.org/abs/2303.05511), [6](https://arxiv.org/abs/2107.04589) |

*TBA
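For orientation, the generator models above follow the encoder-decoder attention pattern of [1]: an encoder self-attends over the conditioning sequence, and a decoder attends both to its own sequence and to the encoder output. The sketch below illustrates that pattern with plain Keras layers; all layer sizes, input shapes, and feature counts are invented for the example and do not reflect calotron's actual API.

```python
from tensorflow import keras


def transformer_block(query, context, num_heads=4, key_dim=32, ff_units=128):
    """Attention + feed-forward sub-block with residual connections, as in [1]."""
    # When query is context this is self-attention; otherwise cross-attention.
    attn = keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=key_dim)(
        query, context
    )
    x = keras.layers.LayerNormalization()(query + attn)
    # Position-wise feed-forward network.
    ff = keras.layers.Dense(ff_units, activation="relu")(x)
    ff = keras.layers.Dense(x.shape[-1])(ff)
    return keras.layers.LayerNormalization()(x + ff)


# Toy shapes: condition 8 output clusters on 16 input tracks, 4 features each.
tracks = keras.Input(shape=(16, 4))    # encoder input (conditioning sequence)
clusters = keras.Input(shape=(8, 4))   # decoder input (target sequence)

enc = keras.layers.Dense(64)(tracks)
enc = transformer_block(enc, enc)      # encoder self-attention

dec = keras.layers.Dense(64)(clusters)
dec = transformer_block(dec, dec)      # decoder self-attention
dec = transformer_block(dec, enc)      # cross-attention over the encoder output

# One (x, y, energy) triplet per generated cluster -- purely illustrative.
outputs = keras.layers.Dense(3)(dec)
model = keras.Model([tracks, clusters], outputs)
model.summary()
```

In a GAN-style flash-simulation setup, a model of this shape would map track-level inputs to calorimeter cluster observables, with a discriminator (next section) judging the output.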
### Discriminators
| Models | Algorithm | Implementation | Test | Design inspired by |
|:-----------------------:|:-----------:|:--------------:|:----:|:------------------:|
| `Discriminator` | DeepSets | ✅ | ✅ | [2](https://cds.cern.ch/record/2721094), [3](https://arxiv.org/abs/1703.06114) |
| `PairwiseDiscriminator` | DeepSets | ✅ | ✅ | [2](https://cds.cern.ch/record/2721094), [3](https://arxiv.org/abs/1703.06114) |
| `GNNDiscriminator` | GNN | 🛠️ | ❌ | |
| `GigaDiscriminator`     | Transformer | ✅             | ✅   | [5](https://arxiv.org/abs/2303.05511), [6](https://arxiv.org/abs/2107.04589), [7](https://arxiv.org/abs/2006.04710) |
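For the DeepSets rows above: a DeepSets classifier [3] applies a shared per-element network (φ) to each particle independently, pools the embeddings with a symmetric, order-invariant operation, and feeds the pooled summary to a classifier head (ρ). A minimal sketch, again with illustrative shapes rather than calotron's real interface:

```python
from tensorflow import keras

# Each event is an unordered set of 8 clusters with 3 features each
# (toy shapes chosen for the example, not calotron's real inputs).
clusters = keras.Input(shape=(8, 3))

# phi: a shared network embeds every set element independently.
phi = keras.layers.Dense(64, activation="relu")(clusters)
phi = keras.layers.Dense(64, activation="relu")(phi)

# Symmetric pooling makes the result invariant to the element order.
pooled = keras.layers.GlobalAveragePooling1D()(phi)

# rho: classify the pooled summary as real vs. generated.
rho = keras.layers.Dense(64, activation="relu")(pooled)
score = keras.layers.Dense(1, activation="sigmoid")(rho)

discriminator = keras.Model(clusters, score)
discriminator.summary()
```

The symmetric pooling step is what makes the score independent of the order in which particles are listed, which matters because the discriminator's input is an unordered set.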
### References

1. A. Vaswani _et al._, "Attention Is All You Need", [arXiv:1706.03762](https://arxiv.org/abs/1706.03762)
2. N.M. Hartman, M. Kagan and R. Teixeira De Lima, "Deep Sets for Flavor Tagging on the ATLAS Experiment", [ATL-PHYS-PROC-2020-043](https://cds.cern.ch/record/2721094)
3. M. Zaheer _et al._, "Deep Sets", [arXiv:1703.06114](https://arxiv.org/abs/1703.06114)
4. L. Liu _et al._, "Understanding the Difficulty of Training Transformers", [arXiv:2004.08249](https://arxiv.org/abs/2004.08249)
5. M. Kang _et al._, "Scaling up GANs for Text-to-Image Synthesis", [arXiv:2303.05511](https://arxiv.org/abs/2303.05511)
6. K. Lee _et al._, "ViTGAN: Training GANs with Vision Transformers", [arXiv:2107.04589](https://arxiv.org/abs/2107.04589)
7. H. Kim, G. Papamakarios and A. Mnih, "The Lipschitz Constant of Self-Attention", [arXiv:2006.04710](https://arxiv.org/abs/2006.04710)
### Credits

The Transformer implementation is loosely inspired by the TensorFlow tutorial [Neural machine translation with a Transformer and Keras](https://www.tensorflow.org/text/tutorials/transformer) and the Keras tutorial [Image classification with Vision Transformer](https://keras.io/examples/vision/image_classification_with_vision_transformer).