
# calotron

Transformer-based models to flash-simulate the LHCb ECAL detector

https://github.com/mbarbetti/calotron

### Transformers

| Models | Implementation | Generative ability* | Test | Design inspired by |
|:---------------------:|:--------------:|:-------------------:|:----:|:------------------:|
| `Transformer` | ✅ | ❌ | ✅ | [1](https://arxiv.org/abs/1706.03762), [4](https://arxiv.org/abs/2004.08249) |
| `OptionalTransformer` | ✅ | ❌ | ✅ | [1](https://arxiv.org/abs/1706.03762), [4](https://arxiv.org/abs/2004.08249) |
| `MaskedTransformer` | 🛠️ | ❌ | ❌ | |
| `GigaGenerator` | ✅ | ✅ | ✅ | [5](https://arxiv.org/abs/2303.05511), [6](https://arxiv.org/abs/2107.04589) |

*TBA
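
For intuition, the encoder-decoder idea behind these models can be sketched with plain Keras layers: a source sequence of reconstructed tracks is encoded with self-attention, and the target sequence of ECAL clusters attends to it via cross-attention. This is an illustrative toy, not calotron's actual API; the function name, feature counts, and hyperparameters below are all assumptions.

```python
# Toy sketch of the encoder-decoder pattern used by these models
# (illustrative only -- not calotron's API; shapes and
# hyperparameters are assumptions).
from tensorflow import keras

def toy_transformer(src_features=4, tgt_features=3, d_model=32, heads=4):
    src = keras.Input(shape=(None, src_features))  # (batch, n_tracks, feats)
    tgt = keras.Input(shape=(None, tgt_features))  # (batch, n_clusters, feats)

    h_src = keras.layers.Dense(d_model)(src)
    h_tgt = keras.layers.Dense(d_model)(tgt)

    # Encoder: self-attention over the track sequence.
    h_src = keras.layers.MultiHeadAttention(heads, d_model // heads)(h_src, h_src)
    # Decoder: cross-attention, clusters attend to the encoded tracks.
    h = keras.layers.MultiHeadAttention(heads, d_model // heads)(h_tgt, h_src)

    out = keras.layers.Dense(tgt_features)(h)      # predicted cluster features
    return keras.Model([src, tgt], out)

model = toy_transformer()
model.summary()
```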

### Discriminators

| Models | Algorithm | Implementation | Test | Design inspired by |
|:-----------------------:|:-----------:|:--------------:|:----:|:------------------:|
| `Discriminator` | DeepSets | ✅ | ✅ | [2](https://cds.cern.ch/record/2721094), [3](https://arxiv.org/abs/1703.06114) |
| `PairwiseDiscriminator` | DeepSets | ✅ | ✅ | [2](https://cds.cern.ch/record/2721094), [3](https://arxiv.org/abs/1703.06114) |
| `GNNDiscriminator` | GNN | 🛠️ | ❌ | |
| `GigaDiscriminator` | Transformer | ✅ | ✅ | [5](https://arxiv.org/abs/2303.05511), [6](https://arxiv.org/abs/2107.04589), [7](https://arxiv.org/abs/2006.04710) |
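
The DeepSets algorithm used by the first two discriminators ([2], [3]) scores a variable-length, unordered set of objects by embedding each element with a shared network and then pooling with a permutation-invariant operation. A minimal sketch of that idea follows; it is again illustrative only, not calotron's `Discriminator` API, and all names and layer sizes are assumptions.

```python
# Toy DeepSets-style set classifier (cf. refs [2], [3]):
# a shared per-element network, permutation-invariant pooling,
# then a head scoring the whole set (real vs. generated).
# Illustrative only -- not calotron's Discriminator API.
import tensorflow as tf
from tensorflow import keras

def toy_deepsets_discriminator(n_features=3, hidden=64):
    x = keras.Input(shape=(None, n_features))      # (batch, n_objects, feats)

    # phi: shared embedding applied independently to every object.
    h = keras.layers.Dense(hidden, activation="relu")(x)
    h = keras.layers.Dense(hidden, activation="relu")(h)

    # Average pooling over the set axis; any symmetric pooling
    # keeps the output invariant to object ordering.
    pooled = keras.layers.GlobalAveragePooling1D()(h)

    # rho: classify the pooled set representation.
    h = keras.layers.Dense(hidden, activation="relu")(pooled)
    out = keras.layers.Dense(1, activation="sigmoid")(h)
    return keras.Model(x, out)

disc = toy_deepsets_discriminator()
score = disc(tf.random.normal((8, 20, 3)))         # 8 sets of 20 objects
print(score.shape)                                 # (8, 1)
```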

### References
1. A. Vaswani _et al._, "Attention Is All You Need", [arXiv:1706.03762](https://arxiv.org/abs/1706.03762)
2. N.M. Hartman, M. Kagan and R. Teixeira De Lima, "Deep Sets for Flavor Tagging on the ATLAS Experiment", [ATL-PHYS-PROC-2020-043](https://cds.cern.ch/record/2721094)
3. M. Zaheer _et al._, "Deep Sets", [arXiv:1703.06114](https://arxiv.org/abs/1703.06114)
4. L. Liu _et al._, "Understanding the Difficulty of Training Transformers", [arXiv:2004.08249](https://arxiv.org/abs/2004.08249)
5. M. Kang _et al._, "Scaling up GANs for Text-to-Image Synthesis", [arXiv:2303.05511](https://arxiv.org/abs/2303.05511)
6. K. Lee _et al._, "ViTGAN: Training GANs with Vision Transformers", [arXiv:2107.04589](https://arxiv.org/abs/2107.04589)
7. H. Kim, G. Papamakarios and A. Mnih, "The Lipschitz Constant of Self-Attention", [arXiv:2006.04710](https://arxiv.org/abs/2006.04710)

### Credits
The transformer implementation is loosely inspired by the TensorFlow tutorial [Neural machine translation with a Transformer and Keras](https://www.tensorflow.org/text/tutorials/transformer) and the Keras tutorial [Image classification with Vision Transformer](https://keras.io/examples/vision/image_classification_with_vision_transformer).