Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lucidrains/vn-transformer
A Transformer made of Rotation-equivariant Attention using Vector Neurons
- Host: GitHub
- URL: https://github.com/lucidrains/vn-transformer
- Owner: lucidrains
- License: MIT
- Created: 2022-06-12T18:13:05.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-08-01T14:49:11.000Z (over 1 year ago)
- Last Synced: 2024-10-23T11:43:11.259Z (28 days ago)
- Topics: artificial-intelligence, attention-mechanism, deep-learning, geometric-deep-learning, transformers
- Language: Python
- Size: 392 KB
- Stars: 82
- Watchers: 7
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## VN (Vector Neuron) Transformer
A Transformer made of Rotation-equivariant Attention using Vector Neurons
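For context, the core idea behind Vector Neurons: each feature channel carries a 3-D vector rather than a scalar, a rotation acts on the spatial axis, and a VN linear layer only mixes channels, so the two operations commute. A minimal sketch of that commutation property in plain `torch` (the names `V`, `W`, `R` are illustrative, not the library's API):

```python
import torch

# a vector-neuron feature is a list of 3D vectors, one per channel: shape (channels, 3)
V = torch.randn(16, 3)

# a VN linear layer mixes channels only (no bias), here as a plain matrix
W = torch.randn(32, 16)

# a rotation acts on the spatial axis; QR of a Gaussian matrix gives an orthogonal R
R, _ = torch.linalg.qr(torch.randn(3, 3))

# channel mixing commutes with rotation: W (V R) == (W V) R
assert torch.allclose(W @ (V @ R), (W @ V) @ R, atol = 1e-5)
```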
## Appreciation
- StabilityAI for the generous sponsorship, as well as my other sponsors, for affording me the independence to open source artificial intelligence.
## Install
```bash
$ pip install VN-transformer
```

## Usage
```python
import torch
from VN_transformer import VNTransformer

model = VNTransformer(
    dim = 64,
    depth = 2,
    dim_head = 64,
    heads = 8,
    dim_feat = 64,       # will default to early fusion, since this was the best performing
    bias_epsilon = 1e-6  # the paper proposes breaking equivariance with a tiny bit of bias noise in the VN linear, claiming improved stability; set this to 0 for exact equivariance
)

coors = torch.randn(1, 32, 3)   # (batch, sequence, spatial coordinates)
feats = torch.randn(1, 32, 64)  # (batch, sequence, features)

coors_out, feats_out = model(coors, feats = feats) # (1, 32, 3), (1, 32, 64)
```
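Since `bias_epsilon` is the only term that breaks equivariance, setting it to 0 makes the model exactly rotation-equivariant: rotating the input coordinates should rotate the output coordinates by the same matrix. A minimal sketch of that check, assuming only the public API shown above (the tolerance may need loosening in float32):

```python
import torch
from VN_transformer import VNTransformer

model = VNTransformer(
    dim = 64,
    depth = 2,
    dim_head = 64,
    heads = 8,
    dim_feat = 64,
    bias_epsilon = 0.  # turn off the bias noise so equivariance is exact
)

coors = torch.randn(1, 32, 3)
feats = torch.randn(1, 32, 64)

# random rotation: orthogonalize a Gaussian matrix, flip sign if it is a reflection
R, _ = torch.linalg.qr(torch.randn(3, 3))
if torch.det(R) < 0:
    R = -R

coors_out, _ = model(coors, feats = feats)
coors_rot_out, _ = model(coors @ R, feats = feats)

# rotating the inputs should rotate the outputs identically (up to float tolerance)
assert torch.allclose(coors_rot_out, coors_out @ R, atol = 1e-4)
```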
## Tests

For confidence in equivariance, run the test suite:
```bash
$ python setup.py test
```

## Example
First install `sidechainnet`:
```bash
$ pip install sidechainnet
```

Then run the protein backbone denoising task:
```bash
$ python denoise.py
```

It does not perform as well as En-Transformer or Equiformer.
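For orientation, a minimal sketch of what a backbone-denoising objective of this kind typically looks like; the actual `denoise.py` script will differ in details such as data loading from `sidechainnet`, batching, and hyperparameters:

```python
import torch
import torch.nn.functional as F
from VN_transformer import VNTransformer

model = VNTransformer(
    dim = 64,
    depth = 2,
    dim_head = 64,
    heads = 8,
    dim_feat = 64
)

opt = torch.optim.Adam(model.parameters(), lr = 1e-4)

# stand-in batch; in the real script the backbone coordinates come from sidechainnet
coors = torch.randn(1, 32, 3)
feats = torch.randn(1, 32, 64)

noised = coors + torch.randn_like(coors) * 0.1  # corrupt coordinates with Gaussian noise

denoised, _ = model(noised, feats = feats)
loss = F.mse_loss(denoised, coors)  # train the model to recover the clean coordinates

loss.backward()
opt.step()
opt.zero_grad()
```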
## Citations
```bibtex
@inproceedings{Assaad2022VNTransformerRA,
title = {VN-Transformer: Rotation-Equivariant Attention for Vector Neurons},
author = {Serge Assaad and C. Downey and Rami Al-Rfou and Nigamaa Nayakanti and Benjamin Sapp},
year = {2022}
}
```

```bibtex
@article{Deng2021VectorNA,
title = {Vector Neurons: A General Framework for SO(3)-Equivariant Networks},
author = {Congyue Deng and Or Litany and Yueqi Duan and Adrien Poulenard and Andrea Tagliasacchi and Leonidas J. Guibas},
journal = {2021 IEEE/CVF International Conference on Computer Vision (ICCV)},
year = {2021},
pages = {12180-12189},
url = {https://api.semanticscholar.org/CorpusID:233394028}
}
```

```bibtex
@inproceedings{Kim2020TheLC,
title = {The Lipschitz Constant of Self-Attention},
author = {Hyunjik Kim and George Papamakarios and Andriy Mnih},
booktitle = {International Conference on Machine Learning},
year = {2020},
url = {https://api.semanticscholar.org/CorpusID:219530837}
}
```

```bibtex
@inproceedings{dao2022flashattention,
title = {Flash{A}ttention: Fast and Memory-Efficient Exact Attention with {IO}-Awareness},
author = {Dao, Tri and Fu, Daniel Y. and Ermon, Stefano and Rudra, Atri and R{\'e}, Christopher},
booktitle = {Advances in Neural Information Processing Systems},
year = {2022}
}
```