Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lucidrains/tranception-pytorch
Implementation of Tranception, an attention network, paired with retrieval, that is SOTA for protein fitness prediction
- Host: GitHub
- URL: https://github.com/lucidrains/tranception-pytorch
- Owner: lucidrains
- License: mit
- Created: 2022-06-02T20:58:26.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2022-06-19T15:42:16.000Z (over 2 years ago)
- Last Synced: 2024-10-15T00:16:50.201Z (about 1 month ago)
- Topics: artificial-intelligence, attention-mechanism, deep-learning, protein-fitness-prediction, retrieval, transformers
- Language: Python
- Homepage:
- Size: 206 KB
- Stars: 31
- Watchers: 3
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
## Tranception - Pytorch (wip)
Implementation of Tranception, an attention network paired with retrieval that is SOTA for protein fitness prediction. The Transformer architecture is inspired by Primer and uses ALiBi relative positional encoding.
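For context, ALiBi replaces learned positional embeddings with a static, head-specific linear penalty added to the attention logits, proportional to the query-key distance. Below is a minimal sketch of that bias (using the standard geometric head slopes from the ALiBi paper; this is illustrative, not code from this repo):

```python
import torch

def alibi_bias(num_heads, seq_len):
    # head-specific slopes: geometric sequence 2^(-8 * (h + 1) / num_heads)
    slopes = torch.tensor([2 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])

    # relative position (j - i); for causal attention keys satisfy j <= i,
    # so the bias is non-positive and penalizes distant keys linearly
    pos = torch.arange(seq_len)
    rel = pos[None, :] - pos[:, None]  # (seq_len, seq_len)

    # bias to add to attention logits, shape (num_heads, seq_len, seq_len)
    return slopes[:, None, None] * rel[None, :, :]

bias = alibi_bias(8, 512)  # (8, 512, 512)
```

Because the bias is a fixed function of distance, it adds no learned parameters and extrapolates to sequence lengths longer than those seen in training.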
## Install
```bash
$ pip install tranception-pytorch
```

## Usage
```python
import torch
from tranception_pytorch import Tranception

model = Tranception(
dim = 512,
depth = 6,
heads = 8,
dim_head = 64
)

amino_acids = torch.randint(0, 21, (1, 512))
logits = model(amino_acids) # (1, 512, 21)
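# a hedged sketch, not part of this repo's documented API: for an
# autoregressive model, a sequence's fitness is commonly scored as the
# sum of per-position log-probabilities assigned to the observed residues
log_probs = logits.log_softmax(dim = -1)  # (1, 512, 21)
fitness = log_probs.gather(-1, amino_acids.unsqueeze(-1)).squeeze(-1).sum(dim = -1)  # (1,)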
```

## Todo
- [x] grouped heads with customizable depthwise convs (for variable k-mers), as well as grouped ALiBi pos bias
- [ ] figure out attention to retrieved (looks like axial attention?)
- [ ] play around with ProteinGym, and start betting on huggingface's accelerate

## Citations
```bibtex
@article{Notin2022TranceptionPF,
title = {Tranception: protein fitness prediction with autoregressive transformers and inference-time retrieval},
author = {Pascal Notin and Mafalda Dias and Jonathan Frazer and Javier Marchena-Hurtado and Aidan N. Gomez and Debora S. Marks and Yarin Gal},
journal = {ArXiv},
year = {2022},
volume = {abs/2205.13760}
}
```