https://github.com/lucidrains/global-self-attention-network
A PyTorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
- Host: GitHub
- URL: https://github.com/lucidrains/global-self-attention-network
- Owner: lucidrains
- License: mit
- Created: 2020-10-02T21:11:56.000Z (about 5 years ago)
- Default Branch: main
- Last Pushed: 2020-11-21T21:55:48.000Z (almost 5 years ago)
- Last Synced: 2024-12-10T14:21:49.792Z (10 months ago)
- Topics: artificial-intelligence, attention, attention-mechanism, image-classification, self-attention
- Language: Python
- Homepage:
- Size: 95.7 KB
- Stars: 93
- Watchers: 7
- Forks: 7
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## Global Self-attention Network
An implementation of Global Self-Attention Network, which proposes an all-attention vision backbone that achieves better results than convolutions with fewer parameters and less compute.
They use a previously discovered linear attention variant with a small modification for further gains (no normalization of the queries), paired with relative positional attention, computed axially for efficiency.
The result is an extremely simple circuit composed of 8 einsums, 1 softmax, and normalization.
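For intuition, here is a minimal sketch of what the unnormalized-query linear attention step could look like in isolation. It is an assumption made for illustration (the tensor layout, shapes, and the `content_attention` helper are not from this repository), and it omits the axial relative positional attention.

```python
import torch

# Minimal sketch (an assumption, not the repository's code) of linear attention
# with softmax-normalized keys and unnormalized queries, computed in two einsums.
def content_attention(q, k, v):
    # q, k: (batch, heads, dim_key, positions)    v: (batch, heads, dim_value, positions)
    k = k.softmax(dim = -1)                            # normalize keys over spatial positions
    context = torch.einsum('bhdn,bhen->bhde', k, v)    # aggregate values: (batch, heads, dim_key, dim_value)
    return torch.einsum('bhdn,bhde->bhen', q, context) # queries read the context, no query normalization

q = torch.randn(1, 8, 32, 16 * 16)
k = torch.randn(1, 8, 32, 16 * 16)
v = torch.randn(1, 8, 64, 16 * 16)
print(content_attention(q, k, v).shape)  # torch.Size([1, 8, 64, 256])
```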
## Install
```bash
$ pip install gsa-pytorch
```

## Usage
```python
import torch
from gsa_pytorch import GSA

gsa = GSA(
dim = 3,
dim_out = 64,
dim_key = 32,
heads = 8,
rel_pos_length = 256 # in paper, set to max(height, width). you can also turn this off by omitting this line
)

x = torch.randn(1, 3, 256, 256)
gsa(x) # (1, 64, 256, 256)
```
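Since the module preserves the spatial resolution of its input, a GSA layer can in principle be dropped in wherever a convolutional block would sit. The snippet below is a hypothetical sketch of stacking two GSA layers into a small classifier; the stage widths, pooling layers, and 10-class head are illustrative assumptions, not part of this repository.

```python
import torch
from torch import nn
from gsa_pytorch import GSA

# Hypothetical backbone sketch: layer sizes and pooling are illustrative assumptions
model = nn.Sequential(
    GSA(dim = 3, dim_out = 64, dim_key = 32, heads = 8, rel_pos_length = 64),
    nn.MaxPool2d(2),                  # 64 x 64 -> 32 x 32
    GSA(dim = 64, dim_out = 128, dim_key = 32, heads = 8, rel_pos_length = 32),
    nn.AdaptiveAvgPool2d(1),          # global average pool to 1 x 1
    nn.Flatten(),
    nn.Linear(128, 10)                # e.g. a 10-class head
)

x = torch.randn(1, 3, 64, 64)
print(model(x).shape)  # torch.Size([1, 10])
```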
## Citations

```bibtex
@inproceedings{
anonymous2021global,
title={Global Self-Attention Networks},
author={Anonymous},
booktitle={Submitted to International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=KiFeuZu24k},
note={under review}
}
```