https://github.com/veb-101/attention-and-transformers
Transformers goes brrr... Attention and Transformers from scratch in TensorFlow. Currently contains Vision transformers, MobileViT-v1, MobileViT-v2, MobileViT-v3
attention-mechanism mobile mobilevit mobilevitv1 mobilevitv2 mobilevitv3 tensorflow tensorflow2 transformer vision-transformer
- Host: GitHub
- URL: https://github.com/veb-101/attention-and-transformers
- Owner: veb-101
- License: mit
- Created: 2022-09-10T17:31:49.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2025-04-02T11:56:21.000Z (3 months ago)
- Last Synced: 2025-04-02T12:36:47.472Z (3 months ago)
- Topics: attention-mechanism, mobile, mobilevit, mobilevitv1, mobilevitv2, mobilevitv3, tensorflow, tensorflow2, transformer, vision-transformer
- Language: Python
- Homepage:
- Size: 250 KB
- Stars: 13
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## Attention mechanisms and Transformers
### Updates: I'm moving the codebase to a [new repository](https://github.com/veb-101/keras-vision) and rewriting it using the latest Keras 3.x version for multi-backend support. It will also include the pretrained model weights in Keras format.
[Python](https://www.python.org/) [TensorFlow releases](https://github.com/tensorflow/tensorflow/releases/) [PyPI](https://badge.fury.io/py/Attention-and-Transformers) [TensorFlow](https://www.tensorflow.org/)
* The goal of this repository is to host basic architecture and model training code associated with the different attention mechanisms and transformer architectures.
* At the moment, I'm more interested in learning and recreating these new architectures from scratch than in full-fledged training. For now, I'll just be training these models on small datasets.

#### Installation
* Using pip to install from [pypi](https://pypi.org/project/Attention-and-Transformers/)
```bash
pip install Attention-and-Transformers
```

* Using pip to install the latest version from GitHub
```bash
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
```

* Local clone and install
```bash
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install
```

**Example Use**
```bash
python load_test.py
```

**Attention Mechanisms**

| # No. | Mechanism | Paper |
| :---: | :--- | :--- |
| 1 | Multi-Head Self Attention | Attention Is All You Need |
| 2 | Separable Self Attention | Separable Self-attention for Mobile Vision Transformers |
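As a quick sketch of the first mechanism, TensorFlow ships a built-in multi-head attention layer; the batch, token, and embedding sizes below are illustrative and not taken from this repo's code.

```python
import tensorflow as tf

# Toy input: batch of 2 sequences of 16 patch tokens, embedding dim 64.
x = tf.random.normal((2, 16, 64))

# Self-attention: query, key, and value all come from the same tensor.
mha = tf.keras.layers.MultiHeadAttention(num_heads=4, key_dim=16)
y = mha(query=x, value=x, key=x)

print(y.shape)  # (2, 16, 64) -- output keeps the input token/embedding shape
```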
**Transformer Models**

| # No. | Models | Paper |
| :---: | :--- | :--- |
| 1 | Vision Transformer (ViT) | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-v1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-v2 | Separable Self-attention for Mobile Vision Transformers |
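The separable self-attention idea from the MobileViT-v2 paper replaces the quadratic token-to-token score matrix with a single learned context vector, giving linear cost in the number of tokens. A minimal from-scratch sketch in TensorFlow (class and variable names are mine, not the repo's actual API):

```python
import tensorflow as tf
from tensorflow.keras import layers


class SeparableSelfAttention(layers.Layer):
    """Separable self-attention sketch (MobileViT-v2 style), O(n) in tokens."""

    def __init__(self, dim, **kwargs):
        super().__init__(**kwargs)
        self.to_scores = layers.Dense(1)   # one context score per token
        self.to_key = layers.Dense(dim)
        self.to_value = layers.Dense(dim)
        self.to_out = layers.Dense(dim)

    def call(self, x):
        # x: (batch, tokens, dim)
        scores = tf.nn.softmax(self.to_scores(x), axis=1)          # (B, N, 1)
        # Single context vector: score-weighted sum of keys over tokens.
        context = tf.reduce_sum(scores * self.to_key(x), axis=1,
                                keepdims=True)                     # (B, 1, D)
        # Broadcast the context over every value token, then project.
        out = tf.nn.relu(self.to_value(x)) * context               # (B, N, D)
        return self.to_out(out)


x = tf.random.normal((2, 16, 64))
y = SeparableSelfAttention(64)(x)
print(y.shape)  # (2, 16, 64)
```

Unlike standard attention, no N×N matrix is ever materialized, which is what makes the mechanism mobile-friendly.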