https://github.com/veb-101/attention-and-transformers

Transformers goes brrr... Attention and Transformers from scratch in TensorFlow. Currently contains Vision transformers, MobileViT-v1, MobileViT-v2, MobileViT-v3

attention-mechanism mobile mobilevit mobilevitv1 mobilevitv2 mobilevitv3 tensorflow tensorflow2 transformer vision-transformer

## Attention mechanisms and Transformers

### Updates: I'm moving the codebase to a [new repository](https://github.com/veb-101/keras-vision) and rewriting it using the latest Keras 3.x version for multi-backend support. I will also port the pretrained weights of the models written here.

[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/Attention-and-Transformers)](https://www.python.org/) [![TensorFlow](https://img.shields.io/badge/Tensorflow-2.10%20%7C%202.11-orange?logo=tensorflow)](https://github.com/tensorflow/tensorflow/releases/) [![PyPI version](https://badge.fury.io/py/Attention-and-Transformers.svg)](https://badge.fury.io/py/Attention-and-Transformers) [![TensorFlow](https://img.shields.io/badge/TensorFlow-%23FF6F00.svg?style=for-the-badge&logo=TensorFlow&logoColor=white)](https://www.tensorflow.org/)

* The goal of this repository is to host basic architecture and model training code for different attention mechanisms and transformer architectures.
* At the moment, I am more interested in learning and recreating these architectures from scratch than in full-fledged training, so for now I will only be training these models on small datasets.

#### Installation

* Using pip to install from [PyPI](https://pypi.org/project/Attention-and-Transformers/)

```bash
pip install Attention-and-Transformers
```

* Using pip to install the latest version from GitHub

```bash
pip install git+https://github.com/veb-101/Attention-and-Transformers.git
```

* Local clone and install

```bash
git clone https://github.com/veb-101/Attention-and-Transformers.git atf
cd atf
python setup.py install
```

**Example Use**

```bash
python load_test.py
```

**Attention Mechanisms**

| No. | Mechanism | Paper |
| --- | --- | --- |
| 1 | Multi-head Self Attention | Attention Is All You Need |
| 2 | Multi-head Self Attention 2D | MobileViT-v1 |
| 3 | Separable Self Attention | MobileViT-v2 |

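To make the first row of the table concrete, here is a minimal from-scratch multi-head self-attention layer in TensorFlow 2 / Keras. It is a generic sketch of the mechanism from "Attention Is All You Need", written independently of this package, so the layer name and defaults are illustrative rather than the package's actual exports.

```python
import tensorflow as tf


class MultiHeadSelfAttention(tf.keras.layers.Layer):
    """Generic sketch of multi-head self-attention; not the package's own layer."""

    def __init__(self, embed_dim=64, num_heads=4, **kwargs):
        super().__init__(**kwargs)
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # Single dense layer producing queries, keys, and values in one shot.
        self.qkv = tf.keras.layers.Dense(3 * embed_dim)
        self.out_proj = tf.keras.layers.Dense(embed_dim)

    def call(self, x):
        batch = tf.shape(x)[0]
        seq_len = tf.shape(x)[1]
        # (batch, seq, 3 * embed_dim) -> 3 x (batch, heads, seq, head_dim)
        qkv = self.qkv(x)
        qkv = tf.reshape(qkv, (batch, seq_len, 3, self.num_heads, self.head_dim))
        qkv = tf.transpose(qkv, (2, 0, 3, 1, 4))
        q, k, v = qkv[0], qkv[1], qkv[2]
        # Scaled dot-product attention per head.
        scale = tf.cast(self.head_dim, x.dtype) ** -0.5
        scores = tf.matmul(q, k, transpose_b=True) * scale
        weights = tf.nn.softmax(scores, axis=-1)
        out = tf.matmul(weights, v)  # (batch, heads, seq, head_dim)
        # Merge heads back together and project.
        out = tf.transpose(out, (0, 2, 1, 3))
        out = tf.reshape(out, (batch, seq_len, self.num_heads * self.head_dim))
        return self.out_proj(out)


# Quick shape check on random tokens: (batch=2, tokens=16, embed_dim=64).
tokens = tf.random.normal((2, 16, 64))
print(MultiHeadSelfAttention(embed_dim=64, num_heads=4)(tokens).shape)  # (2, 16, 64)
```
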
**Transformer Models**

| No. | Model | Paper |
| --- | --- | --- |
| 1 | Vision Transformer | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale |
| 2 | MobileViT-v1 | MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer |
| 3 | MobileViT-v2 | Separable Self-attention for Mobile Vision Transformers |
| 4 | MobileViT-v3 | MobileViTv3: Mobile-Friendly Vision Transformer with Simple and Effective Fusion of Local, Global and Input Features |
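
For orientation on how these pieces fit together, the sketch below builds a tiny Vision-Transformer-style classifier in plain tf.keras: a convolutional patch embedding, a stack of pre-norm attention/MLP blocks, and a pooled classification head. The patch size, depth, and widths are an illustrative toy configuration, not the model definitions shipped in this repository.

```python
import tensorflow as tf


class AddPositionEmbedding(tf.keras.layers.Layer):
    """Adds a learnable positional embedding to a (batch, tokens, dim) sequence."""

    def build(self, input_shape):
        self.pos_embed = self.add_weight(
            name="pos_embed",
            shape=(1, input_shape[1], input_shape[2]),
            initializer="random_normal",
            trainable=True,
        )

    def call(self, x):
        return x + self.pos_embed


def build_tiny_vit(image_size=32, patch_size=4, num_classes=10,
                   embed_dim=64, depth=4, num_heads=4, mlp_dim=128):
    """Toy ViT-style classifier: patch embedding -> encoder blocks -> head."""
    num_patches = (image_size // patch_size) ** 2

    inputs = tf.keras.Input(shape=(image_size, image_size, 3))
    # Patch embedding as a strided convolution, flattened into a token sequence.
    x = tf.keras.layers.Conv2D(embed_dim, patch_size, strides=patch_size)(inputs)
    x = tf.keras.layers.Reshape((num_patches, embed_dim))(x)
    x = AddPositionEmbedding()(x)

    for _ in range(depth):
        # Pre-norm multi-head self-attention block with a residual connection.
        h = tf.keras.layers.LayerNormalization()(x)
        h = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=embed_dim // num_heads)(h, h)
        x = x + h
        # Pre-norm MLP block with a residual connection.
        h = tf.keras.layers.LayerNormalization()(x)
        h = tf.keras.layers.Dense(mlp_dim, activation="gelu")(h)
        h = tf.keras.layers.Dense(embed_dim)(h)
        x = x + h

    # Average over tokens instead of using a [CLS] token, to keep the sketch short.
    x = tf.keras.layers.LayerNormalization()(x)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs, name="tiny_vit_sketch")


if __name__ == "__main__":
    build_tiny_vit().summary()
```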