Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lucidrains/transganformer
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GansFormer and TransGAN papers
- Host: GitHub
- URL: https://github.com/lucidrains/transganformer
- Owner: lucidrains
- License: MIT
- Created: 2021-03-11T03:31:10.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2021-04-27T15:14:45.000Z (over 3 years ago)
- Last Synced: 2024-08-08T18:56:34.395Z (about 2 months ago)
- Topics: artificial-intelligence, attention-mechanism, deep-learning, generative-adversarial-networks, transformers
- Language: Python
- Homepage:
- Size: 292 KB
- Stars: 154
- Watchers: 16
- Forks: 15
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## TransGanFormer (wip)
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GansFormer and TransGAN papers. It will also contain a number of tricks I have picked up while building transformers and GANs over the last year or so, including efficient linear attention and pixel-level attention.
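To illustrate the efficient linear attention mentioned above, here is a minimal NumPy sketch of the general kernel-feature-map formulation (softmax replaced by a positive feature map, giving O(n·d²) cost instead of O(n²·d)). This is an assumption about the technique in general, not the repository's actual implementation; the function name `linear_attention` and the `elu(x) + 1` feature map are illustrative choices.

```python
import numpy as np

def linear_attention(q, k, v, eps=1e-6):
    """Kernel linear attention: out_i = phi(q_i) @ (phi(K)^T V) / (phi(q_i) @ sum_j phi(k_j))."""
    # positive feature map, elu(x) + 1, applied elementwise
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    context = k.T @ v                            # (d, d_v): computed once, reused for every query
    norm = q @ k.sum(axis=0, keepdims=True).T    # (n, 1): normalizer per query
    return (q @ context) / (norm + eps)

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 4)) for _ in range(3))
out = linear_attention(q, k, v)  # shape (8, 4)
```

Because the (d, d_v) context matrix is shared across queries, the attention matrix of shape (n, n) is never materialized, which is what makes this linear in sequence length.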
## Install
```bash
$ pip install transganformer
```

## Usage
```bash
$ transganformer --data ./path/to/data
```

## Citations
```bibtex
@misc{jiang2021transgan,
title = {TransGAN: Two Transformers Can Make One Strong GAN},
author = {Yifan Jiang and Shiyu Chang and Zhangyang Wang},
year = {2021},
eprint = {2102.07074},
archivePrefix = {arXiv},
primaryClass = {cs.CV}
}
```

```bibtex
@misc{hudson2021generative,
title = {Generative Adversarial Transformers},
author = {Drew A. Hudson and C. Lawrence Zitnick},
year = {2021},
eprint = {2103.01209},
archivePrefix = {arXiv},
primaryClass = {cs.CV}
}
```