Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/tunz/transformer-pytorch
Transformer implementation in PyTorch.
- Host: GitHub
- URL: https://github.com/tunz/transformer-pytorch
- Owner: tunz
- License: MIT
- Created: 2019-01-08T08:01:17.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2019-03-07T14:46:06.000Z (over 5 years ago)
- Last Synced: 2023-11-07T18:55:15.954Z (about 1 year ago)
- Topics: pytorch, transformer
- Language: Python
- Homepage: https://tunz.kr/post/4
- Size: 31.3 KB
- Stars: 381
- Watchers: 5
- Forks: 88
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Transformer
This is a PyTorch implementation of the
[Transformer](https://arxiv.org/abs/1706.03762) model, similar to
[tensorflow/tensor2tensor](https://github.com/tensorflow/tensor2tensor).
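For orientation, the core operation of the Transformer is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. The snippet below is a minimal, self-contained sketch of that operation in PyTorch; the function name and tensor shapes are illustrative and not taken from this repository.

```
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    # query, key, value: (batch, heads, seq_len, d_k)
    d_k = query.size(-1)
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    scores = torch.matmul(query, key.transpose(-2, -1)) / (d_k ** 0.5)
    if mask is not None:
        # Hide padded or future positions before the softmax.
        scores = scores.masked_fill(mask == 0, float('-inf'))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value)
```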
## Prerequisite

I tested it with PyTorch 1.0.0 and Python 3.6.8.
It uses [spaCy](https://spacy.io/usage/) to tokenize text for the wmt32k
dataset. So, if you want to run the `wmt32k` problem, which is a de/en
translation dataset, you should download the spaCy language models first with
the following commands.

```
$ pip install spacy
$ python -m spacy download en
$ python -m spacy download de
```
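Once the models are downloaded, spaCy can tokenize English and German text roughly as follows. This is a minimal sketch using the spaCy 2.x shortcut names matching the commands above; the repository's own data pipeline may differ.

```
import spacy

# Load the shortcut models installed above (spaCy 2.x style names).
nlp_en = spacy.load('en')
nlp_de = spacy.load('de')

# Tokenize one sentence per language; the wmt32k problem is de/en translation.
en_tokens = [tok.text for tok in nlp_en.tokenizer('A quick test sentence.')]
de_tokens = [tok.text for tok in nlp_de.tokenizer('Ein kurzer Testsatz.')]
print(en_tokens, de_tokens)
```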
## Usage

1. Train a model.
```
$ python train.py --problem wmt32k --output_dir ./output --data_dir ./wmt32k_data
or
$ python train.py --problem lm1b --output_dir ./output --data_dir ./lm1b_data
```

If you want to try `fast_transformer`, pass the `--model` argument after installing
[tcop-pytorch](https://github.com/tunz/tcop-pytorch).
```
$ python train.py --problem lm1b --output_dir ./output --data_dir ./lm1b_data --model fast_transformer
```

2. You can translate a single sentence with the trained model.
```
$ python decoder.py --translate --data_dir ./wmt32k_data --model_dir ./output/last/models
```
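For intuition, translation with an encoder-decoder Transformer is typically autoregressive: the source sentence is encoded once, and target tokens are generated one at a time. Below is a minimal greedy-decoding sketch; the `encode`/`decode` methods, argument names, and special token ids are assumptions for illustration only, not the actual interface of this repository's `decoder.py`.

```
import torch

def greedy_decode(model, src, src_mask, max_len, bos_id, eos_id):
    # Encode the source once, then generate target tokens one by one (greedy).
    memory = model.encode(src, src_mask)                 # assumed encoder call
    ys = torch.full((src.size(0), 1), bos_id, dtype=torch.long, device=src.device)
    for _ in range(max_len - 1):
        logits = model.decode(memory, src_mask, ys)      # assumed shape: (batch, len, vocab)
        next_token = logits[:, -1].argmax(dim=-1, keepdim=True)
        ys = torch.cat([ys, next_token], dim=1)
        if (next_token == eos_id).all():                 # stop once every sentence emits EOS
            break
    return ys
```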