Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction
- Host: GitHub
- URL: https://github.com/elbayadm/attn2d
- Owner: elbayadm
- License: mit
- Created: 2018-08-10T09:47:53.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2021-05-08T10:55:34.000Z (about 3 years ago)
- Last Synced: 2024-01-26T20:35:07.206Z (4 months ago)
- Topics: fairseq, neural-machine-translation, nlp, nmt, pytorch
- Language: Python
- Homepage:
- Size: 6.21 MB
- Stars: 498
- Watchers: 16
- Forks: 75
- Open Issues: 3
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Lists
- Awesome-pytorch-list - attn2d - Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction (Paper implementations / Other libraries)
- Awesome-pytorch-list-CNVersion - attn2d - Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction (Paper implementations / Other libraries)
README
This is a fork of Fairseq(-py) with implementations of the following models:
## Pervasive Attention - 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
An NMT model with two-dimensional convolutions that jointly encode the source and target sequences.
Pervasive Attention also provides an extensive decoding grid that we leverage to efficiently train wait-k models.
See [README](examples/pervasive/README.md).
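
To make the joint encoding concrete, here is a minimal, hypothetical PyTorch sketch (not the repo's implementation; module names and sizes are assumptions): each cell (i, j) of a target-by-source grid concatenates the target embedding at step i with the source embedding at position j, and a 2D convolution processes the whole grid. The actual model stacks DenseNet-style convolutions masked along the target axis to keep decoding autoregressive; this sketch omits that detail.

```python
import torch
import torch.nn as nn

class PervasiveAttentionSketch(nn.Module):
    """Illustrative sketch of the 2D joint-encoding idea (not the repo's code)."""

    def __init__(self, src_vocab, tgt_vocab, dim=128, kernel_size=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        # A single conv layer stands in for the paper's DenseNet stack;
        # the real model masks the kernel along the target axis for causality.
        self.conv = nn.Conv2d(2 * dim, dim, kernel_size, padding=kernel_size // 2)
        self.proj = nn.Linear(dim, tgt_vocab)

    def forward(self, src_tokens, tgt_tokens):
        s = self.src_emb(src_tokens)  # (B, Ts, D)
        t = self.tgt_emb(tgt_tokens)  # (B, Tt, D)
        # Build the (Tt x Ts) grid of concatenated (target, source) embeddings.
        grid = torch.cat(
            [t.unsqueeze(2).expand(-1, -1, s.size(1), -1),
             s.unsqueeze(1).expand(-1, t.size(1), -1, -1)],
            dim=-1)                                    # (B, Tt, Ts, 2D)
        h = self.conv(grid.permute(0, 3, 1, 2))        # (B, D, Tt, Ts)
        # Pool over the source axis to get one feature vector per target step.
        pooled = h.max(dim=3).values.permute(0, 2, 1)  # (B, Tt, D)
        return self.proj(pooled)                       # (B, Tt, tgt_vocab)
```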
## Efficient Wait-k Models for Simultaneous Machine Translation
Transformer Wait-k models (Ma et al., 2019) with unidirectional encoders and with joint training of multiple wait-k paths.
See [README](examples/waitk/README.md).
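
For intuition, below is a small sketch of the wait-k read/write policy using a hypothetical helper (not this repo's API): at target step i the decoder may attend only to the first i + k source tokens, which can be expressed as an encoder-attention mask.

```python
import torch

def waitk_encoder_mask(tgt_len: int, src_len: int, k: int) -> torch.Tensor:
    """Hypothetical helper (not from this repo): boolean (tgt_len, src_len) mask
    where True marks source positions visible to the decoder under wait-k.
    Target step i (0-based) may read the first min(i + k, src_len) source tokens."""
    visible = (torch.arange(tgt_len).unsqueeze(1) + k).clamp(max=src_len)  # (Tt, 1)
    positions = torch.arange(src_len).unsqueeze(0)                         # (1, Ts)
    return positions < visible

# Example: wait-3 decoding over a 6-token source for 4 target steps.
print(waitk_encoder_mask(4, 6, 3).int())
# Joint training over multiple wait-k paths amounts to sampling a different k
# per batch and applying the corresponding mask.
```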
# Fairseq Requirements and Installation
* [PyTorch](http://pytorch.org/) version >= 1.4.0
* Python version >= 3.6
* For training new models, you'll also need an NVIDIA GPU and [NCCL](https://github.com/NVIDIA/nccl)

**Installing Fairseq**
```bash
git clone https://github.com/elbayadm/attn2d
cd attn2d
pip install --editable .
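# Optional sanity check (an assumption, not part of the original instructions):
# confirm the editable install is importable.
python -c "import fairseq; print(fairseq.__version__)"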
```

# License
fairseq(-py) is MIT-licensed.
The license applies to the pre-trained models as well.

# Citation
For Pervasive Attention, please cite:
```bibtex
@InProceedings{elbayad18conll,
author ="Elbayad, Maha and Besacier, Laurent and Verbeek, Jakob",
title = "Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction",
booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning",
year = "2018",
}
```

For our wait-k models, please cite:
```bibtex
@article{elbayad20waitk,
title={Efficient Wait-k Models for Simultaneous Machine Translation},
author={Elbayad, Maha and Besacier, Laurent and Verbeek, Jakob},
journal={arXiv preprint arXiv:2005.08595},
year={2020}
}
```

For Fairseq, please cite:
```bibtex
@inproceedings{ott2019fairseq,
title = {fairseq: A Fast, Extensible Toolkit for Sequence Modeling},
author = {Myle Ott and Sergey Edunov and Alexei Baevski and Angela Fan and Sam Gross and Nathan Ng and David Grangier and Michael Auli},
booktitle = {Proceedings of NAACL-HLT 2019: Demonstrations},
year = {2019},
}
```