Recurrent Neural Aligner
https://github.com/1ytic/warp-rna
- Host: GitHub
- URL: https://github.com/1ytic/warp-rna
- Owner: 1ytic
- License: MIT
- Created: 2019-08-11T08:12:41.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-04-14T10:48:16.000Z (almost 5 years ago)
- Last Synced: 2024-11-14T22:44:22.665Z (3 months ago)
- Topics: cuda, forward-backward, rna, rnn-transducer
- Language: Python
- Size: 71.3 KB
- Stars: 49
- Watchers: 7
- Forks: 7
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Recurrent Neural Aligner
Recurrent Neural Aligner (RNA) is a restricted version of the RNN-Transducer (RNN-T) loss. It assumes that the length of the input sequence is equal to or greater than the length of the target sequence ([Sak, et al., 2017](https://www.isca-speech.org/archive/Interspeech_2017/pdfs/1705.PDF); [Dong, et al., 2018](https://arxiv.org/abs/1806.06342)).
In this repository you can find a reference implementation of the RNA loss in Python, as well as a fast GPU version in CUDA. In order to apply the same efficient procedure as in [warp_rnnt](https://github.com/1ytic/warp-rnnt), the alphas/betas arrays are represented as shown below. Because the RNA loss assumes that the model produces exactly one output at each input step, the T dimension can be reduced to S=T-U+2.
![](aligner.gif)
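To make the recurrence concrete, here is a minimal NumPy sketch of the forward (alpha) pass of the RNA loss for a single utterance, written over the plain (T+1) x (U+1) grid rather than the compact S=T-U+2 layout pictured above. It is not the repository's implementation; the (T, U+1, V) log-probability layout, the blank index, and the function name are illustrative assumptions.
```python
import numpy as np

def rna_forward_reference(log_probs, labels, blank=0):
    """Sketch of the RNA forward pass for one utterance (not the repo's code).

    log_probs: (T, U + 1, V) log-softmax outputs, with T input frames,
        U target labels and V vocabulary entries including the blank.
    labels: length-U sequence of integer label indices.
    Returns the negative log-likelihood of the label sequence.
    """
    T, U_plus_1, _ = log_probs.shape
    U = len(labels)
    assert U_plus_1 == U + 1 and T >= U, "RNA assumes T >= U"

    # alpha[t, u]: log-probability of consuming t frames and emitting u labels.
    alpha = np.full((T + 1, U + 1), -np.inf)
    alpha[0, 0] = 0.0
    for t in range(1, T + 1):
        for u in range(0, U + 1):
            # Stay at position u by emitting a blank at frame t - 1.
            score = alpha[t - 1, u] + log_probs[t - 1, u, blank]
            if u > 0:
                # Advance to position u by emitting labels[u - 1] at frame t - 1.
                emit = alpha[t - 1, u - 1] + log_probs[t - 1, u - 1, labels[u - 1]]
                score = np.logaddexp(score, emit)
            alpha[t, u] = score
    return -alpha[T, U]
```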
## Convergence
The figure below shows a sanity check of the implementation on a speech recognition task with a small dataset. The decoding procedure was the same for both models. As you can see, the RNN-T loss is more stable in this case.
![](check.png)
If you have a successful example of using the RNA loss, or if you find errors in this implementation, please open an issue in this repository.
## Install
Currently, there is only a binding for PyTorch 1.0 and higher.

```bash
pip install warp_rna
```
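After installation, the loss can be called from PyTorch. The sketch below is a hypothetical usage example: the `rna_loss` entry point, its argument names, and the `(N, T, U + 1, V)` input layout mirror the sibling warp_rnnt package and are assumptions here, so check the package itself for the actual interface.

```python
# Hypothetical usage sketch; function name, argument names and tensor layout
# are assumptions modelled on warp_rnnt, not a documented API.
import torch
from warp_rna import rna_loss  # assumed entry point

N, T, U, V = 2, 10, 4, 5  # batch, frames, target length, vocab size incl. blank

log_probs = torch.randn(N, T, U + 1, V, device="cuda").log_softmax(dim=-1).requires_grad_()
labels = torch.randint(1, V, (N, U), dtype=torch.int32, device="cuda")
frames_lengths = torch.full((N,), T, dtype=torch.int32, device="cuda")
labels_lengths = torch.full((N,), U, dtype=torch.int32, device="cuda")

loss = rna_loss(log_probs, labels, frames_lengths, labels_lengths)
loss.mean().backward()  # .mean() in case the binding returns per-sample losses
```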
## Test
There is a unit test in pytorch_binding/warp_rna which covers both argument validation and outputs.

```bash
cd ..
python -m warp_rna.test
```