https://github.com/kyzhouhzau/pytorch-bilstm-attention-crf
- Host: GitHub
- URL: https://github.com/kyzhouhzau/pytorch-bilstm-attention-crf
- Owner: kyzhouhzau
- Created: 2019-04-08T01:02:12.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-04-08T06:45:55.000Z (over 6 years ago)
- Last Synced: 2025-01-22T12:45:21.360Z (9 months ago)
- Size: 55.7 KB
- Stars: 4
- Watchers: 1
- Forks: 2
- Open Issues: 1
Metadata Files:
- Readme: README.md
# Pytorch-BiLSTM-Attention-CRF
**Since some of the tricks will be used in an upcoming article, the code will be released later.**
Use PyTorch to implement BiLSTM-CRF and integrate a self-attention mechanism!
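For orientation, here is a minimal sketch of such an architecture. It is illustrative only, not this repository's code: it assumes the third-party `pytorch-crf` package for the CRF layer and PyTorch's built-in `nn.MultiheadAttention` for the self-attention part.
```
# Minimal BiLSTM + self-attention + CRF sketch (illustrative, not this repo's code).
import torch
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf (an assumption, not the repo's CRF)

class BiLSTMAttnCRF(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)
        # single-head self-attention over the BiLSTM outputs
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=1, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, tokens, tags=None, mask=None):
        x = self.embed(tokens)          # (B, T, E)
        h, _ = self.lstm(x)             # (B, T, H)
        a, _ = self.attn(h, h, h)       # self-attention with Q = K = V = h
        emissions = self.fc(a)          # (B, T, num_tags)
        if tags is not None:            # training: negative log-likelihood loss
            return -self.crf(emissions, tags, mask=mask)
        return self.crf.decode(emissions, mask=mask)  # inference: best tag paths
```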
----------------------------------2019-04-07--------------------------------------
Models have been uploaded, so you can test on the dev set directly!
---------------------------------upload models------------------------------------
**Notice**: This code runs only on the GPU, mainly because testing showed that running on the CPU takes considerably more time. If you want to move to the CPU, all you need to do is remove every ```.cuda()``` call in the code!
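As a gentler alternative to deleting the `.cuda()` calls, the standard PyTorch pattern is to pick a device once and move the model and tensors with `.to(device)`. This is a generic sketch, not this repository's code:
```
# Generic device-agnostic pattern (a sketch, not this repository's code).
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Linear(10, 2).to(device)      # instead of model.cuda()
inputs = torch.randn(4, 10).to(device)   # instead of inputs.cuda()
outputs = model(inputs)                  # runs on GPU if available, else CPU
```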
## Usage
To train:
```
python train.py
```
To test:
```
python predict.py
```
**Notice: you can use ```-h``` for details of parameter usage.**
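As a hypothetical illustration of what `-h` might print, assuming the scripts parse their flags with `argparse` (the actual flag names in this repository may differ):
```
# Hypothetical argparse sketch; the real options in train.py may differ.
import argparse

parser = argparse.ArgumentParser(description="Train BiLSTM-Attention-CRF")
parser.add_argument("--epochs", type=int, default=50, help="number of training epochs")
parser.add_argument("--batch_size", type=int, default=32, help="mini-batch size")
args = parser.parse_args()  # running `python train.py -h` prints these options
```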
## About attention
Code for visualizing the self-attention weights is provided!
Example:

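Below is a minimal sketch of this kind of visualization, assuming the attention weights for one sentence are available as a (T, T) matrix; the repository's actual plotting code may differ.
```
# Minimal self-attention heatmap sketch (illustrative; stand-in data, not the repo's code).
import matplotlib.pyplot as plt
import torch

tokens = ["EU", "rejects", "German", "call"]        # hypothetical example sentence
weights = torch.softmax(torch.randn(4, 4), dim=-1)  # stand-in (T, T) attention matrix

fig, ax = plt.subplots()
im = ax.imshow(weights.numpy(), cmap="viridis")
ax.set_xticks(range(len(tokens)))
ax.set_xticklabels(tokens, rotation=45)
ax.set_yticks(range(len(tokens)))
ax.set_yticklabels(tokens)
fig.colorbar(im, ax=ax)
plt.tight_layout()
plt.savefig("attention.png")
```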
## Results
Current results on the dev set! These are the mean per-batch results on the dev set over 50 epochs.
```
=========================================
F_score=0.927 Recal_score=0.934
=========================================
```
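For clarity, averaging the per-batch scores could look like the sketch below; this is a hypothetical illustration of the described evaluation, not the repository's metric code.
```
# Hypothetical sketch of averaging per-batch dev-set scores (not the repo's code).
batch_f_scores = [0.92, 0.93, 0.929]   # stand-in per-batch F scores
batch_recalls  = [0.93, 0.94, 0.932]   # stand-in per-batch recall scores

mean_f = sum(batch_f_scores) / len(batch_f_scores)
mean_recall = sum(batch_recalls) / len(batch_recalls)
# printed in the same format as the repo's output line above
print(f"F_score={mean_f:.3f} Recal_score={mean_recall:.3f}")
```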