https://github.com/nlpatvcu/multitasking_transformers
Multitask Learning with Pretrained Transformers
- Host: GitHub
- URL: https://github.com/nlpatvcu/multitasking_transformers
- Owner: NLPatVCU
- License: mit
- Created: 2020-03-20T17:23:57.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2021-03-20T18:27:55.000Z (over 4 years ago)
- Last Synced: 2025-03-29T06:11:17.462Z (6 months ago)
- Topics: bert, clinical-nlp, deep-learning, multi-task-learning
- Language: Python
- Homepage:
- Size: 2.68 MB
- Stars: 39
- Watchers: 3
- Forks: 9
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# :arrows_clockwise: Multitasking Transformers :arrows_clockwise:
Training NLP models that can perform multiple tasks with the same set of representations. Pre-trained models are currently available that multitask over eight clinical note tasks.
This codebase can be utilized to replicate results for our paper. See the Replication section
for details.
# Installation
Install with
```
pip install https://s3-us-west-2.amazonaws.com/ai2-s2-scispacy/releases/v0.2.0/en_core_sci_sm-0.2.0.tar.gz
```
```
pip install git+https://github.com/AndriyMulyar/multitasking_transformers
```

# Use
[Examples](/examples) are available for training, evaluation, and text prediction. Running the script [predict_ner.py](/examples/predict_ner.py) will automatically download a pre-trained clinical note multi-tasking model, run the model over a de-identified clinical note snippet, and display the entity tags in your browser with displacy.

# Replication
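A minimal sketch of one piece of that pipeline: displacy's `manual` mode renders entities given character-level `{"start", "end", "label"}` spans, so token-level BIO tags have to be merged into such spans first. The function and the `PROBLEM`/`TREATMENT` labels below are illustrative, not taken from the repository's code.

```python
# Illustrative sketch (not from this repository): merge per-token BIO tags
# into character-level entity spans, the dict format that spaCy's displacy
# "manual" mode accepts for rendering entities in a browser.

def bio_to_spans(tokens, tags):
    """Merge BIO-tagged tokens into {"start", "end", "label"} character spans,
    assuming the text is the tokens joined by single spaces."""
    spans, offset, current = [], 0, None
    for token, tag in zip(tokens, tags):
        start, end = offset, offset + len(token)
        offset = end + 1  # account for the single space between tokens
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = {"start": start, "end": end, "label": tag[2:]}
        elif tag.startswith("I-") and current and current["label"] == tag[2:]:
            current["end"] = end  # extend the open entity
        else:
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return spans

tokens = ["Patient", "denies", "chest", "pain", "on", "aspirin"]
tags = ["O", "O", "B-PROBLEM", "I-PROBLEM", "O", "B-TREATMENT"]
text = " ".join(tokens)
ents = bio_to_spans(tokens, tags)
# ents == [{"start": 15, "end": 25, "label": "PROBLEM"},
#          {"start": 29, "end": 36, "label": "TREATMENT"}]
```

Spans in this shape can then be passed to `spacy.displacy.render({"text": text, "ents": ents}, style="ent", manual=True)` to produce the browser visualization.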
See the directory [/examples/experiment_replication](/examples/experiment_replication).

# Preprint
https://arxiv.org/abs/2004.10220

# Acknowledgement
Implementation, development and training in this project were supported by funding from the McInnes NLP Lab at Virginia Commonwealth University.