Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Knowledge Distillation from BERT
https://github.com/kevinmtian/distill-bert
Last synced: 6 days ago
- Host: GitHub
- URL: https://github.com/kevinmtian/distill-bert
- Owner: kevinmtian
- License: apache-2.0
- Created: 2019-02-20T19:32:49.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2019-01-07T20:05:11.000Z (almost 6 years ago)
- Last Synced: 2024-08-02T08:09:52.727Z (3 months ago)
- Language: Python
- Size: 29.3 KB
- Stars: 51
- Watchers: 4
- Forks: 31
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-bert - kevinmtian/distill-bert
README
# Distilled BERT
This work aims at knowledge distillation from the [Google BERT model](https://github.com/google-research/bert) into compact convolutional models. (Not done yet)
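For context, here is a minimal sketch of the usual soft-target distillation objective (Hinton et al., "Distilling the Knowledge in a Neural Network"), assuming PyTorch; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values taken from this repo:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label cross-entropy."""
    # Soften both distributions with temperature T; the T*T factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```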
## Requirements
Python > 3.6, fire, tqdm, tensorboardx, and
tensorflow (for loading the checkpoint file)

## Example Usage
### Fine-tuning (MRPC) Classifier with Pre-trained Transformer
Download [BERT-Base, Uncased](https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip) and the
[GLUE Benchmark Datasets](https://github.com/nyu-mll/GLUE-baselines)
before fine-tuning.
* Make sure that `total_steps` in train.json is greater than n_epochs * (num_data / batch_size); see the sketch after the commands below.

Modify the relevant config JSON files before running the following commands for training and evaluation.
```
python finetune.py config/finetune/mrpc/train.json
python finetune.py config/finetune/mrpc/eval.json
```
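As a quick sanity check on the `total_steps` rule above, here is a back-of-the-envelope computation; the batch size and epoch count are illustrative, not values taken from this repo's config files:

```python
import math

# Illustrative numbers, not taken from config/finetune/mrpc/train.json:
num_data = 3668      # MRPC training set size (sentence pairs)
batch_size = 32
n_epochs = 3

steps_per_epoch = math.ceil(num_data / batch_size)  # 115
min_total_steps = n_epochs * steps_per_epoch        # 345

# "total_steps" in train.json should be set above this bound.
print(min_total_steps)
```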
### Training Blend CNN from scratch
See [Transformer to CNN](https://openreview.net/forum?id=HJxM3hftiX).
Modify the relevant config JSON files before running the following commands for training and evaluation.
```
python classify.py config/blendcnn/mrpc/train.json
python classify.py config/blendcnn/mrpc/eval.json
```
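For orientation, a minimal sketch of a multi-width 1-D convolutional text classifier in the spirit of Blend CNN, assuming PyTorch; the layer sizes, kernel widths, and class name are illustrative and not the repo's actual model definition:

```python
import torch
import torch.nn as nn

class TinyTextCNN(nn.Module):
    """Illustrative 1-D CNN sentence classifier; not the repo's BlendCNN."""

    def __init__(self, vocab_size=30522, embed_dim=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Parallel convolutions of different widths over token embeddings.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, 64, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)
        )
        self.classifier = nn.Linear(64 * 3, n_classes)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Max-pool each conv's feature map over time, then concatenate.
        feats = [conv(x).relu().max(dim=-1).values for conv in self.convs]
        return self.classifier(torch.cat(feats, dim=-1))
```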
### Knowledge Distillation from finetuned Transformer to CNN
Modify the relevant config JSON files before running the following commands for training and evaluation.
```
python distill.py config/distill/mrpc/train.json
python distill.py config/distill/mrpc/eval.json
```
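Putting the pieces together, a hedged sketch of one distillation epoch (teacher frozen, student updated), reusing the `distillation_loss` helper sketched above; `teacher`, `student`, and `loader` are hypothetical stand-ins for whatever `distill.py` builds from its config:

```python
import torch

def distill_one_epoch(teacher, student, loader, optimizer, device="cpu"):
    """One epoch of student training against a frozen teacher (illustrative)."""
    teacher.eval()    # the teacher only provides targets; it is never updated
    student.train()
    for token_ids, labels in loader:
        token_ids, labels = token_ids.to(device), labels.to(device)
        with torch.no_grad():            # no gradients flow through the teacher
            teacher_logits = teacher(token_ids)
        student_logits = student(token_ids)
        loss = distillation_loss(student_logits, teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```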