https://github.com/bzantium/bert-aad
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
- Host: GitHub
- URL: https://github.com/bzantium/bert-aad
- Owner: bzantium
- Created: 2018-12-11T11:52:10.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2020-10-27T03:17:44.000Z (over 4 years ago)
- Last Synced: 2025-04-30T21:09:57.277Z (20 days ago)
- Topics: adda, bert, domain-adaptation, knowledge-distillation, pytorch
- Language: Python
- Homepage:
- Size: 14.1 MB
- Stars: 34
- Watchers: 1
- Forks: 5
- Open Issues: 1
Metadata Files:
- Readme: README.md
README
## Knowledge Distillation for BERT Unsupervised Domain Adaptation
Official PyTorch implementation | [Paper](https://arxiv.org/abs/2010.11478)
### Abstract
A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Since the model is trained on a large corpus of diverse topics, it shows robust performance for domain shift problems in which data distributions at training (source data) and testing (target data) differ while sharing similarities. Despite its great improvements compared to previous models, it still suffers from performance degradation due to domain shifts. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach in the task of cross-domain sentiment classification on 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
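To make the objective above concrete, the snippet below sketches one AAD-style training step as read from the abstract: a domain discriminator learns to separate source and target features, the target encoder is updated to fool it, and a temperature-scaled knowledge-distillation term keeps the target encoder close to the frozen source model's soft predictions. This is a minimal illustration with stand-in linear layers and hypothetical names, not the code in `main.py`; the temperature value and the choice to compute the distillation term on source inputs are assumptions here.

```
# Minimal AAD-style step (illustration only; stand-in modules, not the repository's main.py)
import torch
import torch.nn as nn
import torch.nn.functional as F

dim, temperature = 768, 20.0                     # temperature is an assumed value
src_encoder = nn.Linear(32, dim)                 # stand-in for the frozen source BERT encoder
tgt_encoder = nn.Linear(32, dim)                 # target BERT encoder being adapted
classifier = nn.Linear(dim, 2)                   # sentiment head trained on the source domain
discriminator = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 2))

opt_tgt = torch.optim.Adam(tgt_encoder.parameters(), lr=1e-5)
opt_dis = torch.optim.Adam(discriminator.parameters(), lr=1e-5)

src_x, tgt_x = torch.randn(8, 32), torch.randn(8, 32)   # placeholder batches

# (1) Discriminator step: label source features 1 and target features 0.
feats = torch.cat([src_encoder(src_x), tgt_encoder(tgt_x)]).detach()
domains = torch.cat([torch.ones(8, dtype=torch.long), torch.zeros(8, dtype=torch.long)])
opt_dis.zero_grad()
F.cross_entropy(discriminator(feats), domains).backward()
opt_dis.step()

# (2) Target-encoder step: fool the discriminator (inverted labels) while matching
#     the source model's temperature-softened predictions (knowledge distillation).
opt_tgt.zero_grad()
loss_adv = F.cross_entropy(discriminator(tgt_encoder(tgt_x)),
                           torch.ones(8, dtype=torch.long))
with torch.no_grad():
    teacher = classifier(src_encoder(src_x))     # soft labels from the frozen source model
student = classifier(tgt_encoder(src_x))         # same inputs through the adapting encoder
loss_kd = F.kl_div(F.log_softmax(student / temperature, dim=-1),
                   F.softmax(teacher / temperature, dim=-1),
                   reduction="batchmean") * temperature ** 2
(loss_adv + loss_kd).backward()
opt_tgt.step()
```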
### Requirements
- pandas
- pytorch
- transformers

### Run the test
```
$ python main.py --pretrain --adapt --src books --tgt dvd
```
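The `--pretrain` and `--adapt` flags presumably correspond to the two stages of the method (source-domain fine-tuning, then adversarial adaptation with distillation). To sweep several source/target pairs, e.g. toward the 30 pairs evaluated in the paper, a small wrapper like the one below can call the same entry point repeatedly; it is a sketch, and any domain names beyond `books` and `dvd` would need to match whatever your data setup provides.

```
# Hypothetical convenience wrapper around main.py for several domain pairs.
import itertools
import subprocess

domains = ["books", "dvd"]  # extend with the other domains available in your data directory
for src, tgt in itertools.permutations(domains, 2):
    subprocess.run(["python", "main.py", "--pretrain", "--adapt",
                    "--src", src, "--tgt", tgt], check=True)
```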
### How to cite
```
@article{ryu2020knowledge,
title={Knowledge Distillation for BERT Unsupervised Domain Adaptation},
author={Ryu, Minho and Lee, Kichun},
journal={arXiv preprint arXiv:2010.11478},
year={2020}
}
```