# KD-NLP
This repository is a collection of Knowledge Distillation (KD) methods implemented by the Huawei Montreal NLP team.

## Included Projects

* [**MATE-KD**](MATE-KD)
  * Knowledge distillation for model compression, studying how adversarial training can improve student accuracy while using only the teacher's logits, as in standard KD (a minimal sketch of this logits-only objective follows the list).
  * [MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation](https://arxiv.org/abs/2105.05912v1)
* [**Combined-KD**](Combined-KD)
  * Proposes Combined-KD (ComKD), which takes advantage of data augmentation and progressive training.
  * [How to Select One Among All? An Extensive Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding](https://arxiv.org/abs/2109.05696v1)
* [**Minimax-kNN**](Minimax-kNN)
  * A sample-efficient, semi-supervised kNN data augmentation technique (a sketch of the minimax selection step also follows the list).
  * [Not Far Away, Not So Close: Sample Efficient Nearest Neighbour Data Augmentation via MiniMax](https://aclanthology.org/2021.findings-acl.309/)
* [**Glitter**](Glitter)
  * A universal, sample-efficient framework for incorporating augmented data into training.
  * [When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation](https://aclanthology.org/2022.findings-acl.84/)
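
The "standard KD" objective referenced above, distilling from the teacher's logits alone, is the classic soft-label loss of Hinton et al. (2015). As a rough illustration, not code from this repository, a minimal PyTorch sketch might look like the following; the `temperature` and `alpha` values are illustrative defaults, not settings from the papers:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Standard logits-based KD loss: a weighted sum of cross-entropy on
    gold labels and KL divergence to the temperature-softened teacher
    distribution. Hyperparameter values here are illustrative only."""
    # Hard-label term: cross-entropy against the gold labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL(teacher || student) at temperature T,
    # scaled by T^2 as in Hinton et al. (2015).
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kl
```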
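Minimax-kNN and Glitter both build on the idea of keeping only the most informative augmented samples rather than training on all of them. The hypothetical sketch below (again not code from this repository; `select_hardest_augmentations` and its `top_k` parameter are invented for illustration) shows the inner maximization step: among candidate augmentations of an example, keep those with the largest teacher-student divergence, which the student is subsequently trained to minimize:

```python
import torch
import torch.nn.functional as F

def select_hardest_augmentations(student_logits_aug, teacher_logits_aug, top_k=2):
    """Minimax-style selection (hypothetical sketch): among candidate
    augmentations of one example, keep the top_k whose teacher-student
    KL divergence is largest (the inner "max"); the student then
    minimizes the KD loss on these hardest candidates (the outer "min").
    Both tensors have shape (num_candidates, num_classes)."""
    # Per-candidate KL divergence between teacher and student predictions.
    kl_per_candidate = F.kl_div(
        F.log_softmax(student_logits_aug, dim=-1),
        F.softmax(teacher_logits_aug, dim=-1),
        reduction="none",
    ).sum(dim=-1)
    # Indices of the top_k hardest candidates.
    return torch.topk(kl_per_candidate, k=top_k).indices
```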

# License
This project is licensed under the Apache License 2.0.