# Awesome Contrastive Learning for NLP [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome)
![contributing-image](https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat)

A collection of papers about contrastive learning for natural language processing.
**Please feel free to create a pull request if you would like to add other awesome papers.**
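Many of the papers below build on an InfoNCE-style objective (popularized by "Representation Learning with Contrastive Predictive Coding"): pull an anchor toward its positive pair while pushing it away from in-batch negatives. As a quick orientation, here is a minimal NumPy sketch of that loss; the function name, shapes, and temperature value are illustrative, not taken from any specific paper in the list.

```python
import numpy as np

def info_nce_loss(queries, keys, temperature=0.07):
    """InfoNCE over a batch: the positive key for row i is keys[i];
    all other keys in the batch serve as negatives."""
    # L2-normalize so dot products are cosine similarities
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / temperature                 # (N, N); positives on the diagonal
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with targets on the diagonal
    return -np.mean(np.diag(log_probs))

# Toy usage: embeddings of two slightly different "views" of 4 sentences
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 16))
k = q + 0.01 * rng.standard_normal((4, 16))  # near-identical positives
loss = info_nce_loss(q, k)
```

With near-identical positive pairs the loss is close to zero; the papers below differ mainly in how the positive and negative pairs are constructed (augmentation, dropout, adjacent sentences, labels, and so on).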
## 2012
- [A fast and simple algorithm for training neural probabilistic language models](https://arxiv.org/abs/1206.6426)

## 2013
- [Efficient Estimation of Word Representations in Vector Space](https://arxiv.org/abs/1301.3781)
- [Distributed Representations of Words and Phrases and their Compositionality](https://arxiv.org/abs/1310.4546)

## 2018
- [GILE: A Generalized Input-Label Embedding for Text Classification](https://arxiv.org/abs/1806.06219)
- [Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency](https://arxiv.org/abs/1809.01812)
- [Representation Learning with Contrastive Predictive Coding](https://arxiv.org/abs/1807.03748)
- [An efficient framework for learning sentence representations](https://arxiv.org/abs/1803.02893)
- [Learning deep representations by mutual information estimation and maximization](https://arxiv.org/abs/1808.06670)

## 2019
- [A Theoretical Analysis of Contrastive Unsupervised Representation Learning](https://arxiv.org/abs/1902.09229)
- [Transferable Contrastive Network for Generalized Zero-Shot Learning](https://arxiv.org/abs/1908.05832)
- [Contrastive Attention Mechanism for Abstractive Sentence Summarization](https://arxiv.org/abs/1910.13114)
- [The MuCoW Test Suite at WMT 2019: Automatically Harvested Multilingual Contrastive Word Sense Disambiguation Test Sets for Machine Translation](http://doi.org/10.18653/v1/W19-5354)

## 2020
- [Pre-Training Transformers as Energy-Based Cloze Models](https://arxiv.org/abs/2012.08561)
- [A Survey on Contrastive Self-supervised Learning](https://arxiv.org/abs/2011.00362)
- [Contrastive Distillation on Intermediate Representations for Language Model Compression](https://arxiv.org/abs/2009.14167)
- [CERT: Contrastive Self-supervised Learning for Language Understanding](https://arxiv.org/abs/2005.12766)
- [Supervised Contrastive Learning](https://arxiv.org/abs/2004.11362)
- [Big Self-Supervised Models are Strong Semi-Supervised Learners](https://arxiv.org/abs/2006.10029)
- [Residual Energy-Based Models for Text Generation](https://arxiv.org/abs/2004.11714)
- [CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding](https://arxiv.org/abs/2010.08670)
- [Contrastive Self-Supervised Learning for Commonsense Reasoning](https://arxiv.org/abs/2005.00669)
- [On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines](https://arxiv.org/abs/2006.04884)
- [Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models](https://arxiv.org/abs/2005.10389)
- [MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification](https://arxiv.org/abs/2004.12239)
- [Learning with Contrastive Examples for Data-to-Text Generation](http://doi.org/10.18653/V1/2020.COLING-MAIN.213)
- [Long-Tail Zero and Few-Shot Learning via Contrastive Pretraining on and for Small Data](https://arxiv.org/abs/2010.01061)

## 2021
- [COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining](https://arxiv.org/abs/2102.08473)
- [Learning Transferable Visual Models From Natural Language Supervision](https://arxiv.org/abs/2103.00020)
- [Contrasting distinct structured views to learn sentence embeddings](http://doi.org/10.18653/v1/2021.eacl-srw.11)
- [A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives](https://arxiv.org/abs/2102.12982)

-----
## Contributing
Have a paper in mind that you think is awesome and would fit this list? Feel free to send a [pull request](https://github.com/malteos/awesome-contrastive-learning-for-nlp/pulls).
## License
MIT