# awesome-contrastive-learning-for-nlp

A collection of papers about contrastive learning for natural language processing.

https://github.com/malteos/awesome-contrastive-learning-for-nlp
- A fast and simple algorithm for training neural probabilistic language models
- Efficient Estimation of Word Representations in Vector Space
- Distributed Representations of Words and Phrases and their Compositionality
- GILE: A Generalized Input-Label Embedding for Text Classification
- Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency
- Representation Learning with Contrastive Predictive Coding
- An efficient framework for learning sentence representations
- Learning deep representations by mutual information estimation and maximization
- A Theoretical Analysis of Contrastive Unsupervised Representation Learning
- Transferable Contrastive Network for Generalized Zero-Shot Learning
- Contrastive Attention Mechanism for Abstractive Sentence Summarization
- The MuCoW Test Suite at WMT 2019: Automatically Harvested Multilingual Contrastive Word Sense Disambiguation Test Sets for Machine Translation
- Pre-Training Transformers as Energy-Based Cloze Models
- A Survey on Contrastive Self-supervised Learning
- Contrastive Distillation on Intermediate Representations for Language Model Compression
- CERT: Contrastive Self-supervised Learning for Language Understanding
- Supervised Contrastive Learning
- Big Self-Supervised Models are Strong Semi-Supervised Learners
- Residual Energy-Based Models for Text Generation
- CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding
- Contrastive Self-Supervised Learning for Commonsense Reasoning
- On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines
- Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models
- MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification
- Learning with Contrastive Examples for Data-to-Text Generation
- Long-Tail Zero and Few-Shot Learning via Contrastive Pretraining on and for Small Data
- COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
- Learning Transferable Visual Models From Natural Language Supervision
- Contrasting distinct structured views to learn sentence embeddings
- A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives
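Many of the papers above (e.g. Contrastive Predictive Coding, CERT, Supervised Contrastive Learning) build on the same InfoNCE objective: pull an anchor's representation toward a positive example and push it away from negatives via a softmax over similarities. As a rough orientation, here is a minimal NumPy sketch of that objective for a single query; the function name, the toy data, and the temperature value are illustrative, not taken from any specific paper:

```python
import numpy as np

def info_nce(query, keys, pos_index=0, temperature=0.1):
    """InfoNCE loss for one query against a set of candidate keys.

    keys[pos_index] is the positive example; all other rows are negatives.
    """
    # Cosine similarity between the query and every key.
    q = query / np.linalg.norm(query)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = k @ q / temperature
    # Softmax cross-entropy with the positive treated as the target class.
    logits -= logits.max()  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())
    return -log_probs[pos_index]

rng = np.random.default_rng(0)
query = rng.normal(size=8)
keys = rng.normal(size=(4, 8))
keys[0] = query + 0.01 * rng.normal(size=8)  # key 0: a near-duplicate positive
print(info_nce(query, keys))  # low loss, since the positive dominates the softmax
```

In the listed papers this skeleton is what varies: how positives are constructed (back-translation, dropout, discourse context, class labels) and where the negatives come from (in-batch examples, hard mining, noise distributions).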