Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/JudePark96/awesome-nlp-references
A curated list of resources dedicated to Knowledge Distillation, Recommendation Systems, and especially Natural Language Processing (NLP).
List: awesome-nlp-references
- Host: GitHub
- URL: https://github.com/JudePark96/awesome-nlp-references
- Owner: JudePark96
- Created: 2020-02-13T16:31:43.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2021-04-11T14:53:24.000Z (over 3 years ago)
- Last Synced: 2024-05-22T23:01:43.918Z (7 months ago)
- Topics: awesome-list, awesome-nlp, deep-learning, knowledge-distillation, machine-learning, natural-language-processing, neural-networks, nlp, nlp-references, nlp-resources, resources-dedicated, tutorial
- Homepage:
- Size: 125 KB
- Stars: 34
- Watchers: 7
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- ultimate-awesome - awesome-nlp-references - A curated list of resources dedicated to Knowledge Distillation, Recommendation System, especially Natural Language Processing (NLP). (Other Lists / Monkey C Lists)
README
# Introduction
This is a curated list of resources dedicated to Knowledge Distillation, Recommendation Systems, and especially Natural Language Processing (NLP). The goal of this repository is not only to store these references for personal use but also to share them with others.
# Reference
## Retrieval
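The core recipe behind dense retrievers like DPR (linked below) is a dual encoder: questions and passages are embedded separately and ranked by dot product, trained with in-batch negatives. A minimal sketch of that loss in PyTorch; the encoders themselves are elided, and the function name is mine:

```python
import torch
import torch.nn.functional as F

def in_batch_negative_loss(q_emb: torch.Tensor, p_emb: torch.Tensor) -> torch.Tensor:
    """q_emb, p_emb: (batch, dim); p_emb[i] is the gold passage for query i.
    Every other passage in the batch acts as a negative (the DPR recipe)."""
    scores = q_emb @ p_emb.t()             # (batch, batch) dot-product similarities
    targets = torch.arange(q_emb.size(0))  # gold pairs sit on the diagonal
    return F.cross_entropy(scores, targets)
```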
- [Dense Passage Retrieval for Open-Domain Question Answering](https://arxiv.org/abs/2004.04906)

## Language Model
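Many of the pretraining papers below start from masked language modeling. As a reference point, here is a sketch of the standard BERT-style 80/10/10 masking scheme (the function name and the `-100` ignore index are common PyTorch conventions, not any single paper's code):

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_prob=0.15):
    """BERT-style masking: select ~15% of positions as prediction targets;
    of those, 80% become [MASK], 10% a random token, 10% stay unchanged."""
    labels = input_ids.clone()
    is_target = torch.bernoulli(torch.full(input_ids.shape, mlm_prob)).bool()
    labels[~is_target] = -100  # positions ignored by the cross-entropy loss

    input_ids = input_ids.clone()
    mask = torch.bernoulli(torch.full(input_ids.shape, 0.8)).bool() & is_target
    input_ids[mask] = mask_token_id

    rand = torch.bernoulli(torch.full(input_ids.shape, 0.5)).bool() & is_target & ~mask
    input_ids[rand] = torch.randint(vocab_size, input_ids.shape)[rand]
    return input_ids, labels
```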
- [Introducing MASS – A pre-training method that outperforms BERT and GPT in sequence to sequence language generation tasks](https://www.microsoft.com/en-us/research/blog/introducing-mass-a-pre-training-method-that-outperforms-bert-and-gpt-in-sequence-to-sequence-language-generation-tasks/)
- [A new model and dataset for long-range memory](https://deepmind.com/blog/article/A_new_model_and_dataset_for_long-range_memory)
- [Visual Paper Summary: ALBERT (A Lite BERT)](https://amitness.com/2020/02/albert-visual-summary/)
- [reformer-pytorch](https://github.com/lucidrains/reformer-pytorch)
  - Implementation in PyTorch
- [Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation](https://arxiv.org/abs/2002.10260)
- [A Primer in BERTology: What we know about how BERT works](https://arxiv.org/pdf/2002.12327.pdf)
- [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://github.com/google-research/electra)
- [Universal Transformers](https://arxiv.org/pdf/1807.03819.pdf)
- [Contextualized Non-local Neural Networks for Sequence Learning](https://arxiv.org/abs/1811.08600)
- [An Efficient Framework for Learning Sentence Representations](https://arxiv.org/abs/1803.02893)
- [DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference](https://arxiv.org/pdf/2004.12993.pdf)
  - [Implementation in PyTorch](https://github.com/castorini/DeeBERT)
- [A Generative Model for Joint Natural Language Understanding and Generation](https://arxiv.org/pdf/2006.07499.pdf)

## Conversational Agents
- [Towards a Human-like Open-Domain Chatbot](https://arxiv.org/abs/2001.09977)
  - [Explanation in Korean by PingPong Team](https://blog.pingpong.us/meena-presentation/)
- [Self-Supervised Dialogue Learning](https://arxiv.org/pdf/1907.00448.pdf)
- [Sequential Attention-based Network for Noetic End-to-End Response Selection](https://arxiv.org/abs/1901.02609)
- [Neural Text Generation from Rich Semantic Representations](https://arxiv.org/abs/1904.11564)
- [Pretraining Methods for Dialog Context Representation Learning](https://arxiv.org/pdf/1906.00414.pdf)
- [Deep Generative Models with Learnable Knowledge Constraints](http://papers.nips.cc/paper/8250-deep-generative-models-with-learnable-knowledge-constraints)

## Pre-Processing
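To make the article below concrete, here is a toy normalization pipeline of the kind it discusses; the specific steps (Unicode normalization, accent stripping, lowercasing, punctuation removal) are illustrative choices, not a universal recipe:

```python
import re
import unicodedata

def normalize_and_tokenize(text: str) -> list:
    """Toy preprocessing: NFKD-normalize, strip accents, lowercase,
    drop punctuation, then whitespace-tokenize."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(ch for ch in text if not unicodedata.combining(ch))
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)
    return text.split()
```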
- [A Deep Dive into the Wonderful World of Preprocessing in NLP](https://mlexplained.com/2019/11/06/a-deep-dive-into-the-wonderful-world-of-preprocessing-in-nlp/)

## Graph Neural Network
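For orientation, the building block shared by many of the models below is the graph convolution H' = σ(D^-1/2 Â D^-1/2 H W). A minimal dense-adjacency sketch; real implementations use sparse operations, and the class name is mine:

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, H: torch.Tensor, A: torch.Tensor) -> torch.Tensor:
        A_hat = A + torch.eye(A.size(0))          # add self-loops
        deg_inv_sqrt = A_hat.sum(dim=1).pow(-0.5)
        A_norm = deg_inv_sqrt.unsqueeze(1) * A_hat * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(A_norm @ H))
```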
- [Graph Neural Networks: Models and Applications](http://cse.msu.edu/~mayao4/tutorials/aaai2020/)
- [Generating Logical Forms from Graph Representations of Text and Entities](https://arxiv.org/pdf/1905.08407.pdf)
- [K-BERT: Enabling Language Representation With Knowledge Graph](https://www.aaai.org/Papers/AAAI/2020GB/AAAI-LiuW.5594.pdf)

## Recommendation System
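The baseline that graph-based recommenders build on is plain matrix factorization: score a user-item pair by the dot product of learned embeddings. A minimal sketch (the embedding dimension of 32 is an arbitrary choice):

```python
import torch
import torch.nn as nn

class MatrixFactorization(nn.Module):
    """Score(user, item) = <user embedding, item embedding>."""
    def __init__(self, n_users: int, n_items: int, dim: int = 32):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, user_ids: torch.Tensor, item_ids: torch.Tensor) -> torch.Tensor:
        return (self.user(user_ids) * self.item(item_ids)).sum(dim=-1)
```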
- [Learning and Reasoning on Graph for Recommendation](https://next-nus.github.io/)
- [Natural Language Recommendations: A novel research paper search engine developed entirely with embedding and transformer models](https://github.com/Santosh-Gupta/NaturalLanguageRecommendations)

## Knowledge Distillation
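The common core of the distillation papers below is Hinton-style knowledge distillation: the student is trained on a mix of temperature-softened teacher outputs and hard gold labels. A minimal sketch; the default temperature and mixing weight are illustrative, not values from any specific paper:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft loss: KL between temperature-softened distributions, scaled by T^2
    as in Hinton et al.; hard loss: ordinary cross-entropy on gold labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```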
- [Distilling Transformers into Simple Neural Networks with Unlabeled Transfer Data](https://arxiv.org/abs/1910.01769)
- [Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation for Pretrained Models](https://arxiv.org/pdf/1911.03588.pdf)
- [Robust Language Representation Learning via Multi-task Knowledge Distillation](https://www.microsoft.com/en-us/research/blog/robust-language-representation-learning-via-multi-task-knowledge-distillation/)
- [Understanding Knowledge Distillation in Neural Sequence Generation](https://www.microsoft.com/en-us/research/video/understanding-knowledge-distillation-in-neural-sequence-generation/)
- [Distilling Task-Specific Knowledge from BERT into Simple Neural Networks](https://arxiv.org/abs/1903.12136)

## Meta Learning
- [From zero to research — An introduction to Meta-learning](https://medium.com/huggingface/from-zero-to-research-an-introduction-to-meta-learning-8e16e677f78a)

## Named Entity Recognition
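The paper below treats NER as scoring every (start, end) token pair with a biaffine classifier, as in dependency parsing. A minimal sketch of such a biaffine span scorer; the class name and shapes are my own illustrative choices:

```python
import torch
import torch.nn as nn

class BiaffineSpanScorer(nn.Module):
    """scores[l, i, j] = [h_i; 1]^T U_l [h_j; 1] for each label l and each
    candidate span with start token i and end token j."""
    def __init__(self, dim: int, n_labels: int):
        super().__init__()
        self.U = nn.Parameter(torch.randn(n_labels, dim + 1, dim + 1) * 0.01)

    def forward(self, h_start: torch.Tensor, h_end: torch.Tensor) -> torch.Tensor:
        ones = torch.ones(h_start.size(0), 1)
        s = torch.cat([h_start, ones], dim=-1)  # (seq_len, dim+1)
        e = torch.cat([h_end, ones], dim=-1)    # (seq_len, dim+1)
        return torch.einsum("id,ldk,jk->lij", s, self.U, e)
```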
- [Named Entity Recognition as Dependency Parsing](https://arxiv.org/pdf/2005.07150.pdf)
  - [Implementation on GitHub](https://github.com/amir-zeldes/biaffine-ner)

## Metric Learning
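Most entries below share one objective shape: pull matched pairs together, push mismatched ones apart. The simplest instance is the triplet margin loss, sketched here over precomputed text embeddings (the margin value is arbitrary; PyTorch also ships this as `torch.nn.TripletMarginLoss`):

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin: float = 0.2):
    """anchor/positive/negative: (batch, dim) embeddings from any encoder.
    Penalizes the positive being farther from the anchor than the negative."""
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return torch.clamp(d_pos - d_neg + margin, min=0).mean()
```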
- [Metric Learning for Dynamic Text Classification](https://www.aclweb.org/anthology/D19-6116/)
- [RUBER: An Unsupervised Method for Automatic Evaluation of Open-Domain Dialog Systems](https://arxiv.org/abs/1701.03079)
- [Better Automatic Evaluation of Open-Domain Dialogue Systems with Contextualized Embeddings](https://arxiv.org/abs/1904.10635)
- [Matching Natural Language Sentences with Hierarchical Sentence Factorization](https://arxiv.org/abs/1803.00179)
- [Instance Cross Entropy for Deep Metric Learning](https://arxiv.org/abs/1911.09976)
- [Matching Embeddings for Domain Adaptation](https://arxiv.org/abs/1909.11651)
- [Deep Metric Learning using Similarities from Nonlinear Rank Approximations](https://arxiv.org/abs/1909.09427)
- [Keyword-Attentive Deep Semantic Matching](https://arxiv.org/abs/2003.11516)
- [BERTScore: Evaluating Text Generation with BERT](https://arxiv.org/abs/1904.09675)
- [Sampling Matters! An Empirical Study of Negative Sampling Strategies for Learning of Matching Models in Retrieval-based Dialogue Systems](https://www.aclweb.org/anthology/D19-1128.pdf)

## Data Augmentation
- [Bi-Decoder Augmented Network for Neural Machine Translation](https://arxiv.org/pdf/2001.04586.pdf)

## Sequence Labeling
- [Low-Resource Sequence Labeling via Unsupervised Multilingual Contextualized Representations](https://arxiv.org/abs/1910.10893)

## Keyphrase Extraction/Generation
- [Title-Guided Encoding for Keyphrase Generation](https://arxiv.org/pdf/1808.08575.pdf)

## Relation Extraction
- [Zero-shot Entity Linking with Dense Entity Retrieval](https://arxiv.org/pdf/1911.03814.pdf)

## Machine Translation
- [Domain specialization: a post-training domain adaptation for Neural Machine Translation](https://arxiv.org/pdf/1612.06141.pdf)

## Evaluation Metric
- [Guiding Extractive Summarization with Question-Answering Rewards](https://www.aclweb.org/anthology/N19-1264.pdf)

## Tutorial
- [pytorch-seq2seq tutorial](https://github.com/bentrevett/pytorch-seq2seq)
- [Learn NLP With Me – Information Extraction – Relations – Introduction](https://ryanong.co.uk/2020/02/21/day-52-learn-nlp-with-me-information-extraction-relations-introduction/)
- [Stanford CS224N](http://web.stanford.edu/class/cs224n/)
- [fast.ai course-nlp](https://github.com/fastai/course-nlp)
- [Daniel Jurafsky and James H. Martin. Speech and Language Processing (3rd Edition). Draft, 2019.](https://web.stanford.edu/~jurafsky/slp3/)
- [Hands-On Machine Learning](https://github.com/ageron/handson-ml2)

## Tool
- [Open Knowledge Embedding](https://github.com/thunlp/OpenKE)

## Contributors
Contributions are welcome! If you know of good papers, tutorials, or anything else worth adding, please open a pull request! :)

- [@JudePark96](https://github.com/JudePark96/)