Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.


Awesome_Few_Shot_Learning

Advances in few-shot learning, with an emphasis on NLP applications.
https://github.com/wutong8023/Awesome_Few_Shot_Learning

Last synced: 3 days ago

  • Outline

  • ACL

    • **Adaptive Knowledge-Enhanced Bayesian Meta-Learning for Few-shot Event Detection** ([Google Scholar](https://scholar.google.com.hk/scholar?q=Adaptive+Knowledge-Enhanced+Bayesian+Meta-Learning+for+Few-shot+Event+Detection))
    • **Multi-Level Matching and Aggregation Network for Few-Shot Relation Classification** ([Google Scholar](https://scholar.google.com.hk/scholar?q=Multi-Level+Matching+and+Aggregation+Network+for+Few-Shot+Relation+Classification))
    • [**Exploiting Language Model Prompts Using Similarity Measures: A Case Study on the Word-in-Context Task**](https://aclanthology.org/2022.acl-short.36), by *Tabasi, Mohsen, et al.*
    • [**On Training Instance Selection for Few-Shot Neural Text Generation**](https://aclanthology.org/2021.acl-short.2), by *Chang, Ernie, et al.*
    • [**Soft Gazetteers for Low-Resource Named Entity Recognition**](https://www.aclweb.org/anthology/2020.acl-main.722), by *Rijhwani, Shruti, et al.*
  • COLING

    • **Bridging Text and Knowledge with Multi-Prototype Embedding for Few-Shot Relational Triple Extraction** ([Google Scholar](https://scholar.google.com.hk/scholar?q=Bridging+Text+and+Knowledge+with+Multi-Prototype+Embedding+for+Few-Shot+Relational+Triple+Extraction))
  • EMNLP

    • [**FewRel 2.0: Towards More Challenging Few-Shot Relation Classification**](https://doi.org/10.18653/v1/D19-1649), by *Tianyu Gao et al.*
    • **FewRel: A Large-Scale Supervised Few-shot Relation Classification Dataset with State-of-the-Art Evaluation** ([Google Scholar](https://scholar.google.com.hk/scholar?q=FewRel:+A+Large-Scale+Supervised+Few-shot+Relation+Classification+Dataset+with+State-of-the-Art+Evaluation))
    • [**Few-Shot Emotion Recognition in Conversation with Sequential Prototypical Networks**](https://aclanthology.org/2021.emnlp-main.549), by *Guibon, Gaël, et al.* (a minimal prototypical-network sketch follows this list)
    • [**Universal Natural Language Processing with Limited Annotations: Try Few-shot Textual Entailment as a Start**](https://www.aclweb.org/anthology/2020.emnlp-main.660), by *Yin, Wenpeng, et al.*
    • [**Composed Variational Natural Language Generation for Few-shot Intents**](https://www.aclweb.org/anthology/2020.findings-emnlp.303), by *Xia, Congying, et al.*
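Several entries above and below (the sequential and hybrid-attention prototypical-network papers) build on the prototypical-network idea: classify a query by its distance to per-class mean embeddings computed from the handful of labeled support examples. A minimal episode sketch in PyTorch; the linear encoder, shapes, and toy data are illustrative assumptions, not any listed paper's implementation:

```python
# Minimal prototypical-network episode (sketch; encoder and shapes are
# illustrative assumptions, not a specific paper's implementation).
import torch
import torch.nn.functional as F

def prototypical_episode(encoder, support_x, support_y, query_x, n_classes):
    """One N-way episode: classify queries by distance to class means."""
    z_support = encoder(support_x)                   # [n_support, dim]
    z_query = encoder(query_x)                       # [n_query, dim]
    # Prototype = mean embedding of each class's support examples.
    prototypes = torch.stack([
        z_support[support_y == c].mean(dim=0) for c in range(n_classes)
    ])                                               # [n_classes, dim]
    # Negative squared Euclidean distance serves as the logit.
    logits = -torch.cdist(z_query, prototypes) ** 2  # [n_query, n_classes]
    return logits

# Toy usage: a 3-way 2-shot episode with 4 queries and a linear encoder.
encoder = torch.nn.Linear(16, 8)
support_x, support_y = torch.randn(6, 16), torch.tensor([0, 0, 1, 1, 2, 2])
query_x, query_y = torch.randn(4, 16), torch.tensor([0, 1, 2, 0])
loss = F.cross_entropy(
    prototypical_episode(encoder, support_x, support_y, query_x, 3), query_y)
loss.backward()  # meta-train the encoder across many such episodes
```

Meta-training repeats this over many randomly sampled episodes so the encoder generalizes to classes unseen at training time.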
  • EACL

    • [**Exploiting Cloze-Questions for Few-Shot Text Classification and Natural Language Inference**](https://www.aclweb.org/anthology/2021.eacl-main.20), by *Schick, Timo, et al.*
  • ICLR

    • [**Optimization as a Model for Few-Shot Learning**](https://openreview.net/forum?id=rJY0-Kcll), by *Sachin Ravi et al.*
    • [**Meta-Learning of Structured Task Distributions in Humans and Machines**](https://openreview.net/forum?id=--gvHfE3Xf5), by *Sreejan Kumar, Ishita Dasgupta, Jonathan Cohen, Nathaniel Daw and Thomas Griffiths* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2088-L2094)
    • [**Cross-Domain Few-Shot Classification via Learned Feature-Wise Transformation**](https://openreview.net/forum?id=SJl5Np4tPr), by *Hung-Yu Tseng, Hsin-Ying Lee, Jia-Bin Huang and Ming-Hsuan Yang* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2407-L2413)
    • [**How to Train Your MAML to Excel in Few-Shot Classification**](https://openreview.net/forum?id=49h_IkpJtaE), by *Han-Jia Ye and Wei-Lun Chao* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L709-L715) (a MAML sketch follows this list)
    • [**Learning to Learn with Conditional Class Dependencies**](https://openreview.net/forum?id=BJfOXnActQ), by *Xiang Jiang, Mohammad Havaei, Farshid Varno, Gabriel Chartrand, Nicolas Chapados and Stan Matwin* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2461-L2467)
    • [**META LEARNING SHARED HIERARCHIES**](https://openreview.net/forum?id=SyX0IeWAW), by *Kevin Frans, Jonathan Ho, Xi Chen, Pieter Abbeel and John Schulman* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2506-L2512)
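Several ICLR entries above are MAML variants ("How to Train Your MAML...", the layerwise-metric paper in the ICML section). The core loop adapts a copy of a shared initialization on each task's support set, then backpropagates the post-adaptation query loss into that initialization. A first-order sketch, assuming PyTorch 2.x (`torch.func.functional_call`); the task interface and hyperparameters are illustrative:

```python
# First-order MAML sketch (illustrative; the task interface and
# hyperparameters are assumptions, not any listed paper's exact recipe).
import torch
import torch.nn.functional as F

def maml_step(model, tasks, inner_lr=0.01, inner_steps=1):
    """One meta-update: adapt a copy of the weights per task, then
    accumulate the post-adaptation query loss into the meta-loss."""
    meta_loss = 0.0
    for support_x, support_y, query_x, query_y in tasks:
        fast = {n: p for n, p in model.named_parameters()}  # shared init
        for _ in range(inner_steps):
            loss = F.cross_entropy(
                torch.func.functional_call(model, fast, (support_x,)),
                support_y)
            grads = torch.autograd.grad(loss, list(fast.values()))
            # First-order: inner gradients are detached from the graph.
            fast = {n: p - inner_lr * g.detach()
                    for (n, p), g in zip(fast.items(), grads)}
        meta_loss = meta_loss + F.cross_entropy(
            torch.func.functional_call(model, fast, (query_x,)), query_y)
    return meta_loss / len(tasks)
```

Calling `meta_loss.backward()` followed by an optimizer step on `model` implements the outer loop; full second-order MAML would instead keep the inner gradients in the graph (`create_graph=True`, no `detach`).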
  • AAAI

    • **FL-MSRE: A Few-Shot Learning based Approach to Multimodal Social Relation Extraction** ([Google Scholar](https://scholar.google.com.hk/scholar?q=FL-MSRE:+A+Few-Shot+Learning+based+Approach+to+Multimodal+Social+Relation+Extraction))
    • [**Neural Snowball for Few-Shot Relation Learning**](https://aaai.org/ojs/index.php/AAAI/article/view/6281), by *Tianyu Gao et al.*
    • **Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification** ([Google Scholar](https://scholar.google.com.hk/scholar?q=Hybrid+Attention-Based+Prototypical+Networks+for+Noisy+Few-Shot+Relation+Classification))
  • NeurIPS

    • [**Matching Networks for One Shot Learning**](https://proceedings.neurips.cc/paper/2016/hash/90e1357833654983612fb05e3ec9148c-Abstract.html), by *Oriol Vinyals et al.* (a matching-networks sketch follows this list)
    • [**Few-Shot Learning Evaluation in Natural Language Understanding**](https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/hash/3644a684f98ea8fe223c713b77189a77-Abstract-round2.html), by *Subhabrata Mukherjee et al.*
    • [**Language Models are Few-Shot Learners**](https://proceedings.neurips.cc/paper/2020/hash/1457c0d6bfcb4967418bfb8ac142f64a-Abstract.html), by *Tom B. Brown et al.*
    • **Low-shot Learning via Covariance-Preserving Adversarial Augmentation Networks** ([Google Scholar](https://scholar.google.com.hk/scholar?q=Low-shot+Learning+via+Covariance-Preserving+Adversarial+Augmentation+Networks))
    • [**Meta Learning with Relational Information for Short Sequences**](https://proceedings.neurips.cc/paper/2019/hash/6fe43269967adbb64ec6149852b5cc3e-Abstract.html), by *Yujia Xie et al.*
    • [**Few-Shot Adversarial Domain Adaptation**](https://proceedings.neurips.cc/paper/2017/hash/21c5bba1dd6aed9ab48c2b34c1a0adde-Abstract.html), by *Saeid Motiian et al.*
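The "Matching Networks" entry above classifies a query with a similarity-weighted vote over the labeled support examples rather than with explicit class prototypes. A minimal sketch (cosine attention; the encoder is a placeholder, and the paper's full-context embeddings are omitted here):

```python
# Matching-networks-style prediction (sketch; encoder is a stand-in and
# the paper's full-context embeddings are omitted for brevity).
import torch
import torch.nn.functional as F

def matching_predict(encoder, support_x, support_y, query_x, n_classes):
    z_s = F.normalize(encoder(support_x), dim=-1)  # [n_support, dim]
    z_q = F.normalize(encoder(query_x), dim=-1)    # [n_query, dim]
    attn = torch.softmax(z_q @ z_s.T, dim=-1)      # attention over support set
    one_hot = F.one_hot(support_y, n_classes).float()
    return attn @ one_hot                          # [n_query, n_classes] probs
```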
  • TACL

    • [**Revisiting Few-shot Relation Classification: Evaluation Data and Classification Schemes**](https://aclanthology.org/2021.tacl-1.42), by *Sabo, Ofer, et al.*
    • [**How Can We Know What Language Models Know?**](https://transacl.org/ojs/index.php/tacl/article/view/1983), by *Zhengbao Jiang et al.*
  • CVPR

    • [**Boosting Few-Shot Learning With Adaptive Margin Loss**](https://doi.org/10.1109/CVPR42600.2020.01259), by *Aoxue Li et al.*
    • [**Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning**](https://openaccess.thecvf.com/content/CVPR2021/html/Rizve_Exploring_Complementary_Strengths_of_Invariant_and_Equivariant_Representations_for_Few-Shot_CVPR_2021_paper.html), by *Rizve, Mamshad Nayeem, Khan, Salman, Khan, Fahad Shahbaz and Shah, Mubarak* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1912-L1920)
  • ACM Comput. Surv.

    • [**Generalizing from a Few Examples: A Survey on Few-shot Learning**](https://doi.org/10.1145/3386252), by *Yaqing Wang et al.*
  • ICML

    • [**XtarNet: Learning to Extract Task-Adaptive Representation for Incremental Few-Shot Learning**](http://proceedings.mlr.press/v119/yoon20b.html), by *Yoon, Sung Whan, Kim, Do-Yeon, Seo, Jun and Moon, Jaekyun* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2165-L2172)
    • [**Meta Networks**](http://proceedings.mlr.press/v70/munkhdalai17a.html), by *Tsendsuren Munkhdalai and Hong Yu* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2275-L2282)
    • [**GP-Tree: A Gaussian Process Classifier for Few-Shot Incremental Learning**](http://proceedings.mlr.press/v139/achituve21a.html), by *Achituve, Idan, Navon, Aviv, Yemini, Yochai, Chechik, Gal and Fetaya, Ethan* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1735-L1742)
    • [**Attentional Meta-learners for Few-shot Polythetic Classification**](https://proceedings.mlr.press/v162/day22a.html), by *Day, Ben J, Torné, Ramon Viñas, Simidjievski, Nikola and Lió, Pietro* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L54-L61)
    • [**Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace**](http://proceedings.mlr.press/v80/lee18a.html), by *Lee, Yoonho and Choi, Seungjin* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L2256-L2263)
  • IJCAI

    • [**Few-Shot Learning with Part Discovery and Augmentation from Unlabeled Images**](https://doi.org/10.24963/ijcai.2021/313), by *Chen, Wentao, Si, Chenyang, Wang, Wei, Wang, Liang, Wang, Zilei and Tan, Tieniu* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1238-L1245)
  • NAACL

    • [**Automatic Multi-Label Prompting: Simple and Interpretable Few-Shot Classification**](https://aclanthology.org/2022.naacl-main.401), by *Wang, Han, et al.*
    • [**Few-Shot Text Classification with Triplet Networks, Data Augmentation, and Curriculum Learning**](https://www.aclweb.org/anthology/2021.naacl-main.434), by *Wei, Jason, et al.*
  • ICCV

    • [**On the Importance of Distractors for Few-Shot Classification**](https://openaccess.thecvf.com/content/ICCV2021/html/Das_On_the_Importance_of_Distractors_for_Few-Shot_Classification_ICCV_2021_paper.html), by *Das, Rajshekhar, Wang, Yu-Xiong and Moura, José M. F.* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1138-L1145)
  • arXiv

    • [**Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models**](https://arxiv.org/abs/2106.13353), by *Robert L. Logan IV, Ivana Balažević, Eric Wallace, Fabio Petroni, Sameer Singh and Sebastian Riedel* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L3732-L3739)
    • [**Discrete and Soft Prompting for Multilingual Models**](https://arxiv.org/abs/2109.03630), by *Mengjie Zhao and Hinrich Schütze* (EMNLP 2021) [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L3812-L3822) (a cloze-prompting sketch follows this list)
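The prompting papers above (and PET in the EACL section) cast few-shot classification as a cloze task: a template with a mask token is scored by a pretrained masked language model, and "verbalizer" words stand in for the labels. A sketch using Hugging Face `transformers`; the model name, template, and verbalizers are assumptions for illustration, not any paper's exact setup:

```python
# Cloze-style prompt classification (sketch; the template and verbalizers
# are illustrative assumptions in the spirit of PET-style prompting).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
verbalizers = {"positive": "great", "negative": "terrible"}  # label -> word

def classify(text):
    # Template: "<text> It was [MASK]." and the verbalizer fills the mask.
    prompt = f"{text} It was {tok.mask_token}."
    inputs = tok(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits                      # [1, seq, vocab]
    mask_pos = (inputs.input_ids == tok.mask_token_id).nonzero()[0, 1]
    scores = {label: logits[0, mask_pos, tok.convert_tokens_to_ids(word)].item()
              for label, word in verbalizers.items()}
    return max(scores, key=scores.get)

print(classify("A thoroughly enjoyable read."))              # -> "positive"
```

In the zero-shot limit this needs no gradient updates at all; the few-shot variants in the papers above additionally fine-tune the model (or only the prompt embeddings) on the handful of labeled examples.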
  • SIGIR

    • [**Pseudo Siamese Network for Few-Shot Intent Generation**](https://doi.org/10.1145/3404835.3462995), by *Xia, Congying, Xiong, Caiming and Yu, Philip* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1431-L1438)
  • KDD

    • [**Meta Self-Training for Few-Shot Neural Sequence Labeling**](https://doi.org/10.1145/3447548.3467235), by *Wang, Yaqing, Mukherjee, Subhabrata, Chu, Haoda, Tu, Yuancheng, Wu, Ming, Gao, Jing and Awadallah, Ahmed Hassan* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/bibtex.bib#L1404-L1411)