Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Awesome_Few_Shot_Learning
Advances in few-shot learning, especially for NLP applications.
https://github.com/wutong8023/Awesome_Few_Shot_Learning
Outline
ACL
- by *Chang, Ernie and
- by *Rijhwani, Shruti and
- by *Tabasi, Mohsen and
COLING
- by *Tianyu Gao and
- by *Guibon, Gaël and
- by *Yin, Wenpeng and
- by *Xia, Congying and
EACL
- by *Schick, Timo and
ICLR
- by *Sachin Ravi and
- by *Han-Jia Ye and Wei-Lun Chao* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L709-L715) (bib key: `ye2022how`)
- by *Sreejan Kumar, Ishita Dasgupta, Jonathan Cohen, Nathaniel Daw and Thomas Griffiths* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2088-L2094) (bib key: `kumar2021metalearning`)
- by *Hung-Yu Tseng, Hsin-Ying Lee, Jia-Bin Huang and Ming-Hsuan Yang* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2407-L2413) (bib key: `Tseng2020Cross-Domain`)
- by *Xiang Jiang, Mohammad Havaei, Farshid Varno, Gabriel Chartrand, Nicolas Chapados and Stan Matwin* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2461-L2467) (bib key: `jiang2018learning`)
- by *Kevin Frans, Jonathan Ho, Xi Chen, Pieter Abbeel and John Schulman* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2506-L2512) (bib key: `frans2018meta`)
AAAI
- by *Tianyu Gao and
- by *Oriol Vinyals and
- by *Subhabrata Mukherjee and
- by *Tom B. Brown and
- by *Yujia Xie and
- by *Saeid Motiian and
TACL
- by *Sabo, Ofer and
- by *Zhengbao Jiang and
CVPR
- by *Aoxue Li and
- by *Rizve, Mamshad Nayeem, Khan, Salman, Khan, Fahad Shahbaz and Shah, Mubarak* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1912-L1920) (bib key: `Rizve_2021_CVPR`)
ACM Comput. Surv.
- by *Yaqing Wang and
ICML
- by *Tsendsuren Munkhdalai and Hong Yu* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2275-L2282) (bib key: `pmlr-v70-munkhdalai17a`)
- by *Achituve, Idan, Navon, Aviv, Yemini, Yochai, Chechik, Gal and Fetaya, Ethan* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1735-L1742) (bib key: `pmlr-v139-achituve21a`)
- by *Yoon, Sung Whan, Kim, Do-Yeon, Seo, Jun and Moon, Jaekyun* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2165-L2172) (bib key: `pmlr-v119-yoon20b`)
- by *Day, Ben J, Torné, Ramon Viñas, Simidjievski, Nikola and Lió, Pietro* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L54-L61) (bib key: `pmlr-v162-day22a`)
- by *Lee, Yoonho and Choi, Seungjin* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L2256-L2263) (bib key: `pmlr-v80-lee18a`)
NAACL
- by *Wang, Han and
- by *Wei, Jason and
arXiv
- by *Mengjie Zhao and Hinrich Schütze* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L3812-L3822) (EMNLP 2021)
- by *Robert L. Logan IV, Ivana Balažević, Eric Wallace, Fabio Petroni, Sameer Singh and Sebastian Riedel* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L3732-L3739) (bib key: `logan2021cutting`)
ICCV
- by *Das, Rajshekhar, Wang, Yu-Xiong and Moura, José M. F.* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1138-L1145) (bib key: `Das_2021_ICCV`)
SIGIR
- by *Xia, Congying, Xiong, Caiming and Yu, Philip* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1431-L1438) (bib key: `3404835.3462995`)
KDD
- by *Wang, Yaqing, Mukherjee, Subhabrata, Chu, Haoda, Tu, Yuancheng, Wu, Ming, Gao, Jing and Awadallah, Ahmed Hassan* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1404-L1411) (bib key: `3447548.3467235`)
IJCAI
- by *Chen, Wentao, Si, Chenyang, Wang, Wei, Wang, Liang, Wang, Zilei and Tan, Tieniu* [[bib]](https://github.com/wutong8023/Awesome_Few_Shot_Learning/blob/master/./bibtex.bib#L1238-L1245) (bib key: `ijcai2021-313`)