Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
awesome-pretrain-on-molecules
[IJCAI 2023 survey track] A curated list of resources for chemical pre-trained models.
https://github.com/junxia97/awesome-pretrain-on-molecules
- ArXiv 2023
- ECML-PKDD 2023
- ChemRxiv
- Nature Machine Intelligence
- IJCAI 2023 (code: …pretrain-on-molecules)
- ArXiv 2023
- Nature Machine Intelligence
- Digital Discovery
- ArXiv 2023
- JPCM
- ArXiv 2023
- ArXiv 2023
- ArXiv 2023
- ChemRxiv
- ICLR 2023 (code: …BERT)
- ICLR 2023
- ICLR 2023 (code: …training-via-denoising)
- ICLR 2023
- Research 2022
- Briefings in Bioinformatics (code: …xuan1314/Molecular-graph-BERT)
- ArXiv 2023
- JMGM 2023
- Nature Machine Intelligence 2022
- AAAI 2023 (code: …EMGP)
- ArXiv 2023
- OpenReview 2022
- KDD 2022
- EMNLP 2022 (code: …nlp/MolT5)
- JCIM
- Bioinformatics
- ECCV 2022
- ArXiv
- ArXiv
- Bioinformatics
- BioRxiv
- ChemRxiv
- ICML 2022
- ICML 2022
- Ai4Science@ICML 2022
- TNNLS 2022
- Information Science
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- KDD 2022
- ArXiv 2022
- TNSE 2022
- TSIPN 2022
- EasyChair
- WWW 2022 Workshop
- ArXiv 2022
- ICASSP 2022
- TCYB 2022
- ArXiv 2022
- ArXiv 2022
- ArXiv 2022
- CVPR 2022
- ArXiv 2022
- AAAI 2022
- SDM 2022
- Nature Machine Intelligence 2022
- WWW 2022
- WWW 2022
- WWW 2022
- WWW 2022
- WWW 2022
- WWW 2022
- WWW 2022
- WWW 2022
- TKDE 2022
- BIBM 2021
- WSDM 2022 (code: …Lab/GraphCL_Automated)
- SDM 2022
- AAAI 2022
- AAAI 2022
- AAAI 2022
- AAAI 2022
- ICOIN 2022
- arXiv 2022
- arXiv 2022
- arXiv 2022
- arXiv 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- NeurIPS 2021 Workshop
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021 (code: …SSG)
- NeurIPS 2021
- NeurIPS 2021
- NeurIPS 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- CIKM 2021
- CIKM 2021
- CIKM 2021
- CIKM 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- IJCAI 2021
- IJCAI 2021
- arXiv 2021
- ICML 2021 (code: …Lab/GraphCL_Automated)
- ICML 2021
- arXiv 2021
- arXiv 2021
- KDD 2021
- KDD 2021 (code: …online/HeCo)
- arXiv 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- IJCNN 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- WWW 2021 (code: …DIG/GCA)
- ArXiv 2020
- OpenReview 2020
- OpenReview 2020
- OpenReview 2020
- OpenReview 2020
- NeurIPS 2020
- NeurIPS 2020
- NeurIPS 2020 (code: …Lab/GraphCL)
- ArXiv 2020
- ICML 2020 (code: …Lab/SS-GCNs)
- ICML 2020
- ICML 2020 Workshop
- ArXiv 2020
- ArXiv 2020
- KDD 2020 (code: …GNN)
- KDD 2020
- ArXiv 2020 (code: …sourcecode/Graph-Bert)
- ICLR 2020 (code: …sun/InfoGraph)
- ICLR 2020 (code: …stanford/pretrain-gnns)
- KDD 2019 Workshop
- ICLR 2019 workshop
- ArXiv 2019 (code: …Deep-Graph-Infomax)
- ICLR 2019 (code: …/DGI)
- ArXiv 2022
- Nature Machine Intelligence 2022
- ICLR 2022
- AAAI 2022
- KDD 2021 (code: …DK)
- arXiv 2021
- ICLR 2022
- ICML 2022
- SDM 2022
- Signal Processing 2021
- IJCAI 2021
- IJCAI 2021
- arXiv 2021
- KDD 2021
- BioRxiv 2022
- AAAI 2022 (code: …Daniel/Expert-Linking)
- ArXiv 2022
- The Journal of Chemical Physics
- SIGIR 2022
- ArXiv 2022
- Nature Communications 2021
- NPL 2022
- arXiv 2022
- arXiv 2022
- arXiv 2022
- WWW 2021
- BIBM 2021
- ICBD 2021
- arXiv 2021
- arXiv 2021
- NeurIPS 2021 Workshop
- ICCSNT 2021
- arXiv 2021
- CIKM 2021
- arXiv 2021
- arXiv 2021
- KBS 2021
- arXiv 2021
- arXiv 2021
- IJCAI 2021
- arXiv 2021
- arXiv 2021
- KDD 2021
- arXiv 2021
- arXiv 2021
- arXiv 2021
- ArXiv 2021 (code: …Yu/RecQ)
- ICLR 2021 (code: …kim/SuperGAT)
- WSDM 2021 (code: …Recsys)
- ICML 2020
- arXiv 2022
- NeurIPS 2021 datasets and benchmark track
- arXiv 2021
- arXiv 2021
- arXiv 2021
- ArXiv 2020 (code: …GNN)
- ICLR 2019 Workshop
| Model | Backbone | Pre-training data | #Params | Download |
| --- | --- | --- | --- | --- |
| Hu et al. | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/snap-stanford/pretrain-gnns/tree/master/chem/model_gin) |
| Graph-BERT | | | | …Bert/tree/master/result/PreTrained_GraphBert |
| GraphCL | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/Shen-Lab/GraphCL/tree/master/transferLearning_MoleculeNet_PPI) |
| GPT-GNN | | | | …GNN |
| GCC | …-layer GIN | Academia + DBLP + IMDB + Facebook + LiveJournal | <1M | [Link](https://github.com/THUDM/GCC#download-pretrained-models) |
| JOAO | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/Shen-Lab/GraphCL_Automated/tree/master/transferLearning_MoleculeNet_PPI) |
| AD-GCL | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | N/A |
| GraphLog | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/DeepGraphLearning/GraphLoG/tree/main/models) |
| GROVER | | | | …ailab/grover |
| MGSSL | …-layer GIN | ZINC15 (250K) | ~2M | [Link](https://github.com/zaixizhang/MGSSL/tree/main/motif_based_pretrain/saved_model) |
| CPT-HG | | | | |
| MPG | | | | |
| LP-Info | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/Shen-Lab/GraphCL_Automated/tree/master/transferLearning_MoleculeNet_PPI_LP) |
| SimGRACE | …-layer GIN | ZINC15 (2M) + ChEMBL (456K) | ~2M | [Link](https://github.com/junxia97/SimGRACE) |
| MolCLR | | | | |
| DMP | | | | |
| ChemRL-GEM | | | | |
| KCL | | | | |
| 3D Infomax | | …drugs (140K) + QMugs (620K) | N/A | [Link](https://github.com/HannesStark/3DInfomax) |
| GraphMVP | | | | |
- awesome-pretrained-chinese-nlp-models
- awesome-self-supervised-gnn
- awesome-self-supervised-learning-for-graphs
Keywords: self-supervised-learning (2), machine-learning (2), deep-learning (2), bert (1), chinese (1), dataset (1), ernie (1), gpt (1), gpt-2 (1), large-language-models (1), llm (1), multimodel (1), nezha (1), nlp (1), nlu-nlg (1), pangu (1), pretrained-models (1), roberta (1), simbert (1), xlnet (1), graph-mining (1), graph-neural-networks (1), graph-self-supervised-learning (1), pre-training (1), pretraining (1), contrastive-learning (1), graph-representation-learning (1)