Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Awesome-MIM
[Survey] Masked Modeling for Self-supervised Representation Learning on Vision and Beyond (https://arxiv.org/abs/2401.00897)
https://github.com/Lupin1998/Awesome-MIM
- arXiv (https://arxiv.org/abs/2401.00897)
- PDF (https://github.com/Lupin1998/Awesome-MIM/blob/master/files/Survey_on_Masked_Modeling_Latest_Version.pdf)
- Connected Papers
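For context on the topic this list covers: masked image modeling (MIM) trains an encoder to reconstruct image patches that were randomly hidden from it. A minimal NumPy sketch of the masking step is below; the patch size, mask ratio, and function name are illustrative choices, not taken from any particular paper in the list.

```python
import numpy as np

def mask_patches(image, patch=4, mask_ratio=0.75, seed=0):
    """Split a square image into non-overlapping patches and zero out
    a random subset of them. Returns the masked image and the boolean
    patch mask (True = patch was hidden)."""
    h, w = image.shape
    ph, pw = h // patch, w // patch
    n = ph * pw
    rng = np.random.default_rng(seed)
    hidden = rng.choice(n, size=int(n * mask_ratio), replace=False)
    mask = np.zeros(n, dtype=bool)
    mask[hidden] = True
    out = image.copy()
    for idx in np.flatnonzero(mask):
        r, c = divmod(idx, pw)
        out[r * patch:(r + 1) * patch, c * patch:(c + 1) * patch] = 0
    return out, mask.reshape(ph, pw)

img = np.ones((16, 16))
masked, mask = mask_patches(img)
# 16 patches total; 75% of them (12) are zeroed out
```

A MIM model is then trained to predict the hidden patches (pixels, tokens, or features, depending on the method) from the visible ones, which is the common thread across the papers indexed here.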
(Paper list omitted: the README's curated entries, each with Paper/Code/Project links, did not survive indexing; see the repository above for the full list.)
Maintained by @Lupin1998 in `Awesome-MIM`.
Related Projects
- Awesome-Masked-Autoencoders
- awesome-MIM
- Awesome-MIM - Masked image modeling for self-supervised visual representation.
- awesome-self-supervised-learning - A curated list of awesome self-supervised methods.
- unilm - Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities.
- OpenMixup - Supervised, Semi- and Self-Supervised Visual Representation Learning Toolbox and Benchmark.
- MMPretrain - OpenMMLab self-supervised pre-training toolbox and benchmark.
- solo-learn - A library of self-supervised methods for visual representation learning powered by Pytorch Lightning.
- VISSL - FAIR's library for SOTA Self-Supervised Learning with images.
- lightly - A python library for self-supervised learning on images.
- Fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Keywords
- self-supervised-learning (17)
- pytorch (14)
- masked-image-modeling (11)
- deep-learning (11)
- vision-transformer (9)
- computer-vision (8)
- mae (8)
- transformer (7)
- masked-autoencoder (6)
- machine-learning (4)
- pre-training (4)
- representation-learning (4)
- bert (3)
- cvpr2023 (3)
- multimodal (3)
- object-detection (3)
- action-recognition (3)
- contrastive-learning (3)
- video-representation-learning (3)
- video-understanding (3)
- ssl (3)
- unsupervised-learning (3)
- foundation-models (3)
- ade20k (3)
- coco (3)
- imagenet (3)
- beit (2)
- artificial-intelligence (2)
- protein-structure (2)
- text-to-image (2)
- swin-transformer (2)
- image-classification (2)
- self-supervised (2)
- moco (2)
- vae (2)
- vq-vae (2)
- vqvae (2)
- tensorflow (1)
- wmt (1)
- vq-vae-2 (1)
- autoencoder (1)
- generative-models (1)
- vision-language-model (1)
- stable-diffusion (1)
- multi-task-learning (1)
- autoencoders (1)
- large-language-models (1)
- neurips-2023 (1)
- roberta (1)
- vq (1)