https://github.com/yeonghyeon/dino_mnist-pytorch
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
- Host: GitHub
- URL: https://github.com/yeonghyeon/dino_mnist-pytorch
- Owner: YeongHyeon
- License: MIT
- Created: 2023-07-23T09:12:38.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-12-17T07:58:04.000Z (almost 2 years ago)
- Last Synced: 2023-12-17T08:33:31.137Z (almost 2 years ago)
- Topics: knowledge-distillation, mnist, mnist-dataset, pytorch, self-distillation, self-supervised-learning, torch, toy-example
- Language: Python
- Homepage:
- Size: 3.33 MB
- Stars: 5
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
[PyTorch] DINO: self-DIstillation with NO labels
=====
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers"

## Concept
Concept of the DINO framework [1].
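The core of DINO is a student network trained to match the output distribution of a teacher network, where the teacher is an exponential-moving-average (EMA) copy of the student and receives no gradients. A minimal sketch of this objective is below; the class and function names are hypothetical and may differ from this repository's actual code:

```python
# Minimal sketch of the DINO self-distillation objective (hypothetical names,
# not this repository's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DINOLoss(nn.Module):
    def __init__(self, out_dim, student_temp=0.1, teacher_temp=0.04,
                 center_momentum=0.9):
        super().__init__()
        self.student_temp = student_temp
        self.teacher_temp = teacher_temp
        self.center_momentum = center_momentum
        # Running center of teacher outputs, used to avoid collapse.
        self.register_buffer("center", torch.zeros(1, out_dim))

    def forward(self, student_out, teacher_out):
        # Student: temperature-scaled log-probabilities.
        s = F.log_softmax(student_out / self.student_temp, dim=-1)
        # Teacher: centered and sharpened probabilities, no gradient.
        t = F.softmax((teacher_out - self.center) / self.teacher_temp,
                      dim=-1).detach()
        # Cross-entropy between teacher and student distributions.
        loss = -(t * s).sum(dim=-1).mean()
        self.update_center(teacher_out)
        return loss

    @torch.no_grad()
    def update_center(self, teacher_out):
        # EMA update of the center over teacher batch statistics.
        batch_center = teacher_out.mean(dim=0, keepdim=True)
        self.center = (self.center * self.center_momentum
                       + batch_center * (1 - self.center_momentum))

@torch.no_grad()
def ema_update(student, teacher, momentum=0.996):
    # teacher <- momentum * teacher + (1 - momentum) * student
    for ps, pt in zip(student.parameters(), teacher.parameters()):
        pt.mul_(momentum).add_(ps, alpha=1 - momentum)
```

In training, the student and teacher see different augmented views ($x_1$, $x_2$) of the same image; the loss pulls the student's distribution toward the teacher's, and `ema_update` slowly propagates student weights into the teacher.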
## Results
### Summary
||Student ($x_1$)|Teacher ($x_2$)|
|:---|:---:|:---:|
|Before (mismatch)|||
|After (matched)|||
### Detail (during training)
|Epoch|Student ($x_1$)|Teacher ($x_2$)|
|:---|:---:|:---:|
|0|||
|1|||
|30|||
|150|||
|300|||
## Requirements
* PyTorch 2.0.1

## Reference
[1] Mathilde Caron, et al. "Emerging Properties in Self-Supervised Vision Transformers." Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021.