https://github.com/sharpiless/zskd-pytorch
PyTorch implementation of Zero-Shot Knowledge Distillation in Deep Neural Networks
- Host: GitHub
- URL: https://github.com/sharpiless/zskd-pytorch
- Owner: Sharpiless
- License: GPL-3.0
- Created: 2021-08-27T03:19:58.000Z (about 4 years ago)
- Default Branch: main
- Last Pushed: 2021-08-27T11:24:30.000Z (about 4 years ago)
- Last Synced: 2025-06-17T13:47:04.289Z (4 months ago)
- Topics: knowledge-distillation, zero-data, zero-shot-learning, zskd
- Language: Python
- Homepage:
- Size: 248 KB
- Stars: 6
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Zero-Shot Knowledge Distillation in Deep Neural Networks (PyTorch)
| Model             | Method | Dataset | Top-1 Accuracy (%) |
|-------------------|--------|---------|--------------------|
| LeNet5-LeNet5Half | Paper  | MNIST   | 98.77              |
| LeNet5-LeNet5Half | Ours   | MNIST   | 96.98              |

# Usage
```bash
python main.py --dataset=mnist --lr=3.0 --t_train=False --num_sample=24000 --batch_size=100
```

# TO-DO
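For context, the ZSKD paper synthesizes "Data Impressions" by sampling softmax target vectors from a Dirichlet whose concentration comes from the class similarity of the teacher's final-layer weights, then optimizing random inputs toward those targets. Below is a minimal NumPy sketch of the target-sampling step only; the function names, shapes, and the `beta` scaling are illustrative assumptions on my part, not this repo's actual API:

```python
import numpy as np

def class_similarity(weights):
    """Cosine similarity between the teacher's final-layer class weight rows.

    weights: assumed shape (num_classes, feat_dim).
    """
    norms = np.linalg.norm(weights, axis=1, keepdims=True)
    normed = weights / np.clip(norms, 1e-12, None)
    sim = normed @ normed.T
    # Rescale from [-1, 1] to [0, 1] so rows can serve as Dirichlet concentrations.
    return (sim + 1.0) / 2.0

def sample_soft_targets(weights, samples_per_class, beta=1.0, rng=None):
    """Sample softmax target vectors, one Dirichlet per class row (assumed helper)."""
    rng = np.random.default_rng() if rng is None else rng
    sim = class_similarity(weights)
    targets = []
    for k in range(sim.shape[0]):
        alpha = beta * sim[k] + 1e-6  # concentration vector for class k
        targets.append(rng.dirichlet(alpha, size=samples_per_class))
    return np.concatenate(targets, axis=0)  # (num_classes * samples, num_classes)

# Toy example: a 10-class "teacher" head with LeNet5-like 84-dim features.
W = np.random.default_rng(0).standard_normal((10, 84))
y = sample_soft_targets(W, samples_per_class=5)
print(y.shape)                           # (50, 10)
print(np.allclose(y.sum(axis=1), 1.0))   # each row is a valid softmax target
```

The sampled targets would then drive input optimization against the frozen teacher (e.g. minimizing cross-entropy between the teacher's softmax output and each sampled target), producing the transfer set used in place of real training data.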
The results reported in the paper are hard to reproduce, and the paper's authors have not responded to inquiries.