DenseCL + regionCL-D
- Host: GitHub
- URL: https://github.com/coincheung/densecl
- Owner: CoinCheung
- License: other
- Created: 2020-08-31T09:18:42.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2022-12-27T08:47:57.000Z (almost 3 years ago)
- Last Synced: 2025-05-07T11:56:51.345Z (5 months ago)
- Topics: deep-learning, metric-learning, pretrained, pytorch, regioncl-d, self-supervised-learning
- Language: Python
- Homepage:
- Size: 116 KB
- Stars: 15
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
README
## DenseCL: Dense Contrastive Learning for Self-Supervised Visual Pre-Training
This is an unofficial PyTorch implementation of the [DenseCL paper](https://arxiv.org/abs/2011.09157), with the help and suggestions from @WXinlong and @DerrickWang005.
Support for [regionCL-D](https://arxiv.org/abs/2111.12309) has also been added, and pretrained checkpoints are available.
### Preparation
Install PyTorch and ImageNet dataset following the [official PyTorch ImageNet training code](https://github.com/pytorch/examples/tree/master/imagenet).
This repo aims to make only minimal modifications to that code. Check the modifications by:
```
diff main_densecl.py <(curl https://raw.githubusercontent.com/pytorch/examples/master/imagenet/main.py)
diff main_lincls.py <(curl https://raw.githubusercontent.com/pytorch/examples/master/imagenet/main.py)
```

### Unsupervised Training & Linear Classification
This implementation only supports **multi-gpu**, **DistributedDataParallel** training, which is faster and simpler; single-gpu or DataParallel training is not supported.
This implementation only supports **ResNet50/ResNet101**, since the backbone's computation graph needs to be modified and only ResNet50/ResNet101 have been adapted so far.
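For intuition only, the modification amounts to exposing the backbone's last convolutional feature map (before global pooling) so a dense projection head can operate on it, alongside the usual pooled global feature. The wrapper below is a hypothetical sketch against torchvision's ResNet50, not the code used in this repo:

```python
import torch
import torchvision

class DenseBackbone(torch.nn.Module):
    """Hypothetical sketch: return both the pooled global feature and the
    dense last-stage feature map that a dense projection head would consume."""
    def __init__(self):
        super().__init__()
        resnet = torchvision.models.resnet50()
        # everything up to and including layer4, dropping avgpool and fc
        self.stages = torch.nn.Sequential(*list(resnet.children())[:-2])
        self.pool = resnet.avgpool

    def forward(self, x):
        fmap = self.stages(x)                     # (N, 2048, H/32, W/32) dense features
        glob = torch.flatten(self.pool(fmap), 1)  # (N, 2048) global feature
        return glob, fmap
```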
To run unsupervised pre-training and linear evaluation of a ResNet50/ResNet101 model on ImageNet on an 8-gpu machine, please refer to [dist_train.sh](./dist_train.sh) for the launch script.
Since the paper states that the default MoCo-v2 hyper-parameters are used, the above script uses the same hyper-parameters as MoCo-v2.
***Note***: for 4-gpu training, we recommend following the [linear lr scaling recipe](https://arxiv.org/abs/1706.02677): `--lr 0.015 --batch-size 128` with 4 gpus. We got similar results using this setting.
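As a concrete illustration of that scaling rule (the MoCo-v2 base values below follow the reference above; the helper itself is not part of this repo):

```python
def scaled_lr(batch_size, base_lr=0.03, base_batch=256):
    """Linear lr scaling (Goyal et al., 2017): scale lr in proportion to batch size."""
    return base_lr * batch_size / base_batch

# MoCo-v2 default: lr 0.03 at batch size 256 (8 gpus) -> 0.015 at batch size 128 (4 gpus)
print(scaled_lr(128))  # 0.015
```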
### Models
Our pre-trained DenseCL/RegionCL-D models can be downloaded as follows:
| | epochs | mlp | aug+ | cos | IM top1 | VOC AP50 | model | md5 |
|---|---|---|---|---|---|---|---|---|
| MoCov2 R50 | 200 | ✓ | ✓ | ✓ | 67.7 | 82.4 | download | 59fd9945 |
| DenseCL R50 | 200 | ✓ | ✓ | ✓ | 63.8 | 82.7 | download | 7cfc894c |
| DenseCL R101 | 200 | ✓ | ✓ | ✓ | 65.4 | 83.5 | download | 006675e5 |
| RegionCL-D R50 | 200 | ✓ | ✓ | ✓ | 67.5 | 83.3 | download | 8afad30e |
| RegionCL-D R101 | 200 | ✓ | ✓ | ✓ | 67.5 | 84.3 | download | a1489ad4 |

Here **IM** is the imagenet-1k dataset. We freeze the pretrained weights and only fine-tune the last classifier layer.
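A minimal sketch of that linear-evaluation setup, assuming a MoCo-style checkpoint layout (the file name, state-dict keys, and optimizer values here are assumptions; see main_lincls.py and dist_train.sh for what is actually used):

```python
import torch
import torchvision

model = torchvision.models.resnet50()

# Hypothetical checkpoint path and key layout; the released checkpoints may differ.
ckpt = torch.load("densecl_r50_200ep.pth", map_location="cpu")
state = ckpt.get("state_dict", ckpt)
model.load_state_dict(state, strict=False)

# Freeze everything except the final classifier layer, then re-initialize it.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("fc.")
model.fc.weight.data.normal_(mean=0.0, std=0.01)
model.fc.bias.data.zero_()

# MoCo-style linear-eval optimizer; check the repo scripts for the exact values.
optimizer = torch.optim.SGD(model.fc.parameters(), lr=30.0, momentum=0.9, weight_decay=0.0)
```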
Please be aware that although DenseCL does not match MoCo-v2 on classification, it is superior to MoCo-v2 on object detection. More detection results can be found [here](./detection).
### Transferring to Object Detection
For details, see [./detection](./detection).
### License
This project is under the CC-BY-NC 4.0 license. See [LICENSE](LICENSE) for details.
RegionCL-D R50 linear evaluation:
- Acc@1 67.518, Acc@5 88.256
- Acc@1 67.534, Acc@5 88.212

RegionCL-D R101 linear evaluation:
- Acc@1 67.504, Acc@5 88.212
- Acc@1 67.470, Acc@5 88.104