# Best CIFAR-10, CIFAR-100 results with wide-residual networks using PyTorch

PyTorch implementation of Sergey Zagoruyko's [Wide Residual Networks](https://arxiv.org/pdf/1605.07146v2.pdf).

For Torch implementations, see [here](https://github.com/meliketoy/wide-residual-network).

## Requirements
See the [installation instructions](INSTALL.md) for a step-by-step installation guide.
See the [server instructions](SERVER.md) for server setup.
- Install [cuda-8.0](https://developer.nvidia.com/cuda-downloads)
- Install [cudnn v5.1](https://developer.nvidia.com/cudnn)
- Download [PyTorch](https://pytorch.org) for Python 2.7 and clone the repository.
```bash
pip install http://download.pytorch.org/whl/cu80/torch-0.1.12.post2-cp27-none-linux_x86_64.whl
pip install torchvision
git clone https://github.com/meliketoy/wide-resnet.pytorch
```

## How to run
After you have cloned the repository, you can train on either CIFAR-10 or CIFAR-100 by running the script below.
```bash
python main.py --lr 0.1 --resume false --net_type [lenet/vggnet/resnet/wide-resnet] --depth 28 --widen_factor 10 --dropout_rate 0.3 --dataset [cifar10/cifar100]
```
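The flags above map naturally onto an `argparse` interface. The sketch below is illustrative of how such a parser might look; the exact option set and defaults in `main.py` may differ.

```python
import argparse

def build_parser():
    """Hypothetical parser mirroring the command-line flags shown above."""
    parser = argparse.ArgumentParser(description="Train a network on CIFAR-10/100")
    parser.add_argument("--lr", type=float, default=0.1, help="initial learning rate")
    parser.add_argument("--resume", type=str, default="false", help="resume from checkpoint")
    parser.add_argument("--net_type", default="wide-resnet",
                        choices=["lenet", "vggnet", "resnet", "wide-resnet"])
    parser.add_argument("--depth", type=int, default=28, help="network depth")
    parser.add_argument("--widen_factor", type=int, default=10, help="widening factor k")
    parser.add_argument("--dropout_rate", type=float, default=0.3)
    parser.add_argument("--dataset", default="cifar10", choices=["cifar10", "cifar100"])
    return parser

# Example: parse the wide-resnet 28x10 configuration for CIFAR-100.
args = build_parser().parse_args(
    ["--lr", "0.1", "--net_type", "wide-resnet", "--depth", "28",
     "--widen_factor", "10", "--dropout_rate", "0.3", "--dataset", "cifar100"]
)
print(args.net_type, args.depth, args.widen_factor, args.dataset)
```

With this layout, `wide-resnet 28x10` corresponds to `--depth 28 --widen_factor 10`.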

## Implementation Details

| epoch | learning rate | weight decay | Optimizer | Momentum | Nesterov |
|:---------:|:-------------:|:-------------:|:---------:|:--------:|:--------:|
| 0 ~ 60 | 0.1 | 0.0005 | Momentum | 0.9 | true |
| 61 ~ 120 | 0.02 | 0.0005 | Momentum | 0.9 | true |
| 121 ~ 160 | 0.004 | 0.0005 | Momentum | 0.9 | true |
| 161 ~ 200 | 0.0008 | 0.0005 | Momentum | 0.9 | true |
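The table above is a step schedule: the learning rate is multiplied by 0.2 after epochs 60, 120, and 160 (0.1 → 0.02 → 0.004 → 0.0008). A minimal sketch of that schedule as a plain function (the name is illustrative, not the repository's actual API):

```python
def learning_rate(init_lr, epoch):
    """Step learning-rate schedule from the table above: multiply the
    initial rate by 0.2 after epochs 60, 120, and 160 (1-indexed)."""
    if epoch <= 60:
        factor = 0
    elif epoch <= 120:
        factor = 1
    elif epoch <= 160:
        factor = 2
    else:
        factor = 3
    return init_lr * (0.2 ** factor)

print(learning_rate(0.1, 1))    # 0.1
print(learning_rate(0.1, 200))
```

In current PyTorch the same decay can be expressed with `torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[60, 120, 160], gamma=0.2)`, paired with `torch.optim.SGD(..., momentum=0.9, weight_decay=5e-4, nesterov=True)` to match the other columns.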

## CIFAR-10 Results

![CIFAR-10 results](imgs/cifar10_image.png)

Below are the test-set accuracy results for training on the **CIFAR-10** dataset.

**Accuracy is the average of 5 runs**

| network | dropout | preprocess | GPU:0 memory | GPU:1 memory | time per epoch | accuracy (%) |
|:-----------------:|:-------:|:----------:|:-----:|:-----:|:------------:|:-----------:|
| wide-resnet 28x10 | 0 | ZCA | 5.90G | - | 2 min 03 sec | 95.83 |
| wide-resnet 28x10 | 0 | meanstd | 5.90G | - | 2 min 03 sec | 96.21 |
| wide-resnet 28x10 | 0.3 | meanstd | 5.90G | - | 2 min 03 sec | 96.27 |
| wide-resnet 28x20 | 0.3 | meanstd | 8.13G | 6.93G | 4 min 10 sec | **96.55** |
| wide-resnet 40x10 | 0.3 | meanstd | 8.08G | - | 3 min 13 sec | 96.31 |
| wide-resnet 40x14 | 0.3 | meanstd | 7.37G | 6.46G | 3 min 23 sec | 96.34 |
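The `meanstd` preprocessing in the table is per-channel normalization, `x' = (x - mean) / std`. A small sketch using commonly cited CIFAR-10 training-set statistics (the repository's exact constants may differ slightly):

```python
import numpy as np

# Commonly cited CIFAR-10 per-channel RGB statistics for pixels in [0, 1];
# assumed values, not necessarily the constants used in this repository.
CIFAR10_MEAN = np.array([0.4914, 0.4822, 0.4465])
CIFAR10_STD  = np.array([0.2470, 0.2435, 0.2616])

def meanstd_normalize(image, mean=CIFAR10_MEAN, std=CIFAR10_STD):
    """Normalize an HxWx3 float image channel-wise: (x - mean) / std."""
    return (image - mean) / std

# An image equal to the channel means normalizes to all zeros.
img = np.full((32, 32, 3), CIFAR10_MEAN)
print(meanstd_normalize(img).max())  # 0.0
```

The `ZCA` rows instead use ZCA whitening, which decorrelates pixels rather than normalizing each channel independently.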

## CIFAR-100 Results

![CIFAR-100 results](imgs/cifar100_image.png)

Below are the test-set accuracy results for training on the **CIFAR-100** dataset.

**Accuracy is the average of 5 runs**

| network | dropout | preprocess | GPU:0 memory | GPU:1 memory | time per epoch | Top-1 acc (%) | Top-5 acc (%) |
|:-----------------:|:-------:|:-----------:|:-----:|:-----:|:------------:|:----------:|:-----------:|
| wide-resnet 28x10 | 0 | ZCA | 5.90G | - | 2 min 03 sec | 80.07 | 95.02 |
| wide-resnet 28x10 | 0 | meanstd | 5.90G | - | 2 min 03 sec | 81.02 | 95.41 |
| wide-resnet 28x10 | 0.3 | meanstd | 5.90G | - | 2 min 03 sec | 81.49 | 95.62 |
| wide-resnet 28x20 | 0.3 | meanstd | 8.13G | 6.93G | 4 min 05 sec | **82.45** | **96.11** |
| wide-resnet 40x10 | 0.3 | meanstd | 8.93G | - | 3 min 06 sec | 81.42 | 95.63 |
| wide-resnet 40x14 | 0.3 | meanstd | 7.39G | 6.46G | 3 min 23 sec | 81.87 | 95.51 |
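Top-1 and Top-5 accuracy count a prediction as correct when the true label is the single highest-scoring class, or among the five highest-scoring classes, respectively. A minimal NumPy sketch of the metric (illustrative, not the repository's evaluation code):

```python
import numpy as np

def topk_accuracy(scores, labels, k=1):
    """Fraction of samples whose true label is among the k highest-scoring
    classes. scores: (N, C) class scores, labels: (N,) integer labels."""
    topk = np.argsort(scores, axis=1)[:, -k:]       # indices of the k largest scores
    hits = (topk == labels[:, None]).any(axis=1)    # true label in the top k?
    return hits.mean()

scores = np.array([[0.1, 0.7, 0.2],
                   [0.5, 0.3, 0.2],
                   [0.2, 0.3, 0.5]])
labels = np.array([1, 1, 0])
print(topk_accuracy(scores, labels, k=1))  # 1 of 3 correct
print(topk_accuracy(scores, labels, k=2))  # 2 of 3 correct
```

With 100 classes, Top-5 is naturally much higher than Top-1, which matches the gap between the two columns above.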