Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/KaimingHe/resnet-1k-layers
Deep Residual Networks with 1K Layers
- Host: GitHub
- URL: https://github.com/KaimingHe/resnet-1k-layers
- Owner: KaimingHe
- Created: 2016-04-05T02:50:30.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2017-05-24T17:35:48.000Z (over 7 years ago)
- Last Synced: 2024-10-15T11:36:59.866Z (about 2 months ago)
- Language: Lua
- Homepage:
- Size: 3.91 KB
- Stars: 903
- Watchers: 65
- Forks: 249
- Open Issues: 1
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-image-classification - official : https://github.com/KaimingHe/resnet-1k-layers
README
# Deep Residual Networks with 1K Layers
By [Kaiming He](http://kaiminghe.com), [Xiangyu Zhang](https://scholar.google.com/citations?user=yuB-cfoAAAAJ&hl=en), [Shaoqing Ren](http://home.ustc.edu.cn/~sqren/), [Jian Sun](http://research.microsoft.com/en-us/people/jiansun/).
Microsoft Research Asia (MSRA).
## Table of Contents
0. [Introduction](#introduction)
0. [Notes](#notes)
0. [Usage](#usage)

## Introduction
This repository contains re-implemented code for the paper "Identity Mappings in Deep Residual Networks" (http://arxiv.org/abs/1603.05027). This work makes it possible to train high-quality **1k-layer** neural networks in a very simple way.
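The key idea of [a] is the *pre-activation* residual unit: batch norm and ReLU come *before* each weight layer, and the identity shortcut is left untouched, so each unit computes x + F(x). A minimal sketch of this ordering (in Python/NumPy for illustration, not the repository's Lua/Torch code; the dense `w1`/`w2` weights and toy `batch_norm` stand in for the real convolution and spatial batch-norm layers):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Toy per-feature normalization standing in for spatial batch norm.
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def relu(x):
    return np.maximum(x, 0.0)

def pre_act_residual_unit(x, w1, w2):
    # "Identity Mappings" ordering: BN -> ReLU -> weight, applied twice,
    # then add the untouched identity shortcut.
    out = batch_norm(x)
    out = relu(out)
    out = out @ w1
    out = batch_norm(out)
    out = relu(out)
    out = out @ w2
    return x + out  # clean identity path: no BN/ReLU after the addition

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
w1 = rng.standard_normal((16, 16)) * 0.1
w2 = rng.standard_normal((16, 16)) * 0.1
y = pre_act_residual_unit(x, w1, w2)
print(y.shape)  # (8, 16)
```

Because nothing follows the addition, the identity path stays clean through the whole stack; this is what lets gradients propagate through 1000+ layers.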
*Acknowledgement*: This code is re-implemented by Xiang Ming from Xi'an Jiaotong University for ease of release.
**See Also:** Re-implementations of **ResNet-200 [a] on ImageNet** from Facebook AI Research (FAIR): https://github.com/facebook/fb.resnet.torch/tree/master/pretrained
Related papers:
[a] @article{He2016,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Identity Mappings in Deep Residual Networks},
journal = {arXiv preprint arXiv:1603.05027},
year = {2016}
}
[b] @article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
## Notes

0. This code is based on the implementation of Torch ResNets (https://github.com/facebook/fb.resnet.torch).
0. The experiments in the paper were conducted in Caffe, whereas this code is re-implemented in Torch. We observed similar results within reasonable statistical variations.
0. To fit the 1k-layer models into memory without modifying much code, we simply reduced the mini-batch size to 64, noting that results in the paper were obtained with a mini-batch size of 128. Somewhat unexpectedly, the results with the mini-batch size of 64 are slightly better:
    mini-batch | CIFAR-10 test error (%): (median (mean+/-std))
    :---------:|:---------------------------------------------:
    128 (as in [a]) | 4.92 (4.89+/-0.14)
    64 (as in this code) | **4.62** (4.69+/-0.20)

0. Curves obtained by running this code with a mini-batch size of 64 (training loss: y-axis on the left; test error: y-axis on the right):
![resnet1k](https://cloud.githubusercontent.com/assets/11435359/14414142/68714c82-ffc0-11e5-8b1b-657fdb3d96a6.png)
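The "median (mean+/-std)" entries in the table above summarize several independent training runs. As a sketch of how such a summary is computed (the five error values below are made-up placeholders for illustration, not results from this repository):

```python
import statistics

# Hypothetical CIFAR-10 test errors (%) from five independent runs;
# illustrative values only, not measurements from this code.
errors = [4.55, 4.62, 4.70, 4.81, 4.90]

median = statistics.median(errors)
mean = statistics.mean(errors)
std = statistics.stdev(errors)  # sample standard deviation

print(f"{median:.2f} ({mean:.2f}+/-{std:.2f})")  # prints 4.70 (4.72+/-0.14)
```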
## Usage

0. Install Torch ResNets (https://github.com/facebook/fb.resnet.torch) following the instructions therein.
0. Add the file `resnet-pre-act.lua` from this repository to `./models`.
0. To train ResNet-1001 of the form used in [a]:
```
th main.lua -netType resnet-pre-act -depth 1001 -batchSize 64 -nGPU 2 -nThreads 4 -dataset cifar10 -nEpochs 200 -shareGradInput false
```
**Note**: `shareGradInput=true` is not valid for this model yet.