Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/gaohuang/DenseNet_lite
A more memory efficient Torch implementation of "Densely Connected Convolutional Networks".
- Host: GitHub
- URL: https://github.com/gaohuang/DenseNet_lite
- Owner: gaohuang
- License: mit
- Created: 2016-09-21T19:48:26.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2017-05-11T20:06:43.000Z (over 7 years ago)
- Last Synced: 2024-08-04T23:09:58.848Z (4 months ago)
- Language: Lua
- Size: 4.88 KB
- Stars: 30
- Watchers: 4
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-very-deep-learning - Torch Implementation
README
# DenseNet_lite
This repository implements the DenseNet architecture introduced in [Densely Connected Convolutional Networks](http://arxiv.org/abs/1608.06993). The original Torch implementation can be found at https://github.com/liuzhuang13/DenseNet; please see that repository for more details about DenseNet. The only difference here is a custom container, ```DenseLayer.lua```, which implements the dense connections in a more memory-efficient way. This leads to a **~25% reduction** in memory consumption during training, while keeping accuracy and training time the same.
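For context, the sketch below (not the repository's code) shows how a dense connection is typically expressed with standard ```nn``` containers in Torch: the input is passed through unchanged by ```nn.Identity()``` and concatenated with the new feature maps along the channel dimension, so every layer sees all preceding feature maps. The helper name ```addDenseLayer``` is purely illustrative; the custom ```DenseLayer.lua``` container realizes the same connectivity while reusing intermediate buffers to save memory.
```lua
require 'nn'

-- Hypothetical helper (not part of this repository): appends one dense layer
-- to `model` using standard nn containers. nn.Concat(2) concatenates the
-- unchanged input (nn.Identity) with `growthRate` new feature maps along the
-- channel dimension.
local function addDenseLayer(model, nChannels, growthRate)
   local branch = nn.Sequential()
      :add(nn.SpatialBatchNormalization(nChannels))
      :add(nn.ReLU(true))
      :add(nn.SpatialConvolution(nChannels, growthRate, 3, 3, 1, 1, 1, 1))
   model:add(nn.Concat(2)
      :add(nn.Identity())   -- pass the input through unchanged
      :add(branch))         -- append growthRate new feature maps
   return nChannels + growthRate
end

-- Tiny example block: 3 dense layers with growth rate 12 on a CIFAR-sized input
local model = nn.Sequential()
local nChannels = 16
model:add(nn.SpatialConvolution(3, nChannels, 3, 3, 1, 1, 1, 1))
for i = 1, 3 do
   nChannels = addDenseLayer(model, nChannels, 12)
end
print(model:forward(torch.randn(1, 3, 32, 32)):size())  -- 1 x 52 x 32 x 32
```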
## Usage
0. Install Torch ResNet (https://github.com/facebook/fb.resnet.torch) following the instructions there. To reduce memory consumption, we recommend installing the [optnet](https://github.com/fmassa/optimize-net) package.
1. Add the files ```densenet_lite.lua``` and ```DenseLayer.lua``` to the ```models/``` folder;
2. If you need to use multiple GPUs, insert ```require 'models/DenseLayer'``` at line 89 of ```models/init.lua``` (see the multi-GPU example after this list);
3. Change the learning rate schedule in the ```learningRate()``` function of ```train.lua``` (lines 171/173),
from
```decay = epoch >= 122 and 2 or epoch >= 81 and 1 or 0```
to
```decay = epoch >= 225 and 2 or epoch >= 150 and 1 or 0```
4. Train a DenseNet (L=40, k=12) on CIFAR-10+ using:
```
th main.lua -netType densenet_lite -depth 40 -dataset cifar10 -batchSize 64 -nEpochs 300 -optnet true
```
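If you enable multi-GPU training as in step 2, a run could look like the following. This is only a sketch: it assumes fb.resnet.torch's standard ```-nGPU``` option, and the value 2 is just an example.
```
th main.lua -netType densenet_lite -depth 40 -dataset cifar10 -batchSize 64 -nEpochs 300 -optnet true -nGPU 2
```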