Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/YixuanLi/densenet-tensorflow
DenseNet Implementation in Tensorflow
- Host: GitHub
- URL: https://github.com/YixuanLi/densenet-tensorflow
- Owner: YixuanLi
- License: gpl-3.0
- Created: 2016-09-22T14:52:49.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2019-05-07T17:47:07.000Z (over 5 years ago)
- Last Synced: 2024-08-01T22:49:54.733Z (4 months ago)
- Topics: densenet, tensorflow
- Language: Python
- Size: 248 KB
- Stars: 574
- Watchers: 23
- Forks: 197
- Open Issues: 10
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-very-deep-learning - Tensorflow Implementation
- project-awesome - YixuanLi/densenet-tensorflow - DenseNet Implementation in Tensorflow (Python)
- awesome-image-classification - unofficial-pytorch : https://github.com/YixuanLi/densenet-tensorflow
README
# DenseNet-tensorflow
This repository contains the TensorFlow implementation of the paper [Densely Connected Convolutional Networks](http://arxiv.org/abs/1608.06993). The code is developed based on Yuxin Wu's implementation of ResNet (https://github.com/ppwwyyxx/tensorpack/tree/master/examples/ResNet).
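For readers new to the architecture, here is a minimal, purely illustrative Python sketch (not code from this repository) of the dense-connectivity idea: each layer in a dense block receives the concatenation of all preceding feature maps and adds k new ones, where k is the growth rate. The starting channel count of 16 follows the paper's CIFAR setup for the non-bottleneck model.

```python
# Illustrative sketch, not part of this repository: how many channels each
# layer in one dense block receives when every layer appends growth_rate
# feature maps to the running concatenation.
def dense_block_channel_counts(in_channels, num_layers, growth_rate):
    counts = []
    channels = in_channels
    for _ in range(num_layers):
        counts.append(channels)   # channels this layer's convolution sees
        channels += growth_rate   # its new maps are concatenated for later layers
    return counts

# DenseNet (L=40, k=12) on CIFAR uses three dense blocks of 12 layers each.
print(dense_block_channel_counts(16, 12, 12))
# [16, 28, 40, 52, 64, 76, 88, 100, 112, 124, 136, 148]
```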
Citation:

    @inproceedings{huang2017densely,
      title={Densely connected convolutional networks},
      author={Huang, Gao and Liu, Zhuang and van der Maaten, Laurens and Weinberger, Kilian Q.},
      booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
      year={2017}
    }

## Dependencies:
+ Python 2 or 3
+ TensorFlow >= 1.0
+ [Tensorpack](https://github.com/ppwwyyxx/tensorpack)
+ OpenCV-Python

## Train a DenseNet (L=40, k=12) on CIFAR-10+ using
```
python cifar10-densenet.py
```
In our experiment environment (cuDNN v5.1, CUDA 7.5, one TITAN X GPU), the code runs at about 5 iters/s with a batch size of 64. The hyperparameters are identical to the original [torch implementation](https://github.com/liuzhuang13/DenseNet).
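As a rough back-of-the-envelope check (a sketch only, assuming CIFAR-10's 50,000 training images and the 5 iters/s figure above; actual wall-clock time depends on hardware):

```python
# Rough training-time estimate from the reported throughput; the numbers
# below are assumptions for illustration, not measurements from this repository.
train_images = 50000      # CIFAR-10 training set size
batch_size = 64
iters_per_sec = 5         # reported speed on one TITAN X
epochs = 300

iters_per_epoch = train_images / float(batch_size)
total_hours = epochs * iters_per_epoch / iters_per_sec / 3600.0
print("%d iters/epoch, about %.0f hours for %d epochs"
      % (iters_per_epoch, total_hours, epochs))
# -> 781 iters/epoch, about 13 hours for 300 epochs
```

## Training curves on CIFAR-10+ (~5.77% after 300 epochs)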
![cifar10](cifar10.png)
## Training curves on CIFAR-100+ (~26.36% after 300 epochs)
![cifar100](cifar100.png)
## Differences compared to the original [torch implementation](https://github.com/liuzhuang13/DenseNet)
+ Preprocessing is not channel-wise; instead we use the mean and variance of the images (see the sketch after this list).
+ No momentum or weight decay is applied to the batch normalization parameters (gamma and beta), whereas the torch version applies both to those parameters.
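The NumPy snippet below is a minimal sketch of the preprocessing distinction in the first bullet; it is not the repository's actual pipeline, and the dummy batch and statistics are purely illustrative.

```python
import numpy as np

# Illustrative contrast between channel-wise normalization (as in the torch
# implementation) and normalization with whole-image statistics.
images = np.random.rand(10, 32, 32, 3).astype(np.float32)  # dummy CIFAR-like batch

# Channel-wise: one mean/std per RGB channel.
ch_mean = images.mean(axis=(0, 1, 2), keepdims=True)
ch_std = images.std(axis=(0, 1, 2), keepdims=True)
channelwise = (images - ch_mean) / ch_std

# Image-level: statistics computed over the images as a whole, not per channel.
img_mean = images.mean()
img_std = images.std()
imagewise = (images - img_mean) / img_std
```

## Questions?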
Please drop [me](http://www.cs.cornell.edu/~yli) a line if you have any questions!