Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/melodyguan/enas
TensorFlow Code for paper "Efficient Neural Architecture Search via Parameter Sharing"
- Host: GitHub
- URL: https://github.com/melodyguan/enas
- Owner: melodyguan
- License: apache-2.0
- Created: 2018-02-23T21:56:49.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2019-06-18T14:02:10.000Z (over 5 years ago)
- Last Synced: 2024-08-01T08:06:41.144Z (4 months ago)
- Language: Python
- Homepage: https://arxiv.org/abs/1802.03268
- Size: 3.8 MB
- Stars: 1,585
- Watchers: 76
- Forks: 390
- Open Issues: 85
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-automated-machine-learning - [code](https://github.com/melodyguan/enas)
- awesome-production-machine-learning - ENAS via Parameter Sharing - Efficient Neural Architecture Search via Parameter Sharing by [authors of paper](https://arxiv.org/abs/1802.03268). (Neural Architecture Search)
- Awesome-AIML-Data-Ops - ENAS via Parameter Sharing - Efficient Neural Architecture Search via Parameter Sharing by [authors of paper](https://arxiv.org/abs/1802.03268). (Neural Architecture Search)
- awesome-production-machine-learning - ENAS via Parameter Sharing - Efficient Neural Architecture Search via Parameter Sharing by [authors of paper](https://arxiv.org/abs/1802.03268). (AutoML)
- awesome-AutoML-and-Lightweight-Models - melodyguan/enas
README
# Efficient Neural Architecture Search via Parameter Sharing
Authors' implementation of "Efficient Neural Architecture Search via Parameter Sharing" (2018) in TensorFlow.
Includes code for CIFAR-10 image classification and Penn Tree Bank language modeling tasks.
Paper: https://arxiv.org/abs/1802.03268
Authors: Hieu Pham*, Melody Y. Guan*, Barret Zoph, Quoc V. Le, Jeff Dean
_This is not an official Google product._
## Penn Treebank
**IMPORTANT ERRATA**: The language model implementation in this repository is incorrect. Please do not use it. The correct implementation is in the [new repository](https://github.com/google-research/google-research/tree/master/enas_lm). We apologize for the inconvenience.
## CIFAR-10
To run the experiments on CIFAR-10, please first download the [dataset](https://www.cs.toronto.edu/~kriz/cifar.html). All hyper-parameters are specified in the scripts described below.
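For reference, here is a minimal Python sketch of fetching the Python (pickle) version of the dataset. The target directory `data/cifar10` is an assumption, not something the repository guarantees; check the scripts under `./scripts/` for the exact data path they expect.

```
# Hypothetical helper for fetching CIFAR-10; the data directory below is an
# assumption -- adjust it to match the path used by the scripts in ./scripts/.
import os
import tarfile
import urllib.request

CIFAR10_URL = "https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz"
DATA_DIR = "data/cifar10"  # assumed location

def download_cifar10(data_dir=DATA_DIR):
    os.makedirs(data_dir, exist_ok=True)
    archive = os.path.join(data_dir, "cifar-10-python.tar.gz")
    if not os.path.exists(archive):
        urllib.request.urlretrieve(CIFAR10_URL, archive)
    # The archive unpacks to cifar-10-batches-py/ containing the pickled batches.
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(path=data_dir)

if __name__ == "__main__":
    download_cifar10()
```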
To run the ENAS experiments on the _macro search space_ as described in our paper, please use the following scripts:
```
./scripts/cifar10_macro_search.sh
./scripts/cifar10_macro_final.sh
```

A macro architecture for a neural network with `N` layers consists of `N` parts, indexed by `1, 2, 3, ..., N`. Part `i` consists of:
* A number in `[0, 1, 2, 3, 4, 5]` that specifies the operation at the `i`-th layer, corresponding to `conv_3x3`, `separable_conv_3x3`, `conv_5x5`, `separable_conv_5x5`, `average_pooling`, `max_pooling`.
* A sequence of `i - 1` numbers, each either `0` or `1`, indicating whether a skip connection should be formed from the corresponding past layer to the current layer.

A concrete example can be found in our script `./scripts/cifar10_macro_final.sh`.
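As an illustration of this flat encoding (the helper below is hypothetical, not code from this repository), a small Python sketch that splits such a description into per-layer operations and skip connections might look like this:

```
# Illustrative parser for the macro architecture encoding described above.
MACRO_OPS = ["conv_3x3", "separable_conv_3x3", "conv_5x5",
             "separable_conv_5x5", "average_pooling", "max_pooling"]

def parse_macro_arc(flat):
    """Split a flat list of integers into (op_name, skip_bits) per layer.

    Part i (1-indexed) is one op id in [0, 5] followed by i - 1 skip bits;
    skip bit j says whether layer j feeds a skip connection into layer i.
    """
    layers, pos, i = [], 0, 1
    while pos < len(flat):
        op = MACRO_OPS[flat[pos]]
        skips = flat[pos + 1: pos + i]  # the i - 1 skip bits for layer i
        layers.append((op, skips))
        pos += i  # advance past 1 op id + (i - 1) skip bits
        i += 1
    return layers

# Example: a 3-layer macro architecture.
# Layer 1: conv_3x3; layer 2: max_pooling with a skip from layer 1;
# layer 3: separable_conv_5x5 with skips from layers 1 and 2.
print(parse_macro_arc([0, 5, 1, 3, 1, 1]))
```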
To run the ENAS experiments on the _micro search space_ as described in our paper, please use the following scripts:
```
./scripts/cifar10_micro_search.sh
./scripts/cifar10_micro_final.sh
```

A micro cell with `B + 2` blocks can be specified using `B` blocks, corresponding to blocks numbered `2, 3, ..., B + 1`. Each block consists of `4` numbers:
```
index_1, op_1, index_2, op_2
```
Here, `index_1` and `index_2` can be any previous index. `op_1` and `op_2` can be `[0, 1, 2, 3, 4]`, corresponding to `separable_conv_3x3`, `separable_conv_5x5`, `average_pooling`, `max_pooling`, `identity`.

A micro architecture can be specified by two sequences of cells concatenated one after the other, as shown in our script `./scripts/cifar10_micro_final.sh`.
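For illustration only (the helper below is hypothetical, not part of this repository), a short Python sketch that groups such a flat description into blocks:

```
# Illustrative decoder for the micro cell encoding described above.
MICRO_OPS = ["separable_conv_3x3", "separable_conv_5x5",
             "average_pooling", "max_pooling", "identity"]

def parse_micro_cell(flat):
    """Group a flat list of 4 * B integers into B blocks.

    Block b (numbered 2, 3, ..., B + 1) reads two previous outputs, selected
    by index_1 and index_2, applies op_1 and op_2 to them, and sums the two
    results (as in the paper).
    """
    assert len(flat) % 4 == 0
    blocks = []
    for pos in range(0, len(flat), 4):
        index_1, op_1, index_2, op_2 = flat[pos:pos + 4]
        blocks.append((index_1, MICRO_OPS[op_1], index_2, MICRO_OPS[op_2]))
    return blocks

# Example: a cell with B = 2 blocks (blocks 2 and 3).
print(parse_micro_cell([0, 0, 1, 4,    # block 2: sep_conv_3x3(0) + identity(1)
                        1, 2, 2, 1]))  # block 3: avg_pool(1) + sep_conv_5x5(2)
```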
## Citations
If you happen to use our work, please consider citing our paper.
```
@inproceedings{enas,
title = {Efficient Neural Architecture Search via Parameter Sharing},
author = {Pham, Hieu and
Guan, Melody Y. and
Zoph, Barret and
Le, Quoc V. and
Dean, Jeff
},
booktitle = {ICML},
year = {2018}
}
```