# Automated Deep Learning Projects (AutoDL-Projects)
[![MIT licensed](https://img.shields.io/badge/license-MIT-brightgreen.svg)](LICENSE.md)

Automated Deep Learning Projects (AutoDL-Projects) is an open-source, lightweight, and practical project for everyone.
It implements several neural architecture search (NAS) and hyper-parameter optimization (HPO) algorithms.
A Chinese introduction is available in [README_CN.md](https://github.com/D-X-Y/AutoDL-Projects/tree/main/docs/README_CN.md).

**Who should consider using AutoDL-Projects**

- Beginners who want to **try different AutoDL algorithms**
- Engineers who want to **try AutoDL** to investigate whether AutoDL works on their projects
- Researchers who want to **easily** implement and experiment with **new** AutoDL algorithms.

**Why should we use AutoDL-Projects**
- Simple library dependencies
- All algorithms are in the same codebase
- Active maintenance

## AutoDL-Projects Capabilities

This project currently provides the following algorithms, together with scripts to run them. Please see the document linked in the Description column for details.



| Type | ABBRV | Algorithms | Description |
|------|-------|------------|-------------|
| NAS | TAS | Network Pruning via Transformable Architecture Search | NeurIPS-2019-TAS.md |
| NAS | DARTS | DARTS: Differentiable Architecture Search | ICLR-2019-DARTS.md |
| NAS | GDAS | Searching for A Robust Neural Architecture in Four GPU Hours | CVPR-2019-GDAS.md |
| NAS | SETN | One-Shot Neural Architecture Search via Self-Evaluated Template Network | ICCV-2019-SETN.md |
| NAS | NAS-Bench-201 | NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search | NAS-Bench-201.md |
| NAS | NATS-Bench | NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size | NATS-Bench.md |
| NAS | ... | ENAS / REA / REINFORCE / BOHB | Please check the original papers (NAS-Bench-201.md, NATS-Bench.md) |
| HPO | HPO-CG | Hyperparameter optimization with approximate gradient | coming soon |
| Basic | ResNet | Deep Learning-based Image Classification | BASELINE.md |
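
For readers unfamiliar with the differentiable NAS entries above (DARTS, GDAS), here is a minimal, self-contained PyTorch sketch of the underlying idea: a single edge that softly mixes candidate operations with learnable architecture weights. It is illustrative only, uses a hypothetical tiny operation set, and is not the implementation shipped in this repository.

```
# Toy sketch (not this repo's code): the core idea behind differentiable NAS
# methods such as DARTS/GDAS -- a "mixed" edge that softly combines candidate
# operations using learnable architecture parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Hypothetical, tiny candidate-operation set for illustration only.
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        # One architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


if __name__ == "__main__":
    edge = MixedOp(channels=8)
    out = edge(torch.randn(2, 8, 16, 16))
    print(out.shape)  # torch.Size([2, 8, 16, 16])
```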

## Requirements and Preparation

**First of all**, please use `pip install .` to install the `xautodl` library.

Please install `Python>=3.6` and `PyTorch>=1.5.0`. (Lower versions of Python and PyTorch may work, but they could trigger bugs.)
Some visualization code may require `opencv`.
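
As a quick, hedged sanity check (assuming the install steps above succeeded), the snippet below only verifies that the interpreter, PyTorch, and the installed `xautodl` package are importable; it is not part of the project itself.

```
# Minimal environment check -- not part of AutoDL-Projects itself.
import sys

import torch
import xautodl  # installed above via `pip install .`

assert sys.version_info >= (3, 6), "Python>=3.6 is recommended"
print("python :", sys.version.split()[0])
print("torch  :", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("xautodl:", xautodl.__file__)
```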

CIFAR and ImageNet should be downloaded and extracted into `$TORCH_HOME`.
Some methods use knowledge distillation (KD), which requires pre-trained models. Please download these models from [Google Drive](https://drive.google.com/open?id=1ANmiYEGX-IQZTfH8w0aSpj-Wypg-0DR-) (or train them yourself) and save them into `.latent-data`.
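
The snippet below is only an illustrative check of this layout; the dataset folder names (`cifar.python`, `ILSVRC2012`) and the `~/.torch` fallback are assumptions, so adjust them to match how you actually extracted the data.

```
# Illustrative layout check only -- folder names are assumptions, not a
# required structure; edit them to match your own extraction paths.
import os
from pathlib import Path

torch_home = Path(os.environ.get("TORCH_HOME", "~/.torch")).expanduser()
for name in ("cifar.python", "ILSVRC2012"):  # hypothetical dataset folders
    path = torch_home / name
    print(f"{path} exists: {path.exists()}")

kd_dir = Path(".latent-data")  # pre-trained KD checkpoints go here
print(f"{kd_dir} exists: {kd_dir.exists()}")
```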

Please use
```
git clone --recurse-submodules https://github.com/D-X-Y/AutoDL-Projects.git XAutoDL
```
to clone this repository together with its submodules.

## Citation

If you find that this project helps your research, please consider citing the related papers:
```
@inproceedings{dong2021autohas,
title = {{AutoHAS}: Efficient Hyperparameter and Architecture Search},
author = {Dong, Xuanyi and Tan, Mingxing and Yu, Adams Wei and Peng, Daiyi and Gabrys, Bogdan and Le, Quoc V},
booktitle = {2nd Workshop on Neural Architecture Search at International Conference on Learning Representations (ICLR)},
year = {2021}
}
@article{dong2021nats,
title = {{NATS-Bench}: Benchmarking NAS Algorithms for Architecture Topology and Size},
author = {Dong, Xuanyi and Liu, Lu and Musial, Katarzyna and Gabrys, Bogdan},
doi = {10.1109/TPAMI.2021.3054824},
journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
year = {2021},
note = {\mbox{doi}:\url{10.1109/TPAMI.2021.3054824}}
}
@inproceedings{dong2020nasbench201,
title = {{NAS-Bench-201}: Extending the Scope of Reproducible Neural Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {International Conference on Learning Representations (ICLR)},
url = {https://openreview.net/forum?id=HJxyZkBKDr},
year = {2020}
}
@inproceedings{dong2019tas,
title = {Network Pruning via Transformable Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Neural Information Processing Systems (NeurIPS)},
pages = {760--771},
year = {2019}
}
@inproceedings{dong2019one,
title = {One-Shot Neural Architecture Search via Self-Evaluated Template Network},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
pages = {3681--3690},
year = {2019}
}
@inproceedings{dong2019search,
title = {Searching for A Robust Neural Architecture in Four GPU Hours},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
pages = {1761--1770},
year = {2019}
}
```

# Others

If you want to contribute to this repo, please see [CONTRIBUTING.md](.github/CONTRIBUTING.md).
Please also follow [CODE-OF-CONDUCT.md](.github/CODE-OF-CONDUCT.md).

We use [`black`](https://github.com/psf/black) as the Python code formatter.
Please run `black . -l 88`.

# License
The entire codebase is under the [MIT license](LICENSE.md).