{"id":13869954,"url":"https://github.com/automl/NASLib","last_synced_at":"2025-07-15T20:30:56.397Z","repository":{"id":37868477,"uuid":"185628775","full_name":"automl/NASLib","owner":"automl","description":" NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.","archived":false,"fork":false,"pushed_at":"2024-11-11T15:56:07.000Z","size":544418,"stargazers_count":525,"open_issues_count":35,"forks_count":118,"subscribers_count":18,"default_branch":"Develop","last_synced_at":"2024-11-11T16:41:20.902Z","etag":null,"topics":["automl","nas","neural-architecture-search"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/automl.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"docs/contributing.html","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":".github/CODEOWNERS","security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-05-08T15:03:37.000Z","updated_at":"2024-11-11T14:22:18.000Z","dependencies_parsed_at":"2024-11-11T16:42:10.482Z","dependency_job_id":null,"html_url":"https://github.com/automl/NASLib","commit_stats":null,"previous_names":[],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/automl%2FNASLib","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/automl%2FNASLib/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/automl%2FNASLib/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositori
es/automl%2FNASLib/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/automl","download_url":"https://codeload.github.com/automl/NASLib/tar.gz/refs/heads/Develop","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":226068121,"owners_count":17568701,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["automl","nas","neural-architecture-search"],"created_at":"2024-08-05T20:01:23.305Z","updated_at":"2024-11-23T16:30:54.663Z","avatar_url":"https://github.com/automl.png","language":"Python","funding_links":[],"categories":["Python","Uncategorized"],"sub_categories":["Uncategorized"],"readme":"\u003cdiv align=\"center\"\u003e\n  ** For the \u003ca href='https://codalab.lisn.upsaclay.fr/competitions/3932'\u003eZero-Cost NAS Competition\u003c/a\u003e, please switch to the \u003ca href='https://github.com/automl/NASLib/tree/automl-conf-competition'\u003e\u003ccode\u003eautoml-conf-competition\u003c/code\u003e\u003c/a\u003e branch ** \u003cbr\u003e\u003cbr\u003e\n\n  \u003cimg src=\"images/naslib-logo.png\" width=\"400\" height=\"250\"\u003e\n\u003c/div\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003ca href=\"https://github.com/automl/NASLib\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/Python-3.7%20%7C%203.8-blue?style=for-the-badge\u0026logo=python\" /\u003e\n  \u003c/a\u003e\u0026nbsp;\n  \u003ca href=\"https://pytorch.org/\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/pytorch-1.9-orange?style=for-the-badge\u0026logo=pytorch\" alt=\"PyTorch Version\" /\u003e\n  \u003c/a\u003e\u0026nbsp;\n  \u003ca 
href=\"https://github.com/automl/NASLib\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/open-source-9cf?style=for-the-badge\u0026logo=Open-Source-Initiative\" alt=\"Open Source\" /\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://github.com/automl/NASLib\"\u003e\n    \u003cimg src=\"https://img.shields.io/github/stars/automl/naslib?style=for-the-badge\u0026logo=github\" alt=\"GitHub Repo Stars\" /\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://repobeats.axiom.co/api/embed/44112451b6b665f03b7c7dc35dbeff8050df036f.svg\" width=\"750\" /\u003e\n\u003c/p\u003e\n\n\n\n\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\u0026nbsp;\n**NASLib** is a modular and flexible framework created with the aim of providing a common codebase to the community to facilitate research on **Neural Architecture Search** (NAS). It offers high-level abstractions for designing and reusing search spaces, as well as interfaces to benchmarks and evaluation pipelines, enabling the implementation and extension of state-of-the-art NAS methods in a few lines of code. The modular nature of NASLib\nallows researchers to easily innovate on individual components (e.g., define a new\nsearch space while reusing an optimizer and evaluation pipeline, or propose a new\noptimizer with existing search spaces). It is designed to be modular, extensible and easy to use.\n\nNASLib was developed by the [**AutoML Freiburg group**](https://www.automl.org/team/). With the help of the NAS community, we are constantly adding new _search spaces_, _optimizers_ and _benchmarks_ to the library. Please reach out to zelaa@cs.uni-freiburg.de for any questions or potential collaborations. 
\n\n![naslib-overview](images/naslib-overall.png)\n\n[**Setup**](#setup)\n| [**Usage**](#usage)\n| [**Docs**](https://automl.github.io/NASLib/)\n| [**Contributing**](#contributing)\n| [**Cite**](#cite)\n\n# Setup\n\nWhen installing the repository, creating a new conda environment is recommended. [Install PyTorch GPU/CPU](https://pytorch.org/get-started/locally/) for your setup.\n\n```bash\nconda create -n mvenv python=3.7\nconda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c nvidia\n```\n\nThen install NASLib in editable mode with the following commands, which will also install all the packages listed in [`requirements.txt`](requirements.txt):\n```bash\npip install --upgrade pip setuptools wheel\npip install -e .\n```\n\nTo validate the setup, you can run the tests:\n\n```bash\ncd tests\ncoverage run -m unittest discover -v\n```\n\nThe test coverage can be seen with `coverage report`.\n\n## Queryable Benchmarks\nNASLib allows you to query the following (tabular and surrogate) benchmarks for the performance of any architecture, for a given search space, dataset and task. 
To set them up, simply download the benchmark data files from these URLs and place them in `naslib/data`.\n\n| Benchmark     | Task                               | Datasets |        Data URL       | Required Files |\n|---------------|------------------------------------|----------|-----------------------|----------------|\n|NAS-Bench-101  | Image Classification   |                    CIFAR10                   | [cifar10](https://drive.google.com/file/d/1oORtEmzyfG1GcnPHh0ijCs0gCHKEThNx/view?usp=sharing)| `naslib/data/nasbench_only108.pkl` |\n|NAS-Bench-201  | Image Classification   |  CIFAR10 \u003cbr\u003e CIFAR100 \u003cbr\u003e ImageNet16-120   | [cifar10](https://drive.google.com/file/d/1sh8pEhdrgZ97-VFBVL94rI36gedExVgJ/view?usp=sharing) \u003cbr\u003e [cifar100](https://drive.google.com/file/d/1hV6-mCUKInIK1iqZ0jfBkcKaFmftlBtp/view?usp=sharing) \u003cbr\u003e [imagenet](https://drive.google.com/file/d/1FVCn54aQwD6X6NazaIZ_yjhj47mOGdIH/view?usp=sharing)| `naslib/data/nb201_cifar10_full_training.pickle` \u003cbr\u003e `naslib/data/nb201_cifar100_full_training.pickle` \u003cbr\u003e  `naslib/data/nb201_ImageNet16_full_training.pickle`|\n|NAS-Bench-301  | Image Classification   |                    CIFAR10                   |  [cifar10](https://drive.google.com/file/d/1YJ80Twt9g8Gaf8mMgzK-f5hWaVFPlECF/view?usp=sharing)\u003cbr\u003e [models](https://figshare.com/articles/software/nasbench301_models_v1_0_zip/13061510) |`naslib/data/nb301_full_training.pickle` \u003cbr\u003e `naslib/data/nb_models/...`|\n|NAS-Bench-ASR  | Automatic Speech Recognition  |               TIMIT                   |  [timit](https://github.com/SamsungLabs/nb-asr/releases/tag/v1.1.0) | `naslib/data/nb-asr-bench-gtx-1080ti-fp32.pickle` \u003cbr\u003e `naslib/data/nb-asr-bench-jetson-nano-fp32.pickle` \u003cbr\u003e `naslib/data/nb-asr-e40-1234.pickle` \u003cbr\u003e `naslib/data/nb-asr-e40-1235.pickle` \u003cbr\u003e `naslib/data/nb-asr-e40-1236.pickle` \u003cbr\u003e 
`naslib/data/nb-asr-info.pickle`\n|NAS-Bench-NLP  | Natural Language Processing   |           Penn Treebank               |               [ptb](https://drive.google.com/file/d/1DtrmuDODeV2w5kGcmcHcGj5JXf2qWg01/view?usp=sharing), [models](https://drive.google.com/file/d/13Kbn9VWHuBdSN3lG4Mbyr2-VdrTsfLfd/view?usp=sharing)| `naslib/data/nb_nlp.pickle` \u003cbr\u003e `naslib/data/nbnlp_v01/...`|\n|TransNAS-Bench-101  | 7 Computer Vision tasks  |             Taskonomy                 | [taskonomy](https://www.noahlab.com.hk/opensource/vega/page/doc.html?path=datasets/transnasbench101) |`naslib/data/transnas-bench_v10141024.pth`|\n\nFor `NAS-Bench-301` and `NAS-Bench-NLP`, you will additionally have to install the NASBench301 API from [here](https://github.com/crwhite14/nasbench301).\n\nOnce set up, you can test whether the APIs work as follows:\n```bash\npython test_benchmark_apis.py --all --show_error\n```\n\nYou can also test a single API:\n```bash\npython test_benchmark_apis.py --search_space \u003csearch_space\u003e --show_error\n```\n\n# Usage\n\nTo get started, check out [`demo.py`](examples/demo.py).\n\n```python\nsearch_space = SimpleCellSearchSpace()\n\noptimizer = DARTSOptimizer(**config.search)\noptimizer.adapt_search_space(search_space, config.dataset)\n\ntrainer = Trainer(optimizer, config)\ntrainer.search()        # Search for an architecture\ntrainer.evaluate()      # Evaluate the best architecture\n```\n\nFor more examples, see the [NASLib tutorial](examples/naslib_tutorial.ipynb), [intro to search spaces](examples/search_spaces.ipynb) and [intro to predictors](examples/predictors.md).\n\n### Scripts for running multiple experiments on a cluster\nThe `scripts` folder contains code for generating config files for running experiments across various configurations and seeds. It writes them into the `naslib/configs` folder. 
\n\n```bash\ncd scripts\nbash bbo/make_configs_asr.sh\n```\n\nIt also contains `scheduler.sh` files that automatically read these generated config files and submit a corresponding job to the cluster using SLURM.\n\n## Contributing\nWe welcome contributions to the library, along with any potential issues or suggestions. Please create a pull request to the Develop branch.\n\n\n## Cite\n\nIf you use this code in your own work, please use the following bibtex entries:\n\n```bibtex\n@misc{naslib-2020,\n  title={NASLib: A Modular and Flexible Neural Architecture Search Library},\n  author={Ruchte, Michael and Zela, Arber and Siems, Julien and Grabocka, Josif and Hutter, Frank},\n  year={2020}, publisher={GitHub},\n  howpublished={\\url{https://github.com/automl/NASLib}} }\n\n@inproceedings{mehta2022bench,\n  title={NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy},\n  author={Mehta, Yash and White, Colin and Zela, Arber and Krishnakumar, Arjun and Zabergja, Guri and Moradian, Shakiba and Safari, Mahmoud and Yu, Kaicheng and Hutter, Frank},\n  booktitle={International Conference on Learning Representations},\n  year={2022}\n}\n```\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"images/predictors.png\" alt=\"predictors\" width=\"75%\"\u003e\n\u003c/p\u003e\n\nNASLib has been used to run an extensive comparison of 31 performance predictors (figure above). 
See the separate readme: \u003ca href=\"examples/predictors.md\"\u003epredictors.md\u003c/a\u003e\nand our paper: \u003ca href=\"https://arxiv.org/abs/2104.01177\"\u003eHow Powerful are Performance Predictors in Neural Architecture Search?\u003c/a\u003e\n\n```bibtex\n@article{white2021powerful,\n  title={How Powerful are Performance Predictors in Neural Architecture Search?},\n  author={White, Colin and Zela, Arber and Ru, Robin and Liu, Yang and Hutter, Frank},\n  journal={Advances in Neural Information Processing Systems},\n  volume={34},\n  year={2021}\n}\n```\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fautoml%2FNASLib","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fautoml%2FNASLib","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fautoml%2FNASLib/lists"}