# PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search

PreNAS is a learning paradigm that integrates one-shot and zero-shot NAS techniques to improve both search efficiency and training effectiveness.
This search-free approach outperforms state-of-the-art one-shot NAS methods for both Vision Transformer and convolutional architectures.

>Wang H, Ge C, Chen H and Sun X. PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search. ICML 2023.

Paper link: [arXiv](https://arxiv.org/abs/2304.14636)

## Overview
<br>
<div align="center"><img width="85%" src="figure/overview.svg"></div>
<br>
Previous one-shot NAS methods sample all architectures in the search space during one-shot training of the supernet, so that any candidate can later be evaluated in an evolutionary search.
Instead, PreNAS first identifies the target architectures via a zero-cost proxy and then applies preferred one-shot training to the supernet.
PreNAS improves the Pareto frontier thanks to preferred one-shot learning, and is search-free after training: it directly provides models with the architectures pre-selected by the zero-cost search.

## Environment Setup

To set up the environment, run the following commands:
```shell
conda create -n PreNAS python=3.7
conda activate PreNAS
pip install -r requirements.txt
```

## Data Preparation
Download [ImageNet-2012](http://www.image-net.org/) to the folder `../data/imagenet`.

## Run example
The code was run on 8 x 80G A100 GPUs.
- Zero-Shot Search

  `bash 01_zero_shot_search.sh`

- One-Shot Training

  `bash 02_one_shot_training.sh`

- Evaluation

  `bash 03_evaluation.sh`

## Model Zoo

| Model        | TOP-1 (%)  | TOP-5 (%)     | #Params (M)   | FLOPs (G) | Download Link |
| ------------ | ---------- | ------------- | ------------- | --------- | ------------- |
| PreNAS-Ti    | 77.1       | 93.4          | 5.9           | 1.4       | [AliCloud](https://idstcv.oss-cn-zhangjiakou.aliyuncs.com/PreNAS/supernet-tiny.pth)    |
| PreNAS-S     | 81.8       | 95.9          | 22.9          | 5.1       | [AliCloud](https://idstcv.oss-cn-zhangjiakou.aliyuncs.com/PreNAS/supernet-small.pth)    |
| PreNAS-B     | 82.6       | 96.0          | 54            | 11        | [AliCloud](https://idstcv.oss-cn-zhangjiakou.aliyuncs.com/PreNAS/supernet-base.pth)    |

## Bibtex

If PreNAS is useful for you, please consider citing it. Thank you! :)
```bibtex
@InProceedings{PreNAS,
    title     = {PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search},
    author    = {Wang, Haibin and Ge, Ce and Chen, Hesen and Sun, Xiuyu},
    booktitle = {International Conference on Machine Learning (ICML)},
    month     = {July},
    year      = {2023}
}
```

## Acknowledgements

The code is inspired by [AutoFormer](https://github.com/microsoft/Cream/tree/main/AutoFormer).
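## Toy sketch of the two-stage idea

The sketch below is a minimal, illustrative toy of the PreNAS paradigm, not the repository's actual search or training code: it scores candidate "architectures" (here just input widths of a plain-Python linear model) with a synaptic-saliency-style zero-cost proxy at random initialization, keeps only the preferred top-k, and then draws every training-step subnet from that preferred set instead of the whole search space. All function names and the linear-model setup are hypothetical simplifications.

```python
import random


def zero_cost_score(weights, xs, ys):
    """Synaptic-saliency-style proxy: sum_i |w_i * dL/dw_i| for a linear
    model y_hat = w . x with mean squared error, evaluated at random init
    on one minibatch (no training needed)."""
    n = len(xs)
    grads = [0.0] * len(weights)
    for x, y in zip(xs, ys):
        y_hat = sum(w * xi for w, xi in zip(weights, x))
        err = 2.0 * (y_hat - y) / n  # d(MSE)/d(y_hat) for this sample
        for i, xi in enumerate(x):
            grads[i] += err * xi      # accumulate dL/dw_i
    return sum(abs(w * g) for w, g in zip(weights, grads))


def select_preferred(candidate_widths, n_keep, xs_full, ys, seed=0):
    """Stage 1 (zero-cost search): rank each candidate width by the proxy
    and keep the top n_keep -- only these will be trained."""
    rng = random.Random(seed)
    scored = []
    for width in candidate_widths:
        weights = [rng.gauss(0, 1) for _ in range(width)]
        xs = [x[:width] for x in xs_full]  # toy "architecture" = input width
        scored.append((zero_cost_score(weights, xs, ys), width))
    scored.sort(reverse=True)
    return [width for _, width in scored[:n_keep]]


def preferred_one_shot_schedule(preferred, n_steps, seed=0):
    """Stage 2 (preferred one-shot training): unlike vanilla one-shot NAS,
    which samples subnets from the entire search space, each training step
    samples only from the pre-selected set, concentrating supernet capacity
    on architectures that will actually be deployed."""
    rng = random.Random(seed)
    return [rng.choice(preferred) for _ in range(n_steps)]
```

After training, the method is search-free: the deployed models are simply the pre-selected architectures, so no evolutionary search over the supernet is needed.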