{"id":13671372,"url":"https://github.com/ziplab/EcoFormer","last_synced_at":"2025-04-27T18:31:12.037Z","repository":{"id":59381388,"uuid":"536941160","full_name":"ziplab/EcoFormer","owner":"ziplab","description":"[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of \"EcoFormer: Energy-Saving Attention with Linear Complexity\"","archived":false,"fork":false,"pushed_at":"2022-11-15T04:08:04.000Z","size":805,"stargazers_count":66,"open_issues_count":0,"forks_count":1,"subscribers_count":5,"default_branch":"main","last_synced_at":"2024-08-03T09:09:29.808Z","etag":null,"topics":["classification","efficient-transformers","neurips-2022","pytorch","vision-transformer"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ziplab.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2022-09-15T08:51:52.000Z","updated_at":"2024-01-11T19:52:16.000Z","dependencies_parsed_at":"2023-01-23T04:01:23.090Z","dependency_job_id":null,"html_url":"https://github.com/ziplab/EcoFormer","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ziplab%2FEcoFormer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ziplab%2FEcoFormer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ziplab%2FEcoFormer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ziplab%2FEcoFormer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ziplab","download_url":"https://codeload.github.com/ziplab/EcoFormer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":224079330,"owners_count":17252267,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["classification","efficient-transformers","neurips-2022","pytorch","vision-transformer"],"created_at":"2024-08-02T09:01:07.871Z","updated_at":"2024-11-11T09:30:21.625Z","avatar_url":"https://github.com/ziplab.png","language":"Python","readme":"# EcoFormer: Energy-Saving Attention with Linear Complexity（NeurIPS 2022 Spotlight) 🚀\n\u003ca href=\"https://arxiv.org/abs/2209.09004\"\u003e\u003cimg src=\"https://img.shields.io/badge/arXiv-2209.09004-b31b1b.svg\" height=22.5\u003e\u003c/a\u003e \n[![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) \n\u003ca href=\"https://pytorch.org/get-started/locally/\"\u003e\u003cimg alt=\"PyTorch\" src=\"https://img.shields.io/badge/PyTorch-ee4c2c?logo=pytorch\u0026logoColor=white\"\u003e\u003c/a\u003e\n\nThis is the official PyTorch implementation of [EcoFormer: Energy-Saving Attention with Linear Complexity](https://arxiv.org/abs/2209.09004) by [Jing Liu](https://scholar.google.com/citations?user=-lHaZH4AAAAJ\u0026hl=en), [Zizheng Pan](https://scholar.google.com.au/citations?user=w_VMopoAAAAJ\u0026hl=en), [Haoyu He](https://scholar.google.com.au/citations?user=aU1zMhUAAAAJ\u0026hl=en), [Jianfei Cai](https://scholar.google.com/citations?user=N6czCoUAAAAJ\u0026hl=en), and [Bohan Zhuang](https://scholar.google.com.au/citations?user=DFuDBBwAAAAJ).\n\n## News\n\n- **11/11/2022.** EcoFormer is selected as Spotlight!\n- **08/10/2022.** We release the source code. Any issues are welcomed!\n- **15/09/2022.** EcoFormer is accepted by NeurIPS 2022! 🔥🔥🔥\n\n## A Gentle Introduction\n![EcoFormer](framework.png)\n\nWe present a novel energy-saving attention mechanism with linear complexity, called EcoFormer, to save the vast majority of multiplications from a new binarization perspective. More details can be found in our [paper](https://arxiv.org/abs/2209.09004).\n\n## Installation\n\n### Requirements\n\n- Python ≥ 3.8\n- PyTorch 1.10.1\n- CUDA 11.1\n- Torchvision 0.11.2\n- PyTorch Image Models (timm) 0.4.9\n- MMCV 1.3.8\n- Einops 0.4.1\n- SciPy 1.8.0\n\n### Instructions\n\nUse [Anaconda](https://www.anaconda.com) to create the running environment for the project, kindly run\n\n```bash\ngit clone https://github.com/ziplab/EcoFormer\ncd EcoFormer\nconda env create -f environment/environment.yml\nconda activate ecoformer\n```\n\n**Note**: If the above instructions does not work on your machine, please refer to [environment/README.md](./environment/README.md) for manual installation and trouble shootings.\n\n## Getting Started\n\nFor experiments on PVTv2, please refer to [pvt](./pvt).\n\nFor experiments on twins, please refer to [twins](./twins).\n\n## Results and Model Zoo\n\n| Model       | #Mul. (B) | #Add. (B) | Energy (B pJ) | Throughput (images/s) | Top-1 Acc. (%) | Download                                                                                       |\n| ----------- | --------- | --------- | ------------- | --------------------- | -------------- | ---------------------------------------------------------------------------------------------- |\n| PVTv2-B0    | 0.54      | 0.56      | 2.5           | 1379                  | 70.44          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/pvtv2_b0_ecoformer.pth)    |\n| PVTv2-B1    | 2.03      | 2.09      | 9.4           | 874                   | 78.38          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/pvtv2_b1_ecoformer.pth)    |\n| PVTv2-B2    | 3.85      | 3.97      | 17.8          | 483                   | 81.28          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/pvtv2_b2_ecoformer.pth)    |\n| PVTv2-B3    | 6.54      | 6.75      | 30.25         | 325                   | 81.96          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/pvtv2_b3_ecoformer.pth)    |\n| PVTv2-B4    | 9.57      | 9.82      | 44.25         | 249                   | 81.90          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/pvtv2_b4_ecoformer.pth)    |\n| Twins-SVT-S | 2.72      | 2.81      | 12.6          | 576                   | 80.22          | [Github](https://github.com/ziplab/EcoFormer/releases/download/v1.0/twins_svt_s_ecoformer.pth) |\n\n## Citation\nIf you find ``EcoFormer`` useful in your research, please consider to cite the following related papers:\n\n```BibTeX\n@inproceedings{liu2022ecoformer,\n  title={EcoFormer: Energy-Saving Attention with Linear Complexity},\n  author={Liu, Jing and Pan, Zizheng and He, Haoyu and Cai, Jianfei and Zhuang, Bohan},\n  booktitle={NeurIPS},\n  year={2022}\n}\n```\n\n## License\n\nThis repository is released under the Apache 2.0 license as found in the [LICENSE](./LICENSE) file.\n\n## Acknowledgement\n\nThis repository is built upon [PVT](https://github.com/whai362/PVT) and [Twins](https://github.com/Meituan-AutoML/Twins). We thank the authors for their open-sourced code.\n","funding_links":[],"categories":["Python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fziplab%2FEcoFormer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fziplab%2FEcoFormer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fziplab%2FEcoFormer/lists"}