{"id":13499106,"url":"https://github.com/ShichenLiu/CondenseNet","last_synced_at":"2025-03-29T03:32:22.594Z","repository":{"id":85878312,"uuid":"104818215","full_name":"ShichenLiu/CondenseNet","owner":"ShichenLiu","description":"CondenseNet: Light weighted CNN for mobile devices","archived":false,"fork":false,"pushed_at":"2019-11-11T03:38:46.000Z","size":29,"stargazers_count":693,"open_issues_count":11,"forks_count":130,"subscribers_count":23,"default_branch":"master","last_synced_at":"2025-03-28T17:11:11.292Z","etag":null,"topics":["deep-learning","mobile-device","pytorch"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ShichenLiu.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2017-09-26T01:03:24.000Z","updated_at":"2025-03-03T12:39:13.000Z","dependencies_parsed_at":"2023-06-18T06:15:13.166Z","dependency_job_id":null,"html_url":"https://github.com/ShichenLiu/CondenseNet","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ShichenLiu%2FCondenseNet","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ShichenLiu%2FCondenseNet/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ShichenLiu%2FCondenseNet/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ShichenLiu%2FCondenseNet/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ShichenLiu","download_url":"https://codeload.github.com/ShichenLi
u/CondenseNet/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246135765,"owners_count":20729056,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","mobile-device","pytorch"],"created_at":"2024-07-31T22:00:28.888Z","updated_at":"2025-03-29T03:32:22.575Z","avatar_url":"https://github.com/ShichenLiu.png","language":"Python","readme":"# CondenseNets\n\nThis repository contains the code (in PyTorch) for the paper \"[CondenseNet: An Efficient DenseNet using Learned Group Convolutions](https://arxiv.org/abs/1711.09224)\" by [Gao Huang](http://www.cs.cornell.edu/~gaohuang/)\\*, [Shichen Liu](https://shichenliu.github.io)\\*, [Laurens van der Maaten](https://lvdmaaten.github.io) and [Kilian Weinberger](https://www.cs.cornell.edu/%7Ekilian/) (* Authors contributed equally).\n\n### Citation\n\nIf you find our project useful in your research, please consider citing:\n\n```\n@inproceedings{huang2018condensenet,\n  title={Condensenet: An efficient densenet using learned group convolutions},\n  author={Huang, Gao and Liu, Shichen and Van der Maaten, Laurens and Weinberger, Kilian Q},\n  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},\n  pages={2752--2761},\n  year={2018}\n}\n```\n\n## Contents\n\n1. [Introduction](#introduction)\n2. [Usage](#usage)\n3. [Results](#results)\n4. [Contact](#contact)\n\n## Introduction\n\nCondenseNet is a novel, computationally efficient convolutional network architecture. 
It combines dense connectivity between layers with a mechanism to remove unused connections. The dense connectivity facilitates feature re-use in the network, whereas learned group convolutions remove connections between layers for which this feature re-use is superfluous. At test time, our model can be implemented using standard grouped convolutions, allowing for efficient computation in practice. Our experiments demonstrate that CondenseNets are much more efficient than other compact convolutional networks such as MobileNets and ShuffleNets.\n\n\u003cimg src=\"https://user-images.githubusercontent.com/9162722/32978657-b10fae0e-cc81-11e7-888d-1f9e4c028a9b.png\"\u003e\n\nFigure 1: Learned Group Convolution with G=C=3.\n\n\u003cimg src=\"https://user-images.githubusercontent.com/9162722/31302319-6ca3a49c-ab33-11e7-938c-70379feca5bc.jpg\" width=\"480\"\u003e\n\nFigure 2: CondenseNets with Fully Dense Connectivity and Increasing Growth Rate.\n\n## Usage\n\n### Dependencies\n\n- [Python3](https://www.python.org/downloads/)\n- [PyTorch (1.1.0)](http://pytorch.org)\n- [ImageNet](https://www.image-net.org/challenges/LSVRC/2012/)\n\n### Train\nAs an example, use the following command to train a CondenseNet on ImageNet:\n\n```\npython main.py --model condensenet -b 256 -j 20 /PATH/TO/IMAGENET \\\n--stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0,1,2,3,4,5,6,7 --resume\n```\n\nAs another example, use the following command to train a CondenseNet on CIFAR-10:\n\n```\npython main.py --model condensenet -b 64 -j 12 cifar10 \\\n--stages 14-14-14 --growth 8-16-32 --gpu 0 --resume\n```\n\n\n### Evaluation\nWe take the ImageNet model trained above as an example.\n\nTo evaluate the trained model, use `evaluate` to evaluate from the default checkpoint directory:\n\n```\npython main.py --model condensenet -b 64 -j 20 /PATH/TO/IMAGENET \\\n--stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 --resume \\\n--evaluate\n```\n\nor use `evaluate-from` to evaluate from an arbitrary 
path:\n\n```\npython main.py --model condensenet -b 64 -j 20 /PATH/TO/IMAGENET \\\n--stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 --resume \\\n--evaluate-from /PATH/TO/BEST/MODEL\n```\n\nNote that these models are still the large models. To convert a model to the group-convolution version described in the paper, use the `convert-from` option:\n\n```\npython main.py --model condensenet -b 64 -j 20 /PATH/TO/IMAGENET \\\n--stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 --resume \\\n--convert-from /PATH/TO/BEST/MODEL\n```\n\nFinally, to directly load a converted model (that is, a CondenseNet), use a **converted model file** in combination with the `evaluate-from` option:\n\n```\npython main.py --model condensenet_converted -b 64 -j 20 /PATH/TO/IMAGENET \\\n--stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 --resume \\\n--evaluate-from /PATH/TO/CONVERTED/MODEL\n```\n\n### Other Options\nWe also include a DenseNet implementation in this repository.  \nFor more examples of usage, please refer to [script.sh](script.sh).  \nFor detailed options, please run `python main.py --help`.\n\n## Results\n\n### Results on ImageNet\n\n| Model | FLOPs | Params | Top-1 Err. | Top-5 Err. 
| PyTorch Model |\n|---|---|---|---|---|---|\n| CondenseNet-74 (C=G=4) | 529M | 4.8M | 26.2 | 8.3 | [Download (18.69M)](https://www.dropbox.com/s/sj26rm4so3uhdmg/converted_condensenet_4.pth.tar?dl=0) |\n| CondenseNet-74 (C=G=8) | 274M | 2.9M | 29.0 | 10.0 | [Download (11.68M)](https://www.dropbox.com/s/aj1xpd6zcnclous/converted_condensenet_8.pth.tar?dl=0) |\n\n### Results on CIFAR\n\n| Model | FLOPs | Params | CIFAR-10 Err. | CIFAR-100 Err. |\n|---|---|---|---|---|\n| CondenseNet-50 | 28.6M | 0.22M | 6.22 | - |\n| CondenseNet-74 | 51.9M | 0.41M | 5.28 | - |\n| CondenseNet-86 | 65.8M | 0.52M | 5.06 | 23.64 |\n| CondenseNet-98 | 81.3M | 0.65M | 4.83 | - |\n| CondenseNet-110 | 98.2M | 0.79M | 4.63 | - |\n| CondenseNet-122 | 116.7M | 0.95M | 4.48 | - |\n| CondenseNet-182* | 513M | 4.2M | 3.76 | 18.47 |\n\n(* trained for 600 epochs)\n\n### Inference time on ARM platform\n\n| Model | FLOPs | Top-1 Err. | Time (s) |\n|---|---|---|---|\n| VGG-16 | 15,300M | 28.5 | 354 |\n| ResNet-18 | 1,818M | 30.2 | 8.14 |\n| 1.0 MobileNet-224 | 569M | 29.4 | 1.96 |\n| CondenseNet-74 (C=G=4) | 529M | 26.2 | 1.89 |\n| CondenseNet-74 (C=G=8) | 274M | 29.0 | 0.99 |\n\n## Contact\nliushichen95@gmail.com  \ngh349@cornell.edu\n\nWe are working on implementations for other frameworks.  \nAny discussions or concerns are welcome!\n","funding_links":[],"categories":["Papers\u0026Codes","DLA","Paper implementations｜论文实现","Paper implementations"],"sub_categories":["CondenseNet","Other libraries｜其他库:","Other libraries:"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FShichenLiu%2FCondenseNet","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FShichenLiu%2FCondenseNet","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FShichenLiu%2FCondenseNet/lists"}