{"id":13643080,"url":"https://github.com/neuralmagic/sparsezoo","last_synced_at":"2025-05-16T12:08:54.909Z","repository":{"id":37487935,"uuid":"320590749","full_name":"neuralmagic/sparsezoo","owner":"neuralmagic","description":"Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes","archived":false,"fork":false,"pushed_at":"2024-05-10T18:57:00.000Z","size":1876,"stargazers_count":359,"open_issues_count":7,"forks_count":23,"subscribers_count":26,"default_branch":"main","last_synced_at":"2024-05-19T14:34:51.178Z","etag":null,"topics":["computer-vision","deep-learning-algorithms","deep-learning-models","mobilenet","models-optimized","nlp","object-detection-model","pretrained-models","pruning","quantization","resnet","smaller-models","sparse-quantized-models","sparsification-recipe","transfer-learning","yolo"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/neuralmagic.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-12-11T14:05:49.000Z","updated_at":"2024-06-17T21:30:30.979Z","dependencies_parsed_at":"2022-07-09T13:47:53.325Z","dependency_job_id":"c8a0ff1c-ef97-4afc-95c1-45c44753ba7c","html_url":"https://github.com/neuralmagic/sparsezoo","commit_stats":{"total_commits":170,"total_committers":16,"mean_commits":10.625,"dds":0.8470588235294118,"last_synced_commit":"b92dbeaa5d4ab15f4c8511f607278a6fa36eab71"},"previous_names":[],"tags_count":28,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neuralmagic%2Fsparsezoo","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neuralmagic%2Fsparsezoo/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neuralmagic%2Fsparsezoo/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neuralmagic%2Fsparsezoo/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/neuralmagic","download_url":"https://codeload.github.com/neuralmagic/sparsezoo/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247994119,"owners_count":21030050,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["computer-vision","deep-learning-algorithms","deep-learning-models","mobilenet","models-optimized","nlp","object-detection-model","pretrained-models","pruning","quantization","resnet","smaller-models","sparse-quantized-models","sparsification-recipe","transfer-learning","yolo"],"created_at":"2024-08-02T01:01:41.141Z","updated_at":"2025-05-16T12:08:54.883Z","avatar_url":"https://github.c
om/neuralmagic.png","language":"Python","readme":"\u003c!--\nCopyright (c) 2021 - present / Neuralmagic, Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing,\nsoftware distributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n--\u003e\n\n\u003ch1 style=\"display: flex; align-items: center;\" \u003e\n     \u003cimg width=\"100\" height=\"100\" alt=\"tool icon\" src=\"https://neuralmagic.com/wp-content/uploads/2024/03/icon_SparseZoo-003.svg\" /\u003e\n      \u003cspan\u003e\u0026nbsp;\u0026nbsp;SparseZoo\u003c/span\u003e\n  \u003c/h1\u003e\n\n\u003ch3\u003eNeural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes\u003c/h3\u003e\n\n## 🚨 2025 End of Life Announcement: DeepSparse, SparseML, SparseZoo, and Sparsify\n\nDear Community,\n\nWe’re reaching out with heartfelt thanks and important news. Following [Neural Magic’s acquisition by Red Hat in January 2025](https://www.redhat.com/en/about/press-releases/red-hat-completes-acquisition-neural-magic-fuel-optimized-generative-ai-innovation-across-hybrid-cloud), we’re shifting our focus to commercial and open-source offerings built around [vLLM (virtual large language models)](https://www.redhat.com/en/topics/ai/what-is-vllm).\n\nAs part of this transition, we have ceased development and will deprecate the community versions of **DeepSparse (including DeepSparse Enterprise), SparseML, SparseZoo, and Sparsify on June 2, 2025**. After that, these tools will no longer receive updates or support.\n\nFrom day one, our mission was to democratize AI through efficient, accessible tools. We’ve learned so much from your feedback, creativity, and collaboration—watching these tools become vital parts of your ML journeys has meant the world to us.\n\nThough we’re winding down the community editions, we remain committed to our original values. Now as part of Red Hat, we’re excited to evolve our work around vLLM and deliver even more powerful solutions to the ML community.\n\nTo learn more about our next chapter, visit [ai.redhat.com](ai.redhat.com). Thank you for being part of this incredible journey.\n\n_With gratitude, The Neural Magic Team (now part of Red Hat)_\n\n## Overview\n\n[SparseZoo is a constantly-growing repository](https://sparsezoo.neuralmagic.com) of sparsified (pruned and pruned-quantized) models with matching sparsification recipes for neural networks. \nIt simplifies and accelerates your time-to-value in building performant deep learning models with a collection of inference-optimized models and recipes to prototype from. \nRead [more about sparsification](https://docs.neuralmagic.com/user-guides/sparsification).\n\nAvailable via API and hosted in the cloud, the SparseZoo contains both baseline and models sparsified to different degrees of inference performance vs. baseline loss recovery. 
\nRecipe-driven approaches built around sparsification algorithms allow you to use the models as given, transfer-learn from the models onto private datasets, or transfer the recipes to your architectures.\n\nThe [GitHub repository](https://github.com/neuralmagic/sparsezoo) contains the Python API code to handle the connection and authentication to the cloud.\n\n\u003cimg alt=\"SparseZoo Flow\" src=\"https://docs.neuralmagic.com/docs/source/infographics/sparsezoo.png\" width=\"960px\" /\u003e\n\n## Highlights\n\n- [Model Stub Architecture Overview](https://docs.neuralmagic.com/sparsezoo/source/models.html)\n- [Available Model Recipes](https://docs.neuralmagic.com/sparsezoo/source/recipes.html)\n- [sparsezoo.neuralmagic.com](https://sparsezoo.neuralmagic.com)\n\n## Installation\n\nThis repository is tested on Python 3.8-3.11, and Linux/Debian systems.\nIt is recommended to install in a [virtual environment](https://docs.python.org/3/library/venv.html) to keep your system in order.\n\nInstall with pip using:\n\n```bash\npip install sparsezoo\n```\n\n## Quick Tour\n\nThe SparseZoo Python API enables you to search and download sparsified models. Code examples are given below.\nWe encourage users to load SparseZoo models by copying a stub directly from a [model page]((https://sparsezoo.neuralmagic.com/)).\n\n### Introduction to Model Class Object\n\nThe `Model` is a fundamental object that serves as a main interface with the SparseZoo library. \nIt represents a SparseZoo model, together with all its directories and files.\n\n#### Creating a Model Class Object From SparseZoo Stub\n```python\nfrom sparsezoo import Model\n\nstub = \"zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none\"\n\nmodel = Model(stub)\nprint(str(model))\n\n\u003e\u003e Model(stub=zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none)\n```\n\n#### Creating a Model Class Object From Local Model Directory\n```python\nfrom sparsezoo import Model\n\ndirectory = \".../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0\"\n\nmodel = Model(directory)\nprint(str(model))\n\n\u003e\u003e Model(directory=.../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0)\n```\n\n#### Manually Specifying the Model Download Path\n\nUnless specified otherwise, the model created from the SparseZoo stub is saved to the local sparsezoo cache directory. \nThis can be overridden by passing the optional `download_path` argument to the constructor:\n\n```python\nfrom sparsezoo import Model\n\nstub = \"zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none\"\ndownload_directory = \"./model_download_directory\"\n\nmodel = Model(stub, download_path = download_directory)\n```\n#### Downloading the Model Files\nOnce the model is initialized from a stub, it may be downloaded either by calling the `download()` method or by invoking a `path` property. Both pathways are universal for all the files in SparseZoo. Invoking the `path` property will always trigger file download unless the file has already been downloaded.\n\n```python\n# method 1\nmodel.download() \n\n# method 2 \nmodel_path = model.path\n```\n\n#### Inspecting the Contents of the SparseZoo Model\n\nWe call the `available_files` method to inspect which files are present in the SparseZoo model. 
Then, we select a file by calling the appropriate attribute:\n\n```python\nmodel.available_files\n\n\u003e\u003e {'training': Directory(name=training), \n\u003e\u003e 'deployment': Directory(name=deployment), \n\u003e\u003e 'sample_inputs': Directory(name=sample_inputs.tar.gz), \n\u003e\u003e 'sample_outputs': {'framework': Directory(name=sample_outputs.tar.gz)}, \n\u003e\u003e 'sample_labels': Directory(name=sample_labels.tar.gz), \n\u003e\u003e 'model_card': File(name=model.md), \n\u003e\u003e 'recipes': Directory(name=recipe), \n\u003e\u003e 'onnx_model': File(name=model.onnx)}\n```\nThen, we might take a closer look at the contents of the SparseZoo model:\n```python\nmodel_card = model.model_card\nprint(model_card)\n\n\u003e\u003e File(name=model.md)\n```\n```python\nmodel_card_path = model.model_card.path\nprint(model_card_path)\n\n\u003e\u003e .../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0/model.md\n```\n\n\n### Model, Directory, and File\n\nIn general, every file in the SparseZoo model shares a set of attributes: `name`, `path`, `URL`, and `parent`:\n- `name` serves as an identifier of the file/directory\n- `path` points to the location of the file/directory \n- `URL` specifies the server address of the file/directory in question\n- `parent` points to the location of the parent directory of the file/directory in question\n\nA directory is a unique type of file that contains other files. For that reason, it has an additional `files` attribute.\n\n```python\nprint(model.onnx_model)\n\n\u003e\u003e File(name=model.onnx)\n\nprint(f\"File name: {model.onnx_model.name}\\n\"\n      f\"File path: {model.onnx_model.path}\\n\"\n      f\"File URL: {model.onnx_model.url}\\n\"\n      f\"Parent directory: {model.onnx_model.parent_directory}\")\n      \n\u003e\u003e File name: model.onnx\n\u003e\u003e File path: .../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0/model.onnx\n\u003e\u003e File URL: https://models.neuralmagic.com/cv-classification/...\n\u003e\u003e Parent directory: .../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0\n```\n\n```python\nprint(model.recipes)\n\n\u003e\u003e Directory(name=recipe)\n\nprint(f\"File name: {model.recipes.name}\\n\"\n      f\"Contains: {[file.name for file in model.recipes.files]}\\n\"\n      f\"File path: {model.recipes.path}\\n\"\n      f\"File URL: {model.recipes.url}\\n\"\n      f\"Parent directory: {model.recipes.parent_directory}\")\n      \n\u003e\u003e File name: recipe\n\u003e\u003e Contains: ['recipe_original.md', 'recipe_transfer-classification.md']\n\u003e\u003e File path: /home/user/.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0/recipe\n\u003e\u003e File URL: None\n\u003e\u003e Parent directory: /home/user/.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0\n```\n\n### Selecting Checkpoint-Specific Data\n\nA SparseZoo model may contain several checkpoints. The model may contain a checkpoint that had been saved before the model was quantized - that checkpoint would be used for transfer learning. Another checkpoint might have been saved after the quantization step - that one is usually directly used for inference.\n\nThe recipes may also vary depending on the use case. We may want to access a recipe that was used to sparsify the dense model (`recipe_original`) or the one that enables us to sparse transfer learn from the already sparsified model (`recipe_transfer`). 
\n\nThere are two ways to access those specific files.\n\n#### Accessing Recipes (Through Python API)\n```python\navailable_recipes = model.recipes.available\nprint(available_recipes)\n\n\u003e\u003e ['original', 'transfer-classification']\n\ntransfer_recipe = model.recipes[\"transfer-classification\"]\nprint(transfer_recipe)\n\n\u003e\u003e File(name=recipe_transfer-classification.md)\n\noriginal_recipe = model.recipes.default # recipe defaults to `original`\noriginal_recipe_path = original_recipe.path # downloads the recipe and returns its path\nprint(original_recipe_path)\n\n\u003e\u003e .../.cache/sparsezoo/eb977dae-2454-471b-9870-4cf38074acf0/recipe/recipe_original.md\n```\n\n#### Accessing Checkpoints (Through Python API)\nIn general, we are expecting the following checkpoints to be included in the model: \n\n- `checkpoint_prepruning`\n- `checkpoint_postpruning`\n- `checkpoint_preqat`\n- `checkpoint_postqat` \n\nThe checkpoint that the model defaults to is the `preqat` state (just before the quantization step).\n\n```python\nfrom sparsezoo import Model\n\nstub = \"zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/pruned_quant_3layers-aggressive_84\"\n\nmodel = Model(stub)\navailable_checkpoints = model.training.available\nprint(available_checkpoints)\n\n\u003e\u003e ['preqat']\n\npreqat_checkpoint = model.training.default # recipe defaults to `preqat`\npreqat_checkpoint_path = preqat_checkpoint.path # downloads the checkpoint and returns its path\nprint(preqat_checkpoint_path)\n\n\u003e\u003e .../.cache/sparsezoo/0857c6f2-13c1-43c9-8db8-8f89a548dccd/training\n\n[print(file.name) for file in preqat_checkpoint.files]\n\n\u003e\u003e vocab.txt\n\u003e\u003e special_tokens_map.json\n\u003e\u003e pytorch_model.bin\n\u003e\u003e config.json\n\u003e\u003e training_args.bin\n\u003e\u003e tokenizer_config.json\n\u003e\u003e trainer_state.json\n\u003e\u003e tokenizer.json\n```\n\n\n#### Accessing Recipes (Through Stub String Arguments)\n\nYou can also directly request a specific recipe/checkpoint type by appending the appropriate URL query arguments to the stub:\n```python\nfrom sparsezoo import Model\n\nstub = \"zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none?recipe=transfer\"\n\nmodel = Model(stub)\n\n# Inspect which files are present.\n# Note that the available recipes are restricted\n# according to the specified URL query arguments\nprint(model.recipes.available)\n\n\u003e\u003e ['transfer-classification']\n\ntransfer_recipe = model.recipes.default # Now the recipes default to the one selected by the stub string arguments\nprint(transfer_recipe)\n\n\u003e\u003e File(name=recipe_transfer-classification.md)\n```\n\n### Accessing Sample Data\n\nThe user may easily request a sample batch of data that represents the inputs and outputs of the model.\n\n```python\nsample_data = model.sample_batch(batch_size = 10)\n\nprint(sample_data['sample_inputs'][0].shape)\n\u003e\u003e (10, 3, 224, 224) # (batch_size, num_channels, image_dim, image_dim)\n\nprint(sample_data['sample_outputs'][0].shape)\n\u003e\u003e (10, 1000) # (batch_size, num_classes)\n```\n\n### Model Search\nThe function `search_models` enables the user to quickly filter the contents of SparseZoo repository to find the stubs of interest:\n\n```python\nfrom sparsezoo import search_models\n\nargs = {\n    \"domain\": \"cv\",\n    \"sub_domain\": \"segmentation\",\n    \"architecture\": \"yolact\",\n}\n\nmodels = search_models(**args)\n[print(model) for model in models]\n\n\u003e\u003e 
### Environment Variables

You can control where downloaded models (held temporarily during download) and the required credentials are saved on your machine:

- `SPARSEZOO_MODELS_PATH` is the path where downloaded models are saved temporarily. Defaults to `~/.cache/sparsezoo/`.
- `SPARSEZOO_CREDENTIALS_PATH` is the path where `credentials.yaml` is saved. Defaults to `~/.cache/sparsezoo/`.

### Console Scripts

In addition to the Python APIs, a console script entry point is installed with the package `sparsezoo`.
This enables easy interaction straight from your console/terminal.

#### Downloading

Download command help

```shell script
sparsezoo.download -h
```

<br>Download ResNet-50 Model

```shell script
sparsezoo.download zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/base-none
```

<br>Download pruned and quantized ResNet-50 Model

```shell script
sparsezoo.download zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned_quant-moderate
```

#### Searching

Search command help

```shell script
sparsezoo search -h
```

<br>Searching for all classification MobileNetV1 models in the computer vision domain

```shell script
sparsezoo search --domain cv --sub-domain classification --architecture mobilenet_v1
```

<br>Searching for all ResNet-50 models

```shell script
sparsezoo search --domain cv --sub-domain classification \
    --architecture resnet_v1 --sub-architecture 50
```

For a more in-depth read, check out the [SparseZoo documentation.](https://docs.neuralmagic.com/sparsezoo/)

## Resources

### Learning More

- Documentation: [SparseML,](https://docs.neuralmagic.com/sparseml/) [SparseZoo,](https://docs.neuralmagic.com/sparsezoo/) [Sparsify,](https://docs.neuralmagic.com/sparsify/) [DeepSparse](https://docs.neuralmagic.com/deepsparse/)
- Neural Magic: [Blog,](https://www.neuralmagic.com/blog/) [Resources](https://www.neuralmagic.com/resources/)

### Release History

Official builds are hosted on PyPI:

- stable: [sparsezoo](https://pypi.org/project/sparsezoo/)
- nightly (dev): [sparsezoo-nightly](https://pypi.org/project/sparsezoo-nightly/)

Additionally, more information can be found via [GitHub Releases.](https://github.com/neuralmagic/sparsezoo/releases)

### License

The project is licensed under the [Apache License Version 2.0.](https://github.com/neuralmagic/sparsezoo/blob/main/LICENSE)