https://github.com/vita-group/smc-bench
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, AJAY KUMAR JAISWAL, Zhangyang Wang
- Host: GitHub
- URL: https://github.com/vita-group/smc-bench
- Owner: VITA-Group
- Created: 2022-10-13T21:35:17.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-08-29T14:07:34.000Z (almost 2 years ago)
- Last Synced: 2025-04-18T07:36:50.139Z (about 2 months ago)
- Topics: benchmark, deep-learning, dynamic-sparse-training, pruning, sparse-neural-networks, sparsity
- Language: Python
- Homepage: https://openreview.net/forum?id=J6F3lLg4Kdp
- Size: 34.4 MB
- Stars: 28
- Watchers: 11
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
# [ICLR 2023] [Sparsity May Cry Benchmark (SMC-Bench)](https://openreview.net/forum?id=J6F3lLg4Kdp)
Official PyTorch implementation of **SMC-Bench** - Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!
[Shiwei Liu](https://shiweiliuiiiiiii.github.io/), [Tianlong Chen](https://tianlong-chen.github.io/about/), [Zhenyu Zhang](https://scholar.google.com/citations?user=ZLyJRxoAAAAJ&hl=zh-CN), [Xuxi Chen](http://xxchen.site/), [Tianjin Huang](https://research.tue.nl/en/persons/tianjin-huang), [Ajay Jaiswal](https://ajay1994.github.io/), [Zhangyang Wang](https://vita-group.github.io/)
University of Texas at Austin, Eindhoven University of Technology
The "Sparsity May Cry" Benchmark (SMC-Bench) is a collection of benchmarks in pursuit of a more general evaluation that unveils the true potential of sparse algorithms. SMC-Bench contains 4 carefully curated, diverse tasks with 10 datasets, capturing a wide range of domain-specific knowledge.
The benchmark organizers can be contacted at [email protected].
Table of contents
* [Installation](#installation-of-smc-bench)
* [Training](#training-of-smc-bench)
* [Evaluated Sparse Algorithms](#sparse-algorithms)
* [Tasks, Datasets, and Models](#tasks-models-and-datasets)
* [Results](#results)
---

## Installation of SMC-Bench
Please check [INSTALL.md](INSTALL.md) for installation instructions.

## Training of SMC-Bench
Please check [TRAINING.md](TRAINING.md) for training instructions.

## Tasks, Models, and Datasets
Specifically, we consider a broad set of tasks including *commonsense reasoning, arithmetic reasoning, multilingual translation, and protein prediction*, whose content spans multiple domains and requires a vast amount of commonsense knowledge and a solid mathematical and scientific background to solve. Note that none of the datasets in SMC-Bench was created from scratch for the benchmark; we rely on pre-existing datasets because researchers have implicitly agreed that they are challenging, interesting, and of high practical value. The models and datasets that we use for SMC-Bench are summarized below.

---
## Sparse Algorithms
*After Training*: [Lottery Ticket Hypothesis](https://arxiv.org/abs/1803.03635), [Magnitude After Training](https://proceedings.neurips.cc/paper/2015/file/ae0eb3eed39d2bcef4622b2499a05fe6-Paper.pdf), [Random After Training](https://arxiv.org/abs/1812.10240), [oBERT](https://arxiv.org/abs/2203.07259).

*During Training*: [Gradual Magnitude Pruning](https://arxiv.org/abs/1902.09574).
*Before Training*: [Magnitude Before Training](https://arxiv.org/abs/2009.08576), [SNIP](https://arxiv.org/abs/1810.02340), [Rigging the Lottery](https://arxiv.org/abs/1911.11134), [Random Before Training](https://arxiv.org/abs/2202.02643).
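These criteria differ mainly in *when* the pruning mask is computed and by what score. As a minimal, hypothetical sketch (not the benchmark's actual implementation), one-shot global magnitude pruning — the score underlying several of the methods above — keeps the largest-magnitude weights across all layers and zeros the rest:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """One-shot global magnitude pruning: zero out the smallest-magnitude
    parameters across all layers until `sparsity` fraction are zero."""
    flat = np.abs(np.concatenate([w.ravel() for w in weights]))
    k = int(sparsity * flat.size)  # number of parameters to remove
    if k == 0:
        return [w.copy() for w in weights]
    # Global threshold: the k-th smallest magnitude over all layers.
    threshold = np.partition(flat, k - 1)[k - 1]
    return [np.where(np.abs(w) <= threshold, 0.0, w) for w in weights]

# Toy example: prune 80% of a two-layer model's weights.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 4)), rng.standard_normal((4, 2))]
pruned = magnitude_prune(layers, sparsity=0.8)
total = sum(w.size for w in pruned)
zeros = sum(int((w == 0).sum()) for w in pruned)
print(zeros / total)
```

Applying the same score before training (on initial weights), during training (with a gradually increasing sparsity schedule, as in Gradual Magnitude Pruning), or after training (followed by fine-tuning or rewinding, as in the Lottery Ticket Hypothesis) yields the different method families listed above.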
## Results
Commonsense Reasoning
---
Arithmetic Reasoning
---
Protein Property Prediction
---
Multilingual Translation
---