Projects in Awesome Lists tagged with network-compression
A curated list of projects in awesome lists tagged with network-compression.
https://github.com/datawhalechina/leedl-tutorial
Hung-yi Lee Deep Learning Tutorial (《李宏毅深度学习教程》, recommended by Prof. Hung-yi Lee 👍, known as the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
bert chatgpt cnn deep-learning diffusion gan leedl-tutorial machine-learning network-compression pruning reinforcement-learning rnn self-attention transfer-learning transformer tutorial
Last synced: 14 May 2025
https://github.com/IntelLabs/distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
automl-for-compression deep-neural-networks distillation early-exit group-lasso jupyter-notebook network-compression onnx pruning pruning-structures pytorch quantization regularization truncated-svd
Last synced: 20 Mar 2025
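Distiller itself is configured through YAML pruning/quantization schedules (see its docs); purely as a concept sketch in plain PyTorch, not Distiller's API, the snippet below shows the core of magnitude pruning, one of the techniques the package implements: zero out the smallest-magnitude weights and keep a mask so the sparsity pattern can be re-applied after weight updates.

```python
import torch
import torch.nn as nn

def magnitude_prune_(layer: nn.Linear, sparsity: float) -> torch.Tensor:
    """Zero the `sparsity` fraction of weights with smallest |value|.
    Returns the binary mask so a training loop can re-apply it."""
    w = layer.weight.data
    k = int(sparsity * w.numel())
    threshold = w.abs().flatten().kthvalue(k).values  # k-th smallest |w|
    mask = (w.abs() > threshold).float()
    w.mul_(mask)  # prune in place
    return mask

layer = nn.Linear(256, 128)
mask = magnitude_prune_(layer, sparsity=0.8)
print(f"achieved sparsity: {(layer.weight == 0).float().mean():.2f}")
```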
https://github.com/quic/aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
auto-ml compression deep-learning deep-neural-networks machine-learning network-compression network-quantization open-source opensource pruning quantization
Last synced: 13 May 2025
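AIMET exposes its own quantization-simulation and compression entry points (documented in the repo); as a from-scratch sketch of the arithmetic such tools simulate, here is asymmetric 8-bit affine quantization of a tensor.

```python
import torch

def quantize_affine(x: torch.Tensor, num_bits: int = 8):
    """Asymmetric uniform quantization: map floats onto [0, 2^bits - 1]."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = qmin - torch.round(x.min() / scale)
    q = torch.clamp(torch.round(x / scale + zero_point), qmin, qmax)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return scale * (q - zero_point)

w = torch.randn(64, 64)
q, s, z = quantize_affine(w)
print("max abs error:", (w - dequantize(q, s, z)).abs().max().item())
```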
https://github.com/clovaai/overhaul-distillation
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
iccv2019 knowledge-distillation knowledge-transfer network-compression teacher-student
Last synced: 06 Apr 2025
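The paper distills intermediate features through a margin-ReLU transform; as a simpler, related baseline for orientation (classic logit distillation after Hinton et al., not the paper's feature loss), a distillation objective looks like this:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft-target KL distillation with ordinary cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients for the temperature (Hinton et al., 2015)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
print(kd_loss(s, t, y).item())
```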
https://github.com/sony/model_optimization
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for efficient, constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
deep-learning deep-neural-networks edge-ai machine-learning network-compression network-quantization neural-network optimizer ptq pytorch qat quantization tensorflow
Last synced: 14 May 2025
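MCT supports both post-training quantization (PTQ) and quantization-aware training (QAT). As a concept sketch of QAT's core trick, not MCT's API, the snippet below fake-quantizes a tensor in the forward pass while letting gradients flow through unchanged via a straight-through estimator (STE).

```python
import torch

class FakeQuant(torch.autograd.Function):
    """Round to a quantization grid in forward; identity gradient (STE)."""
    @staticmethod
    def forward(ctx, x, num_bits=8):
        qmax = 2 ** (num_bits - 1) - 1
        scale = x.abs().max() / qmax
        return torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output, None  # straight-through: pass the gradient as-is

x = torch.randn(4, 4, requires_grad=True)
y = FakeQuant.apply(x)
y.sum().backward()
print(x.grad)  # all ones: the rounding was invisible to autograd
```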
https://github.com/jshilong/fisherpruning
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Last synced: 22 Jan 2025
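The paper scores channels by the Fisher information of a channel mask fixed at 1, for which the mask gradient is the sum of activation times gradient. A rough plain-PyTorch sketch of that importance score (simplified, ignoring the paper's coupling of channel groups across layers):

```python
import torch
import torch.nn as nn

# Fisher-style channel importance: dL/dmask = sum over batch/space of
# activation * gradient, and the importance is its square.
conv = nn.Conv2d(3, 16, 3, padding=1)
acts = {}

def save_act(module, inp, out):
    out.retain_grad()          # keep the activation's gradient after backward
    acts["out"] = out

conv.register_forward_hook(save_act)

x = torch.randn(8, 3, 32, 32)
loss = conv(x).pow(2).mean()   # stand-in for a real task loss
loss.backward()

a, g = acts["out"], acts["out"].grad
fisher = (a * g).sum(dim=(0, 2, 3)).pow(2)   # one score per output channel
print("least important channel:", fisher.argmin().item())
```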
https://github.com/uber-research/permute-quantize-finetune
Using ideas from product quantization for state-of-the-art neural network compression.
deep-learning network-compression vector-quantization
Last synced: 11 Apr 2025
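The repo combines learned permutations, product quantization, and fine-tuning; as a bare-bones sketch of the product-quantization step alone (no permutation or fine-tuning), the snippet below splits a weight matrix into fixed-length sub-vectors and replaces each with its nearest k-means codeword.

```python
import torch

def product_quantize(W, sub_dim=4, n_codes=16, iters=10):
    """Compress W by running k-means over sub-vectors of length `sub_dim`."""
    rows, cols = W.shape
    assert cols % sub_dim == 0
    sub = W.reshape(-1, sub_dim)                 # all sub-vectors
    codebook = sub[torch.randperm(len(sub))[:n_codes]].clone()
    for _ in range(iters):                       # plain k-means
        assign = torch.cdist(sub, codebook).argmin(dim=1)
        for k in range(n_codes):
            pts = sub[assign == k]
            if len(pts):
                codebook[k] = pts.mean(dim=0)
    return codebook[assign].reshape(rows, cols)  # reconstructed weights

W = torch.randn(128, 64)
W_hat = product_quantize(W)
print("relative error:", ((W - W_hat).norm() / W.norm()).item())
```

Storage then only needs the small codebook plus one code index per sub-vector, which is where the compression comes from.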
https://github.com/bhheo/ab_distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
knowledge-distillation knowledge-transfer network-compression teacher-student-learning transfer-learning
Last synced: 30 Apr 2025
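The method transfers the teacher's activation boundaries, i.e. the sign pattern of pre-activations. A simplified sketch of such a hinged sign-matching loss (illustrative only, not the paper's exact formulation):

```python
import torch

def activation_boundary_loss(s_pre, t_pre, margin=1.0):
    """Push student pre-activations to the same side of zero as the
    teacher's, with a margin; neurons already past the margin on the
    correct side contribute nothing."""
    teacher_active = (t_pre > 0).float()
    # teacher neuron active: want s_pre > margin; inactive: s_pre < -margin
    loss = (teacher_active * torch.clamp(margin - s_pre, min=0).pow(2)
            + (1 - teacher_active) * torch.clamp(margin + s_pre, min=0).pow(2))
    return loss.mean()

s = torch.randn(8, 64, requires_grad=True)
t = torch.randn(8, 64)
activation_boundary_loss(s, t).backward()
```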
https://github.com/musco-ai/musco-pytorch
MUSCO: MUlti-Stage COmpression of neural networks
cp-decomposition deep-neural-networks low-rank model-acceleration model-compression network-acceleration network-compression pytorch tensor-decomposition truncated-svd tucker vbmf
Last synced: 15 Apr 2025
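MUSCO applies multi-stage low-rank decompositions (Tucker, CP, SVD) with automatic rank selection via VBMF; as a single-stage sketch of the simplest case from its tag list, here is truncated SVD splitting one Linear layer into two thinner ones.

```python
import torch
import torch.nn as nn

def svd_compress(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Replace one Linear with two thinner ones via truncated SVD:
    W (out x in) ~= (U * S) @ Vt, keeping only `rank` singular values."""
    U, S, Vt = torch.linalg.svd(layer.weight.data, full_matrices=False)
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    first.weight.data = Vt[:rank]                    # rank x in
    second.weight.data = U[:, :rank] * S[:rank]      # out x rank
    if layer.bias is not None:
        second.bias.data = layer.bias.data.clone()
    return nn.Sequential(first, second)

fc = nn.Linear(512, 512)
fc2 = svd_compress(fc, rank=64)
x = torch.randn(1, 512)
print("max output diff:", (fc(x) - fc2(x)).abs().max().item())
```

The factorized pair stores 2 * 512 * 64 weights instead of 512 * 512, a 4x reduction at this rank.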
https://github.com/bhheo/bss_distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
adversarial-attacks adversarial-samples image-classification knowledge-distillation network-compression teacher-student-learning
Last synced: 30 Apr 2025
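The paper crafts "boundary supporting samples" by perturbing inputs toward an adversarial class so the student learns the teacher's decision boundary. As a loose sketch of that idea using a single FGSM-style step toward the runner-up class (not the paper's iterative scheme):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def toward_runner_up(model, x, eps=0.03):
    """One gradient step pushing x toward the model's second-best class,
    i.e. toward the decision boundary."""
    x = x.clone().requires_grad_(True)
    logits = model(x)
    target = logits.topk(2, dim=1).indices[:, 1]   # runner-up class
    loss = F.cross_entropy(logits, target)         # lower loss = closer
    loss.backward()
    return (x - eps * x.grad.sign()).detach()      # descend toward target

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
x = torch.randn(4, 1, 28, 28)
x_adv = toward_runner_up(model, x)  # sample near the decision boundary
```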
https://github.com/musco-ai/musco-tf
MUSCO: MUlti-Stage COmpression of neural networks
cnn-acceleration cnn-compresion cp-decomposition deep-neural-networks low-rank-approximation musco network-compression tensor-decomposition tensorflow truncated-svd tucker vbmf
Last synced: 13 Apr 2025