Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Projects in Awesome Lists tagged with distillation
A curated list of projects in awesome lists tagged with distillation.
https://github.com/intellabs/distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
automl-for-compression deep-neural-networks distillation early-exit group-lasso jupyter-notebook network-compression onnx pruning pruning-structures pytorch quantization regularization truncated-svd
Last synced: 25 Sep 2024
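Distiller covers pruning, quantization, and distillation for compression research. As a rough, library-agnostic sketch of one of those techniques (this uses PyTorch's built-in pruning utilities, not Distiller's own API), magnitude pruning zeros out the smallest weights of each layer:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model; in practice you would target the layers of your own network.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 30% smallest-magnitude weights in every Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the sparsity into the weight tensor

sparsity = (model[0].weight == 0).float().mean().item()
print(f"First layer sparsity: {sparsity:.2%}")
```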
https://github.com/gmvandeven/continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
artificial-neural-networks class-incremental-learning continual-learning deep-learning distillation domain-incremental-learning elastic-weight-consolidation generative-models gradient-episodic-memory icarl incremental-learning lifelong-learning replay replay-through-feedback task-incremental-learning variational-autoencoder
Last synced: 21 Dec 2024
https://github.com/airaria/textbrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
bert distillation knowledge nlp pytorch
Last synced: 21 Dec 2024
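TextBrewer packages knowledge distillation for NLP models; the core objective it (and most KD toolkits) builds on is Hinton-style soft-target distillation. Below is a minimal, library-agnostic PyTorch sketch of that loss; the temperature T and mixing weight alpha are illustrative defaults, not values taken from the project:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style KD: KL on temperature-softened logits plus CE on hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```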
https://github.com/paddlepaddle/paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
bert compression detection distillation ernie nas pruning quantization segmentation sparsity tensorrt transformer yolov5 yolov6 yolov7
Last synced: 19 Dec 2024
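PaddleSlim provides quantization, pruning, distillation, and NAS on top of PaddlePaddle. For comparison only (this is PyTorch's post-training dynamic quantization, not PaddleSlim's API), int8-quantizing the linear layers of a small model looks like this:

```python
import torch
import torch.nn as nn

# A small float32 model standing in for a real classifier head.
model = nn.Sequential(nn.Linear(768, 256), nn.ReLU(), nn.Linear(256, 2))

# Post-training dynamic quantization: weights stored as int8, activations
# quantized on the fly for the listed module types.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 768)
print(quantized(x).shape)  # torch.Size([1, 2])
```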
https://github.com/vitae-transformer/vitpose
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
deep-learning distillation mae pose-estimation pytorch self-supervised-learning vision-transformer
Last synced: 19 Dec 2024
https://github.com/Syencil/mobile-yolov5-pruning-distillation
Pruning and distillation for mobilev2-yolov5s, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
distillation mobile-yolov5s ncnn pruning yolov5
Last synced: 09 Nov 2024
https://github.com/cluebenchmark/cluepretrainedmodels
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
albert bert chinese corpus dataset distillation pretrained-models roberta semantic-similarity sentence-analysis sentence-classification sentence-pairs text-classification
Last synced: 21 Dec 2024
https://github.com/segmind/distill-sd
Segmind Distilled diffusion
distillation inference knowledge-distillation stable-diffusion
Last synced: 31 Oct 2024
https://github.com/thu-ml/ares
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
adversarial-attacks adversarial-machine-learning adversarial-robustness benchmark-framework bim boundary deepfool distillation evolutionary fgsm hgd mi-fgsm mmlda nes pca spsa
Last synced: 21 Dec 2024
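ares benchmarks attacks such as FGSM, BIM, and SPSA against defended models. As a self-contained illustration of the simplest of these (a generic sketch, not ares's API), a single FGSM step perturbs the input along the sign of the loss gradient:

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=8 / 255):
    """One-step FGSM: move each pixel by +/- epsilon along the loss gradient sign."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    with torch.no_grad():
        x_adv = x_adv + epsilon * x_adv.grad.sign()
        x_adv = x_adv.clamp(0.0, 1.0)  # keep the image in valid range
    return x_adv.detach()
```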
https://github.com/gojasper/flash-diffusion
Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation
diffusion-models distillation dit inpainting sdxl super-resolution text-to-image
Last synced: 28 Oct 2024
https://github.com/huggingface/optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
diffusers distillation inference intel onnx openvino optimization pruning quantization transformers
Last synced: 30 Oct 2024
https://github.com/dotchen/LAV
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
autonomous-driving carla-simulator cvpr2022 distillation imitation-learning perception planning prediction
Last synced: 28 Oct 2024
https://github.com/qiangsiwei/bert_distill
BERT distillation (distillation experiments based on BERT)
bert classification distillation nlp
Last synced: 02 Nov 2024
https://github.com/gyunggyung/AGI-Papers
Papers and books to look at when getting started with AGI 📚
all-to-all dialogue distillation efficient llm multimodal multiple-tasks nlg nlp sentence-embeddings sentence-similarity stable-diffusion text-to-video tts
Last synced: 20 Oct 2024
https://github.com/gmvandeven/brain-inspired-replay
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
artificial-neural-networks brain-inspired continual-learning deep-learning distillation elastic-weight-consolidation generative-replay incremental-learning internal-replay lifelong-learning permuted-mnist replay replay-through-feedback split-cifar100 split-mnist synaptic-intelligence variational-autoencoder
Last synced: 18 Dec 2024
https://github.com/Sharpiless/Yolov5-distillation-train-inference
YOLOv5 distillation training | YOLOv5 knowledge distillation training, with support for training on your own data
distillation konwledge-distillation model-compression object-detection yolov5
Last synced: 09 Nov 2024
https://github.com/snap-research/r2l
[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
deep-learning distillation mlp nerf neural-light-field novel-view-synthesis rendering
Last synced: 19 Dec 2024
https://github.com/BioSTEAMDevelopmentGroup/biosteam
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
biochemical-process bioprocess biorefinery centrifuge chemical-engineering distillation fermentation flash heat-exchanger life-cycle-assessment monte-carlo process-simulation pump reactor sensitivity-analysis techno-economic-analysis thermodynamics unit-operation
Last synced: 14 Nov 2024
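Here "distillation" is the chemical unit operation rather than knowledge distillation: BioSTEAM simulates columns with rigorous thermodynamics. As a toy illustration only, the constant-relative-volatility relation y = αx / (1 + (α − 1)x) gives the vapor composition in equilibrium with a binary liquid (the α value below is illustrative, not fitted data):

```python
def equilibrium_vapor_fraction(x_light, alpha=2.5):
    """Vapor-phase mole fraction of the light component for a binary mixture
    with constant relative volatility alpha."""
    return alpha * x_light / (1.0 + (alpha - 1.0) * x_light)

for x in (0.1, 0.3, 0.5):
    print(f"x = {x:.1f} -> y = {equilibrium_vapor_fraction(x):.3f}")
```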
https://github.com/khurramjaved96/incremental-learning
PyTorch implementation of the ACCV 2018 paper "Revisiting Distillation and Incremental Classifier Learning."
convolutional-neural-networks distillation incremental-learning machine-learning paper-implementations pytorch
Last synced: 15 Nov 2024
https://github.com/alldbi/SuperMix
PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation"
augmentation cvpr2021 deep-learning distillation pytorch saliency-detection supervised
Last synced: 13 Nov 2024
https://github.com/snap-research/graphless-neural-networks
[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
deep-learning distillation efficient-inference gnn graph-algorithm graph-neural-networks knowledge-distillation pytorch scalability
Last synced: 17 Dec 2024
https://github.com/adamdad/knowledgefactor
[ECCV2022] Factorizing Knowledge in Neural Networks
deep-learning distillation eccv2022 kd knowldge-distillation knowledge-transfer multitask-learning pytorch
Last synced: 16 Nov 2024
https://github.com/julesbelveze/bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
bert deebert distillation fastbert lstm nlp pruning pytorch-lightning quantization theseus transformers
Last synced: 17 Dec 2024
https://github.com/qcraftai/distill-bev
DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)
3d-object-detection autonomous-driving bev cross-modal distillation knowledge-distillation lidar multi-camera multi-modal nuscenes point-cloud self-driving
Last synced: 28 Oct 2024
https://github.com/xiongma/roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, obtained by distilling RoBERTa-wwm with RoBERTa-wwm-large as the teacher
bert distillation natural-language-processing pretrained-models roberta tensorflow
Last synced: 07 Nov 2024
https://github.com/bloomberg/minilmv2.bb
Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)
distillation language-model model-compression model-distillation python pytorch transformers
Last synced: 09 Nov 2024
https://github.com/sayakpaul/deit-tf
Includes PyTorch -> Keras model porting code for DeiT models with fine-tuning and inference notebooks.
computer-vision distillation image-recognition imagenet-1k inductive-biases keras tensorflow vision-transformers
Last synced: 22 Oct 2024
https://github.com/joisino/speedbook
Support site for the book 『深層ニューラルネットワークの高速化』 (Accelerating Deep Neural Networks).
deep-learning deep-neural-networks distillation efficiency neural-networks pruning pytorch quantization
Last synced: 10 Oct 2024
https://github.com/microsoft/augmented-interpretable-models
Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.
ai artificial-intelligence deep-learning distillation embedding explainability huggingface interpretability language-model large-language-models linear linear-models logistic-regression machine-learning ml neural-network scikit-learn sentiment-classification transformer transparent
Last synced: 07 Oct 2024
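The idea here is to turn a pre-trained language model into features for a transparent, scikit-learn-style predictor. The following is a generic sketch under assumed toy data (the model name and sentences are placeholders, and this is not the repository's own API): mean-pooled transformer embeddings feed a logistic regression:

```python
import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")  # placeholder model
encoder = AutoModel.from_pretrained("distilbert-base-uncased").eval()

texts = ["great movie", "terrible plot"]  # hypothetical toy data
labels = np.array([1, 0])

with torch.no_grad():
    batch = tokenizer(texts, padding=True, return_tensors="pt")
    features = encoder(**batch).last_hidden_state.mean(dim=1).numpy()  # mean pooling

clf = LogisticRegression(max_iter=1000).fit(features, labels)  # transparent "student"
print(clf.predict(features))
```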
https://github.com/esceptico/squeezer
Lightweight knowledge distillation pipeline
distillation knowledge-distillation model-compression pytorch
Last synced: 19 Oct 2024
https://github.com/vitae-transformer/simdistill
The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"
3d-object-detection bird-view-image deep-learning distillation simulation
Last synced: 14 Nov 2024
https://github.com/z7zuqer/model-compression-and-acceleration-4-dnn
model-compression-and-acceleration-4-DNN
decomposition distillation model-compression pruning quantization
Last synced: 28 Nov 2024
https://github.com/akimach/tensorflow-distillation-examples
Knowledge distillation implemented in TensorFlow
dark-knowledge deep-learning distillation jupyter-notebook neural-network python tensorflow
Last synced: 09 Nov 2024
https://github.com/Atenrev/diffusion_continual_learning
PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.
continual-learning diffusion-models distillation generative-replay
Last synced: 30 Oct 2024
https://github.com/larry-athey/rpi-smart-still
Raspberry Pi and Arduino/ESP32-powered smart still controller system. Designed around the Still Spirits T-500 column and boiler, but can easily be added to any other gas or electric still with a dephlegmator.
arduino automation distillation esp32 esp32-arduino fermentation genio homebrew hydrometer istill moonshine pot-still raspberry-pi reflux smart-still still-controller still-spirits t500
Last synced: 13 Dec 2024
https://github.com/snap-research/linkless-link-prediction
[ICML 2023] Linkless Link Prediction via Relational Distillation
deep-learning distillation efficient-inference gnn graph-neural-networks knowledge-distillation link-prediction scalability
Last synced: 15 Nov 2024
https://github.com/zjcv/knowledgereview
[CVPR 2021] Distilling Knowledge via Knowledge Review
distillation feature-distillation knowledge-review pytorch zcls
Last synced: 14 Dec 2024
https://github.com/zjcv/overhaul
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation
distillation feature-distillation overhaul pytorch zcls
Last synced: 14 Dec 2024
https://github.com/fightnyy/distillbart_eck
Repository for distillation of English, Chinese, Korean Multilingual BART
chinese distillation english korean mbart multilingual
Last synced: 13 Oct 2024
https://github.com/autodistill/autodistill-base-model-template
A template for use in creating Autodistill Base Model packages.
autodistill computer-vision distillation
Last synced: 08 Nov 2024
https://github.com/jakegrigsby/algorithm_distillation
Minimalist PyTorch replication of Algorithm Distillation (Laskin et al., 2022)
Last synced: 12 Nov 2024
https://github.com/tonywu71/distilling-and-forgetting-in-large-pre-trained-models
Code for my dissertation on "Distilling and Forgetting in Large Pre-Trained Models" for the MPhil in Machine Learning and Machine Intelligence (MLMI) at the University of Cambridge.
continual-learning distillation speech-recognition whisper
Last synced: 04 Dec 2024
https://github.com/daspartho/distillclassifier
Easily generate synthetic data for classification tasks using LLMs
classification classification-models dataset-generation distillation distillation-model distilling-the-knowledge large-language-models nlp synthetic-data synthetic-dataset-generation text-classification
Last synced: 15 Dec 2024
https://github.com/z7zuqer/compression-pytorch
Model compression experiments in PyTorch
distillation pruning quantization
Last synced: 28 Nov 2024
https://github.com/stanleylsx/text_classifier_torch
Text classification repository built with PyTorch, featuring training tricks, acceleration methods, and model optimization techniques such as distillation, compression, and pruning. Supports single-label and multi-label training with customizable configurations.
distillation pretrained-models pytorch text-classification
Last synced: 09 Dec 2024
https://github.com/autodistill/autodistill-target-model-template
A template for use in creating Autodistill Target Model packages.
autodistill computer-vision distillation
Last synced: 08 Nov 2024
https://github.com/ksasi/fair-distill
Distillation of GANs with fairness constraints
Last synced: 05 Dec 2024
https://github.com/larry-athey/airhead
Air Still (or clone) upgrade that uses an SCR controller for the heating element and an ESP32 to make the whole thing smarter. Eliminates the constant full-power on/off switching of the heating element and the risk of scorching, and adds further capabilities.
air-still boiler-controller distillation esp32 fermentation gin moonshine rum still-spirits temperature-control vodka whiskey
Last synced: 29 Nov 2024
https://github.com/pialghosh2233/finetune-light-weight-stablediffusion-imagetoimage
Grayscale image colorization using a lightweight Stable Diffusion model
artificial-intelligence compression distillation generative-ai huggingface huggingface-diffusers image-processing lightweight lightweight-stable-diffusion pytorch stable-diffusion
Last synced: 09 Dec 2024