Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

Projects in Awesome Lists tagged with distillation

A curated list of projects in awesome lists tagged with distillation.
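Most of the projects below implement knowledge distillation: training a small "student" model to match the temperature-softened output distribution of a larger "teacher". As a minimal, dependency-free sketch (plain Python, not taken from any project listed here), the classic soft-label loss from Hinton et al. (2015) looks like this:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of logits.
    # Higher T produces a softer (more uniform) distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl
```

In practice this term is usually combined with a standard cross-entropy loss on the ground-truth labels; the toolkits below (e.g. TextBrewer, Distiller) wrap this pattern with schedulers, intermediate-layer losses, and pruning/quantization extras.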

https://github.com/intellabs/distiller

Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller

automl-for-compression deep-neural-networks distillation early-exit group-lasso jupyter-notebook network-compression onnx pruning pruning-structures pytorch quantization regularization truncated-svd

Last synced: 25 Sep 2024

https://github.com/airaria/textbrewer

A PyTorch-based knowledge distillation toolkit for natural language processing

bert distillation knowledge nlp pytorch

Last synced: 21 Dec 2024

https://github.com/paddlepaddle/paddleslim

PaddleSlim is an open-source library for deep model compression and architecture search.

bert compression detection distillation ernie nas pruning quantization segmentation sparsity tensorrt transformer yolov5 yolov6 yolov7

Last synced: 19 Dec 2024

https://github.com/vitae-transformer/vitpose

The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"

deep-learning distillation mae pose-estimation pytorch self-supervised-learning vision-transformer

Last synced: 19 Dec 2024

https://github.com/Syencil/mobile-yolov5-pruning-distillation

Pruning and distillation for MobileNetV2-YOLOv5s, with ncnn and TensorRT deployment support. Ultra-light but with better performance!

distillation mobile-yolov5s ncnn pruning yolov5

Last synced: 09 Nov 2024

https://github.com/thu-ml/ares

A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.

adversarial-attacks adversarial-machine-learning adversarial-robustness benchmark-framework bim boundary deepfool distillation evolutionary fgsm hgd mi-fgsm mmlda nes pca spsa

Last synced: 21 Dec 2024

https://github.com/gojasper/flash-diffusion

Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation

diffusion-models distillation dit inpainting sdxl super-resolution text-to-image

Last synced: 28 Oct 2024

https://github.com/huggingface/optimum-intel

🤗 Optimum Intel: Accelerate inference with Intel optimization tools

diffusers distillation inference intel onnx openvino optimization pruning quantization transformers

Last synced: 30 Oct 2024

https://github.com/dotchen/LAV

(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.

autonomous-driving carla-simulator cvpr2022 distillation imitation-learning perception planning prediction

Last synced: 28 Oct 2024

https://github.com/qiangsiwei/bert_distill

BERT distillation (distillation experiments based on BERT)

bert classification distillation nlp

Last synced: 02 Nov 2024

https://github.com/Sharpiless/Yolov5-distillation-train-inference

YOLOv5 distillation training | YOLOv5 knowledge distillation training, with support for training on your own data

distillation konwledge-distillation model-compression object-detection yolov5

Last synced: 09 Nov 2024

https://github.com/snap-research/r2l

[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis

deep-learning distillation mlp nerf neural-light-field novel-view-synthesis rendering

Last synced: 19 Dec 2024

https://github.com/khurramjaved96/incremental-learning

Pytorch implementation of ACCV18 paper "Revisiting Distillation and Incremental Classifier Learning."

convolutional-neural-networks distillation incremental-learning machine-learning paper-implementations pytorch

Last synced: 15 Nov 2024

https://github.com/alldbi/SuperMix

Pytorch implementation of CVPR2021 paper: SuperMix: Supervising the Mixing Data Augmentation

augmentation cvpr2021 deep-learning distillation pytorch saliency-detection supervised

Last synced: 13 Nov 2024

https://github.com/snap-research/graphless-neural-networks

[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)

deep-learning distillation efficient-inference gnn graph-algorithm graph-neural-networks knowledge-distillation pytorch scalability

Last synced: 17 Dec 2024

https://github.com/julesbelveze/bert-squeeze

🛠️ Tools for Transformers compression using PyTorch Lightning ⚡

bert deebert distillation fastbert lstm nlp pruning pytorch-lightning quantization theseus transformers

Last synced: 17 Dec 2024

https://github.com/qcraftai/distill-bev

DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)

3d-object-detection autonomous-driving bev cross-modal distillation knowledge-distillation lidar multi-camera multi-modal nuscenes point-cloud self-driving

Last synced: 28 Oct 2024

https://github.com/xiongma/roberta-wwm-base-distill

A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large.

bert distillation natural-language-processing pretrained-models roberta tensorflow

Last synced: 07 Nov 2024

https://github.com/bloomberg/minilmv2.bb

Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)

distillation language-model model-compression model-distillation python pytorch transformers

Last synced: 09 Nov 2024

https://github.com/sayakpaul/deit-tf

Includes PyTorch -> Keras model porting code for DeiT models with fine-tuning and inference notebooks.

computer-vision distillation image-recognition imagenet-1k inductive-biases keras tensorflow vision-transformers

Last synced: 22 Oct 2024

https://github.com/joisino/speedbook

Support site for the book "Speeding Up Deep Neural Networks" (深層ニューラルネットワークの高速化).

deep-learning deep-neural-networks distillation efficiency neural-networks pruning pytorch quantization

Last synced: 10 Oct 2024

https://github.com/esceptico/squeezer

Lightweight knowledge distillation pipeline

distillation knowledge-distillation model-compression pytorch

Last synced: 19 Oct 2024

https://github.com/vitae-transformer/simdistill

The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"

3d-object-detection bird-view-image deep-learning distillation simulation

Last synced: 14 Nov 2024

https://github.com/Atenrev/diffusion_continual_learning

PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.

continual-learning diffusion-models distillation generative-replay

Last synced: 30 Oct 2024

https://github.com/larry-athey/rpi-smart-still

A Raspberry Pi and Arduino/ESP32 powered smart still controller system. Designed around the Still Spirits T-500 column and boiler, but easily added to any other gas or electric still with a dephlegmator.

arduino automation distillation esp32 esp32-arduino fermentation genio homebrew hydrometer istill moonshine pot-still raspberry-pi reflux smart-still still-controller still-spirits t500

Last synced: 13 Dec 2024

https://github.com/zjcv/knowledgereview

[CVPR 2021] Distilling Knowledge via Knowledge Review

distillation feature-distillation knowledge-review pytorch zcls

Last synced: 14 Dec 2024

https://github.com/zjcv/overhaul

[ICCV 2019] A Comprehensive Overhaul of Feature Distillation

distillation feature-distillation overhaul pytorch zcls

Last synced: 14 Dec 2024

https://github.com/fightnyy/distillbart_eck

Repository for distillation of English, Chinese, Korean Multilingual BART

chinese distillation english korean mbart multilingual

Last synced: 13 Oct 2024

https://github.com/autodistill/autodistill-base-model-template

A template for use in creating Autodistill Base Model packages.

autodistill computer-vision distillation

Last synced: 08 Nov 2024

https://github.com/jakegrigsby/algorithm_distillation

A minimalist PyTorch replication of Algorithm Distillation (Laskin et al., 2022)

algorithm distillation rl

Last synced: 12 Nov 2024

https://github.com/tonywu71/distilling-and-forgetting-in-large-pre-trained-models

Code for my dissertation on "Distilling and Forgetting in Large Pre-Trained Models" for the MPhil in Machine Learning and Machine Intelligence (MLMI) at the University of Cambridge.

continual-learning distillation speech-recognition whisper

Last synced: 04 Dec 2024

https://github.com/z7zuqer/compression-pytorch

Compression experiments in PyTorch (distillation, pruning, quantization)

distillation pruning quantization

Last synced: 28 Nov 2024

https://github.com/stanleylsx/text_classifier_torch

Text classification repository built with Torch, featuring training tricks, acceleration methods, and model optimization techniques such as distillation, compression, and pruning. Supports single-label and multi-label training with customizable configurations.

distillation pretrained-models pytorch text-classification

Last synced: 09 Dec 2024

https://github.com/autodistill/autodistill-target-model-template

A template for use in creating Autodistill Target Model packages.

autodistill computer-vision distillation

Last synced: 08 Nov 2024

https://github.com/ksasi/fair-distill

Distillation of GANs with fairness constraints

distillation fairness gans

Last synced: 05 Dec 2024

https://github.com/larry-athey/airhead

An Air Still (or clone) upgrade that uses an SCR controller for the heating element and an ESP32 to make the whole unit smarter. Eliminates the constant full-power on/off switching of the heating element and the risk of scorching, and adds further capabilities.

air-still boiler-controller distillation esp32 fermentation gin moonshine rum still-spirits temperature-control vodka whiskey

Last synced: 29 Nov 2024