https://github.com/ambidextrous9/quantization-of-models-ptq-and-qat
Quantization of Models: Post-Training Quantization (PTQ) and Quantization-Aware Training (QAT)
- Host: GitHub
- URL: https://github.com/ambidextrous9/quantization-of-models-ptq-and-qat
- Owner: ambideXtrous9
- Created: 2024-05-07T15:30:26.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-12-29T09:38:00.000Z (10 months ago)
- Last Synced: 2025-01-11T21:32:45.484Z (9 months ago)
- Topics: keras, ptq, pytorch, pytorch-implementation, qat, quantization, quantization-aware-training, tflite, tflite-models
- Language: Jupyter Notebook
- Homepage:
- Size: 5.1 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Quantization
Link: [Quantization : PTQ and QAT on CNN using Keras](https://www.kaggle.com/code/sushovansaha9/quantization-ptq-and-qat-on-cnn-using-keras/notebook)
#### **Quantization is a model-size reduction technique that converts model weights from a high-precision floating-point representation to a lower-precision floating-point (FP) or integer (INT) representation, such as 16-bit or 8-bit.**
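
To make the mapping concrete, here is a minimal NumPy sketch (not from the repository) of affine int8 quantization: floats are rescaled into the int8 range via a scale and zero-point, and approximately recovered on dequantization.

```python
import numpy as np

def quantize_int8(x):
    """Map float values to int8 with an affine (scale + zero-point) scheme."""
    scale = (x.max() - x.min()) / 255.0               # int8 has 256 levels
    zero_point = int(np.round(-x.min() / scale)) - 128
    q = np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale, zp = quantize_int8(weights)
print("max reconstruction error:", np.abs(weights - dequantize(q, scale, zp)).max())
```

The round-trip error per element is bounded by about half the scale, which is why accuracy usually degrades only slightly at 8 bits.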


### **Post-Training Quantization (PTQ)**
Post-training quantization (PTQ) quantizes a model after it has been trained, with no retraining. Weights (and, for full-integer quantization, activations) are converted to lower precision, typically using a small calibration dataset to estimate the value ranges needed for scaling.
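
A minimal PTQ sketch using the TensorFlow Lite converter (the toolchain used in the linked Keras notebook); the tiny untrained model and random calibration data below are stand-ins for a real trained model and dataset.

```python
import numpy as np
import tensorflow as tf

# Stand-in model; any trained tf.keras.Model is converted the same way.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Calibration: yield a few representative inputs so the converter can
# estimate activation ranges (random data here stands in for real samples).
def representative_dataset():
    for _ in range(100):
        yield [np.random.rand(1, 28, 28, 1).astype(np.float32)]

converter.representative_dataset = representative_dataset
tflite_quant_model = converter.convert()
with open("model_ptq.tflite", "wb") as f:
    f.write(tflite_quant_model)
```
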
### **Quantization-Aware Training (QAT)**
Quantization-aware training (QAT) takes a trained model (often the PTQ model) and fine-tunes it with quantization in mind: the quantization process (scaling, clipping, and rounding) is simulated inside the training loop, so the model learns weights that retain their accuracy even after quantization.
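
A minimal QAT sketch, assuming the TensorFlow Model Optimization Toolkit (`tensorflow_model_optimization`); the model and the random training data are stand-ins for the repository's actual CNN and dataset.

```python
import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Insert fake-quantization ops: forward passes now simulate int8
# scaling/clipping/rounding while gradients still flow in float.
qat_model = tfmot.quantization.keras.quantize_model(model)
qat_model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# Fine-tune briefly so the weights adapt to the quantization noise.
x = np.random.rand(256, 28, 28, 1).astype(np.float32)
y = np.random.randint(0, 10, size=(256,))
qat_model.fit(x, y, epochs=1, batch_size=32)

# Convert to a quantized TFLite model, as in the PTQ example above.
converter = tf.lite.TFLiteConverter.from_keras_model(qat_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_qat_model = converter.convert()
```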

#### References:
1. [QAT PyTorch](https://github.com/fbsamples/pytorch-quantization-workshop)
2. [QAT Details](https://towardsdatascience.com/inside-quantization-aware-training-4f91c8837ead)