https://github.com/dudeperf3ct/end-to-end-images
This repo contains code for training and deploying PyTorch models with applications in images in end-to-end fashion.
- Host: GitHub
- URL: https://github.com/dudeperf3ct/end-to-end-images
- Owner: dudeperf3ct
- License: MIT
- Created: 2021-08-06T15:42:42.000Z (almost 4 years ago)
- Default Branch: develop
- Last Pushed: 2021-11-20T08:05:05.000Z (over 3 years ago)
- Last Synced: 2024-12-31T10:17:30.467Z (5 months ago)
- Topics: fastapi, image-classification, pytorch, triton-inference-server
- Language: Jupyter Notebook
- Homepage:
- Size: 78.8 MB
- Stars: 3
- Watchers: 2
- Forks: 1
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
# Almost end-to-end image classification
- [x] PyTorch training, evaluation, inference and benchmark code with SOTA practices (support for wandb.ai logging)
- [x] ONNX conversion, calibration and inference
- [x] TensorRT conversion and inference
- [x] Example notebook
- [ ] C++ Inference (Future release)

# Deployment
- [x] [FastAPI](https://github.com/dudeperf3ct/end-to-end-images/tree/fastapi) (`fastapi` branch) [+ Heroku deployment]
- [x] [Triton Inference Server](https://github.com/dudeperf3ct/end-to-end-images/tree/triton) (`triton` branch)

### What is this?
In this project, for a given image classification task, a large number of experiments can be run just by editing the `params.json` file.
The project supports pretraining and finetuning of `timm` models. The training code leaves room for plenty of customization, for example adding more optimizers in `_get_optimizers` or more schedulers in `_get_scheduler`.
It also includes options to convert the model to ONNX and TensorRT, with reference inference scripts for each model format.
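As a rough illustration, a `params.json` for one experiment might look like the following. Only `model_name` and `finetune_layer` are named in this README; the remaining keys are hypothetical placeholders, not the repo's actual schema:

```json
{
    "model_name": "resnet34",
    "finetune_layer": "layer4",
    "optimizer": "adam",
    "scheduler": "cosine",
    "lr": 0.0003,
    "batch_size": 64,
    "num_epochs": 10
}
```

Swapping `model_name` for another `timm` model (with a matching folder under `experiments`) is enough to launch a new experiment.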
----
### Getting Started
- How to run with a custom dataset?
	- replace `datasets_to_df` in `utils.py` with a function that returns a dataframe with 2 columns: image file paths named `file` and labels named `label`.
	- check whether `prepare_df` in `main.py` is compatible.
- Create many different models and experiments just by changing `model_name` in `params.json` (creating an appropriate folder for each model under the `experiments` folder), the `finetune_layer` parameter, or any other hyperparameter in the JSON file.

-----
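The custom-dataset hook above can be sketched as follows. Only the name `datasets_to_df` and the `file`/`label` columns come from this README; the folder layout (`data_dir/<label>/<image>`) and the function body are assumptions:

```python
from pathlib import Path

import pandas as pd


def datasets_to_df(data_dir: str) -> pd.DataFrame:
    """Build the dataframe expected by the training code.

    Assumes one sub-folder per class: data_dir/<label>/<image>.
    Returns a dataframe with exactly two columns: `file` (image
    paths as strings) and `label` (class names).
    """
    records = [
        {"file": str(img), "label": class_dir.name}
        for class_dir in sorted(Path(data_dir).iterdir())
        if class_dir.is_dir()
        for img in sorted(class_dir.glob("*"))
        if img.suffix.lower() in {".jpg", ".jpeg", ".png"}
    ]
    return pd.DataFrame(records, columns=["file", "label"])
```

Any function with this signature and output shape should slot in, as long as `prepare_df` in `main.py` accepts the resulting dataframe.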
### Example
The `Notebooks` folder contains a sample notebook that runs the `cifar10` dataset end to end.

To run inside the `Docker` container:
```bash
sudo docker build -t e2e .
sudo chmod +x run_container.sh
./run_container.sh
python3 main_cifar10.py
```

To run TensorRT inference, build its corresponding Docker image and set `do_trt_inference` to `True` in `main_cifar10.py`.
-----