https://github.com/aidinhamedi/pytorch-img-classification-trainer
A PyTorch image classification training system, easy to use (relatively 😅). Supports TensorBoard logging and techniques like "Gradient centralization" and "Adaptive gradient clipping".
- Host: GitHub
- URL: https://github.com/aidinhamedi/pytorch-img-classification-trainer
- Owner: AidinHamedi
- License: MIT
- Created: 2025-01-05T10:01:39.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-03-14T19:07:59.000Z (3 months ago)
- Last Synced: 2025-03-14T20:23:17.527Z (3 months ago)
- Language: Python
- Homepage:
- Size: 547 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Pytorch Image Classification Trainer
[License: MIT](https://opensource.org/licenses/MIT) · [Ruff](https://github.com/astral-sh/ruff)

**A PyTorch image classification training system, easy to use (relatively 😅)**

Supports **TensorBoard** logging and techniques like **Gradient centralization** and **Adaptive gradient clipping**.

## 🚀 Getting started
### Step 1: Clone the repository
```bash
git clone https://github.com/AidinHamedi/Pytorch-Img-Classification-Trainer.git
```

### Step 2: Install the requirements
This repo uses [**uv**](https://github.com/astral-sh/uv) to manage its dependencies.
```bash
uv sync
```

### Step 3: Check out the example
There is an example already included that uses this system to run experiments (hyperparameter tuning). The experiment parameters are set in the `./expers.toml` file, and the experiment runner is `./run_expers.py`.
The experiment runner calls the `train` function in `./train_exper.py` with the experiment params as its argument; inside that function you can define what the params do.
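As a purely illustrative sketch (the table and key names below are assumptions, not the file's actual schema), an entry in `./expers.toml` might look something like this:

```toml
# Hypothetical schema; see the real ./expers.toml for the actual keys.
[baseline]
batch_size = 64
learning_rate = 0.001

[heavy_augmentation]
batch_size = 64
learning_rate = 0.001
augment_strength = 0.8
```

You would then launch all the experiments with something like `uv run run_expers.py`.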
In the end you will see that the `fit` function of the `Training_Engine` is called, and that is where the magic happens 😊.

## 📚 Documentation
### Training Engine
You can access the main `fit` function from the `./Training_Engine/trainer.py` file.
The `fit` function takes the following **required** arguments:

- `model`: The model to be trained.
- `train_dataloader`: The training data loader. (**DynamicArg**)
- `test_dataloader`: The test data loader. (**DynamicArg**)
- `optimizer`: The optimizer to be used for training.
- `loss_fn`: The loss function to be used for training.

All of the other arguments are optional and are used for things like setting up mixed precision. A rough usage sketch follows below.
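Here is a minimal sketch of a training run, assuming the import paths follow the file layout above; `MyModel`, `make_train_dl`, and `make_test_dl` are hypothetical placeholders, and the `DynamicArg` wrapping is explained in the next section:

```python
import torch
from torch import nn

# Import paths assumed from the repo layout described above.
from Training_Engine.trainer import fit
from Training_Engine.Utils.Base.dynamic_args import DynamicArg

model = MyModel()  # hypothetical torch.nn.Module
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

fit(
    model=model,
    # The dataloaders are wrapped as DynamicArgs (see the next section);
    # the exact constructor signature is an assumption.
    train_dataloader=DynamicArg(make_train_dl),
    test_dataloader=DynamicArg(make_test_dl),
    optimizer=optimizer,
    loss_fn=loss_fn,
    # All remaining arguments (mixed precision, etc.) are optional.
)
```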
You have probably noticed that `train_dataloader` and `test_dataloader` are **DynamicArg**s. So what is a DynamicArg?

### DynamicArg
A DynamicArg is a special type of argument: instead of a fixed value, you pass in a generator function, and it outputs a value based on the current environment.
For example, you can make the `train_dataloader` DynamicArg return a PyTorch dataloader that adjusts the augmentation amount based on the epoch count.
It's not that complicated; just by looking at the code you will understand it. You can import dynamic args from `./Training_Engine/Utils/Base/dynamic_args.py`. A sketch of the idea follows.
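In this rough sketch, the `DynamicArg` constructor signature and the way the engine passes the epoch are assumptions; the real interface lives in `./Training_Engine/Utils/Base/dynamic_args.py`:

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

from Training_Engine.Utils.Base.dynamic_args import DynamicArg

def make_train_dl(epoch: int = 0) -> DataLoader:
    # Ramp the augmentation strength up with the epoch count.
    strength = min(0.1 + 0.05 * epoch, 0.5)
    transform = transforms.Compose([
        transforms.RandomResizedCrop(224, scale=(1.0 - strength, 1.0)),
        transforms.ColorJitter(brightness=strength, contrast=strength),
        transforms.ToTensor(),
    ])
    # FakeData stands in for a real image dataset here.
    dataset = datasets.FakeData(size=512, image_size=(3, 224, 224),
                                num_classes=10, transform=transform)
    return DataLoader(dataset, batch_size=64, shuffle=True)

# Hypothetical wrapping: the engine re-invokes the generator function
# as the environment (e.g. the epoch count) changes.
train_dataloader = DynamicArg(make_train_dl)
```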
### Utils
There are also some utils bundled with the training engine that you can use for things like loading images (a hypothetical usage sketch follows the list):
- `./Training_Engine/Utils/Data/data_loader.py`: A utility for loading images from a directory, or for making a PyTorch dataset that loads large datasets on the fly.
- `./Training_Engine/Utils/Data/normalization.py`: A utility for normalizing images and computing class weights.
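The exact function names in these modules are not documented here, so the snippet below is purely hypothetical; it only illustrates how the two utilities would typically fit together:

```python
from torch import nn

# Hypothetical names; the real APIs live in the two files listed above.
from Training_Engine.Utils.Data.data_loader import load_dir
from Training_Engine.Utils.Data.normalization import calc_class_weights

images, labels = load_dir("./dataset/train")        # load images from a directory
class_weights = calc_class_weights(labels)          # weight rare classes higher
loss_fn = nn.CrossEntropyLoss(weight=class_weights)
```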

## 📝 License
Copyright (c) 2025 Aidin Hamedi

This software is released under the MIT License.
https://opensource.org/licenses/MIT