Unet pytorch on Custom dataset
- Host: GitHub
- URL: https://github.com/ajithvcoder/unet-pytorch-customdataset
- Owner: ajithvcoder
- License: gpl-3.0
- Created: 2023-04-29T14:54:20.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-05-07T08:35:00.000Z (about 2 years ago)
- Last Synced: 2025-04-08T19:48:15.394Z (about 2 months ago)
- Topics: colab-notebook, deep-learning, pytorch, segmentation, unet-image-segmentation
- Language: Jupyter Notebook
- Size: 23.6 MB
- Stars: 12
- Watchers: 1
- Forks: 7
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
# U-Net: Semantic segmentation with PyTorch - Custom dataset
**Offline Dataset preprocessing**

Make sure your data satisfies the following conditions:
- Directory tree
```
Data/
├── imgs/      (all train images)
└── masks/     (all masks; file names must match the train images)
```
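Since every mask file name must match a train image name, a quick consistency check can be sketched as follows (`unpaired_images` is a hypothetical helper, not part of this repo):

```python
def unpaired_images(img_files, mask_files):
    # Every training image needs a mask with the exact same file name.
    return sorted(set(img_files) - set(mask_files))

imgs = ["0.jpg", "1.jpg", "2.jpg"]
masks = ["0.jpg", "2.jpg"]
print(unpaired_images(imgs, masks))  # -> ['1.jpg']
```

In practice you would build the two lists from `data/imgs` and `data/masks` with `os.listdir` before training.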
- Size and color space
- Make sure all images are the same size
- Make sure all images use the RGB color space
- if you need, you can use the ```utils/resize_and_img_format.py``` file
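A minimal sketch of such a resize-and-convert step using Pillow (an assumption about what ```utils/resize_and_img_format.py``` does, not its actual code; the target size is arbitrary):

```python
from PIL import Image

def to_rgb_resized(img, size=(512, 512)):
    # Force a single color space and a single size for every image.
    # `size` is an assumed default; pick whatever your dataset uses.
    return img.convert("RGB").resize(size)

im = Image.new("L", (300, 200))   # grayscale example image
out = to_rgb_resized(im)
print(out.mode, out.size)         # RGB (512, 512)
```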
- Mask values (I have tested only these values; it may also work for multi-label masks, but you would need to adjust the classes)
- Make sure the masks contain only two values, i.e. either 0 or 255
- if you need, you can use the ```utils/convert_to_binary.py``` file

**Flood Area dataset**
I applied all of the offline dataset preprocessing steps above to this [kaggle dataset](https://www.kaggle.com/datasets/faizalkarim/flood-area-segmentation)
- data.zip holds the preprocessed images
- ```unzip data.zip```

**Installation**
- ```pip install -r requirements```
**Training**
- ```python train.py --epochs 100 --batch-size 16```
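The training loss itself is not shown here, but U-Net training on binary masks commonly tracks a Dice score; a minimal NumPy sketch over 0/255 masks (a hypothetical helper, not this repo's implementation):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-6):
    # Overlap between two binary masks stored as 0/255 arrays.
    p = np.asarray(pred) > 127
    t = np.asarray(target) > 127
    inter = np.logical_and(p, t).sum()
    return (2.0 * inter + eps) / (p.sum() + t.sum() + eps)

a = np.array([[0, 255], [255, 255]])
print(dice_coefficient(a, a))  # identical masks -> 1.0
```

A score near 1.0 means prediction and ground truth overlap almost perfectly; near 0.0 means they are disjoint.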
**Prediction**
- Visualize - ```python predict.py --model ./checkpoints/checkpoint_epoch100.pth -i ./data/imgs/0.jpg --viz --output ./0_OUT.jpg```
- You can use utils/blending.py to create blended image
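A mask-over-image blend of the kind ```utils/blending.py``` produces can be sketched in NumPy (assumed behavior; the overlay color and opacity are arbitrary choices):

```python
import numpy as np

def blend(image, mask, color=(255, 0, 0), alpha=0.5):
    # image: (H, W, 3) uint8 RGB array; mask: (H, W) array of 0/255 values.
    image = image.astype(np.float32)
    overlay = image.copy()
    overlay[mask > 127] = color                  # paint masked pixels in `color`
    out = (1 - alpha) * image + alpha * overlay  # mix original and overlay
    return out.astype(np.uint8)

img = np.zeros((2, 2, 3), dtype=np.uint8)
msk = np.array([[0, 255], [0, 255]], dtype=np.uint8)
print(blend(img, msk)[0, 1])  # masked pixel, tinted red at ~50% opacity
```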
Image and mask:

Blended segmentation:
**Colab Notebook**
- [Link](https://colab.research.google.com/drive/1aM2VOqfhwo84zSe-MS_i2JyWO8bgouqD?usp=sharing)
**Credits**
- [Pytorch-Unet](https://github.com/milesial/Pytorch-UNet)