Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/IyatomiLab/LeafGAN
- Host: GitHub
- URL: https://github.com/IyatomiLab/LeafGAN
- Owner: IyatomiLab
- License: other
- Created: 2020-07-06T13:31:06.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2023-10-16T04:42:27.000Z (about 1 year ago)
- Last Synced: 2024-08-02T15:37:37.770Z (3 months ago)
- Language: Python
- Size: 5.08 MB
- Stars: 71
- Watchers: 7
- Forks: 22
- Open Issues: 6
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
## LeafGAN — Official PyTorch Implementation
![Teaser image](media/Teaser.png)
**LeafGAN: An Effective Data Augmentation Method for Practical Plant Disease Diagnosis**
Quan Huu Cap, Hiroyuki Uga, Satoshi Kagiwada, Hitoshi Iyatomi

Paper: https://arxiv.org/abs/2002.10100
Accepted for publication in the **IEEE Transactions on Automation Science and Engineering (T-ASE)**

Abstract: *Many applications for the automated diagnosis of plant disease have been developed based on the success of deep learning techniques. However, these applications often suffer from overfitting, and the diagnostic performance is drastically decreased when used on test datasets from new environments. In this paper, we propose LeafGAN, a novel image-to-image translation system with own attention mechanism. LeafGAN generates a wide variety of diseased images via transformation from healthy images, as a data augmentation tool for improving the performance of plant disease diagnosis. Thanks to its own attention mechanism, our model can transform only relevant areas from images with a variety of backgrounds, thus enriching the versatility of the training images. Experiments with five-class cucumber disease classification show that data augmentation with vanilla CycleGAN cannot help to improve the generalization, i.e. disease diagnostic performance increased by only 0.7% from the baseline. In contrast, LeafGAN boosted the diagnostic performance by 7.4%. We also visually confirmed the generated images by our LeafGAN were much better quality and more convincing than those generated by vanilla CycleGAN.*
![Teaser image](media/Teaser_result.png)
## New Features

- Jul 25, 2021: Added a new option to load the mask images from disk. Running the LFLSeg module during training is quite slow; instead, the masks of all training images can be generated beforehand and loaded during training. See [prepare_mask.py](https://github.com/IyatomiLab/LeafGAN/blob/master/prepare_mask.py) for how to generate mask images with the pre-trained LFLSeg, and [unaligned_masked_dataset.py](https://github.com/IyatomiLab/LeafGAN/blob/master/data/unaligned_masked_dataset.py) for how the mask images are loaded. See below for how to train with this new feature.
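The idea can be sketched as follows. This is not prepare_mask.py itself: the `segment_leaf` stand-in (a crude green-dominance threshold) and the `<name>.png` naming are illustrative assumptions; the actual mask generation and image/mask pairing are defined in prepare_mask.py and unaligned_masked_dataset.py.

```python
# Sketch only: pre-compute one mask file per training image so the LFLSeg
# forward pass does not have to run inside the training loop.
from pathlib import Path

import numpy as np
from PIL import Image


def segment_leaf(img: Image.Image) -> np.ndarray:
    """Hypothetical stand-in for the pre-trained LFLSeg module:
    a crude green-dominance threshold returning a {0, 255} uint8 mask."""
    rgb = np.asarray(img.convert("RGB"), dtype=np.int16)
    leaf = (rgb[..., 1] > rgb[..., 0]) & (rgb[..., 1] > rgb[..., 2])
    return leaf.astype(np.uint8) * 255


def prepare_masks(image_dir: str, mask_dir: str) -> None:
    """Write a <name>.png mask for every image in image_dir into mask_dir."""
    out = Path(mask_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path in sorted(Path(image_dir).glob("*")):
        if path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue
        mask = segment_leaf(Image.open(path))
        Image.fromarray(mask).save(out / f"{path.stem}.png")


if __name__ == "__main__":
    # e.g. masks for trainA go into trainA_mask, matching the layout shown below
    prepare_masks("/path/to/healthy2brownspot/trainA",
                  "/path/to/healthy2brownspot/trainA_mask")
```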
## LFLSeg module
A tutorial on how to create the dataset and train the LFLSeg module is available in [LFLSeg](https://github.com/IyatomiLab/LeafGAN/tree/master/LFLSeg).

![LFLSeg_result](media/LFLSeg_infer.png)
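For intuition only, the final step of producing a binary leaf mask from a leaf-likelihood heatmap can look like the sketch below. The threshold value and the random heatmap are assumptions for illustration; the actual LFLSeg module and its training recipe are in the linked directory.

```python
# Illustration: turn a 2-D leaf-likelihood heatmap (e.g. an activation map from
# a leaf/non-leaf classifier) into a binary mask. Not the repository's code.
import numpy as np


def heatmap_to_mask(heatmap: np.ndarray, threshold: float = 0.35) -> np.ndarray:
    """Normalize a heatmap to [0, 1] and threshold it into a {0, 1} mask."""
    h = heatmap.astype(np.float32)
    h = (h - h.min()) / (h.max() - h.min() + 1e-8)
    return (h >= threshold).astype(np.uint8)


if __name__ == "__main__":
    fake_heatmap = np.random.rand(256, 256)  # stand-in for a real activation map
    mask = heatmap_to_mask(fake_heatmap)
    print(mask.shape, mask.dtype, mask.mean())
```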
## Datasets
- Normal dataset: A normal dataset will have 4 directories for two domains A (trainA, testA) and B (trainB, testB). Each directory must contain only images (no other file types).
An example of the dataset named `healthy2brownspot`
```bash
/path/to/healthy2brownspot/trainA
/path/to/healthy2brownspot/testA
/path/to/healthy2brownspot/trainB
/path/to/healthy2brownspot/testB
```
- Masked dataset: This is a normal dataset plus pre-generated mask images. First, generate your own mask images with [prepare_mask.py](https://github.com/IyatomiLab/LeafGAN/blob/master/prepare_mask.py). An example of a masked dataset named `healthy2brownspot_mask`:
```bash
/path/to/healthy2brownspot/trainA
/path/to/healthy2brownspot/trainA_mask # mask images of trainA
/path/to/healthy2brownspot/testA
/path/to/healthy2brownspot/trainB
/path/to/healthy2brownspot/trainB_mask # mask images of trainB
/path/to/healthy2brownspot/testB
```
## LeafGAN/CycleGAN train/test
- Make sure to prepare the dataset first
- Train a model (example with the dataset `healthy2brownspot`):
```bash
python train.py --dataroot /path/to/healthy2brownspot --name healthy2brownspot_leafGAN --model leaf_gan
```
- Train a model with mask images (example with the dataset `healthy2brownspot_mask`):
```bash
python train.py --dataroot /path/to/healthy2brownspot --name healthy2brownspot_leafGAN --model leaf_gan --dataset_mode unaligned_masked
```
To see more intermediate results, check out `./checkpoints/healthy2brownspot_leafGAN/web/index.html`.
- Test the model:
```bash
python test.py --dataroot /path/to/healthy2brownspot --name healthy2brownspot_leafGAN --model leaf_gan
```
- The test results will be saved to an HTML file here: `./results/healthy2brownspot_leafGAN/latest_test/index.html`.
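Once a generator has been trained, LeafGAN is intended to be used as a data-augmentation source: healthy images are translated into synthetic diseased ones and added to the classifier's training set. Below is a minimal sketch of that post-processing step, assuming generated images have already been written under the results directory; the results path, the `fake` filename filter, and the target classifier directory are assumptions that should be checked against your own run.

```python
# Hypothetical post-processing: merge LeafGAN-generated diseased images with the
# real ones to form an augmented training set for a disease classifier.
import shutil
from pathlib import Path


def collect_generated(results_dir: str, out_dir: str, keyword: str = "fake") -> int:
    """Copy generated images (filenames containing `keyword`) into out_dir."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    n = 0
    for path in sorted(Path(results_dir).rglob("*.png")):
        if keyword in path.stem:
            shutil.copy2(path, out / f"leafgan_{path.name}")
            n += 1
    return n


if __name__ == "__main__":
    n = collect_generated(
        "./results/healthy2brownspot_leafGAN/latest_test",  # assumed test output dir
        "/path/to/classifier_dataset/train/brownspot",       # hypothetical target set
    )
    print(f"added {n} generated images to the classifier training set")
```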
## Citation

```
@article{cap2020leafgan,
title = {LeafGAN: An Effective Data Augmentation Method for Practical Plant Disease Diagnosis},
author = {Quan Huu Cap and Hiroyuki Uga and Satoshi Kagiwada and Hitoshi Iyatomi},
journal = {IEEE Transactions on Automation Science and Engineering},
year = {2020},
doi = {10.1109/TASE.2020.3041499}
}
```

## Acknowledgments
Our code is inspired by [pytorch-CycleGAN](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix).