https://github.com/softwaremill/sentinel-cgan
Sentinel generative conditional adversarial network implementation
- Host: GitHub
- URL: https://github.com/softwaremill/sentinel-cgan
- Owner: softwaremill
- Created: 2019-12-15T21:24:59.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2022-12-21T04:47:14.000Z (about 3 years ago)
- Last Synced: 2025-07-30T22:23:48.365Z (6 months ago)
- Topics: gan, gis, machine-learning, satellite-imagery
- Language: Python
- Homepage:
- Size: 18.8 MB
- Stars: 5
- Watchers: 3
- Forks: 3
- Open Issues: 1
Metadata Files:
- Readme: README.md
# Sentinel cGAN

Data augmentation facility used during a research project on the [modifiable areal unit problem](https://en.wikipedia.org/wiki/Modifiable_areal_unit_problem). Read more in our article on Medium: [Generative adversarial networks in satellite image datasets augmentation](https://blog.softwaremill.com/generative-adversarial-networks-in-satellite-image-datasets-augmentation-b7045d2f51ab).
## Usage
### Data
Sample data can be downloaded from our [S3](https://sml-ml-data-sets.s3.eu-central-1.amazonaws.com/sentinel-cgan/bdot.tar.gz) bucket or by running the `scgan/data_download.py` script.
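For reference, the download-and-unpack step can be sketched with the standard library alone; the bucket URL comes from the link above, while the helper name and destination directory are our own (hypothetical) choices, not part of `scgan/data_download.py`:

```python
import tarfile
import urllib.request
from pathlib import Path

# URL of the sample archive from the README; the rest of this helper is a
# hypothetical sketch, not the repository's actual download script.
DATA_URL = "https://sml-ml-data-sets.s3.eu-central-1.amazonaws.com/sentinel-cgan/bdot.tar.gz"

def download_and_extract(url: str, dest: Path) -> Path:
    """Download a .tar.gz archive (unless already present) and unpack it under dest."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / Path(url).name
    if not archive.exists():  # skip re-downloading on repeated runs
        urllib.request.urlretrieve(url, archive)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)
    return dest

if __name__ == "__main__":
    download_and_extract(DATA_URL, Path("dataset"))
```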
To produce your own data you can use `scgan/gdal_operations.py`. Please note that three inputs are needed: a TIFF representing land cover, a TIFF with the satellite image, and a generated grid in the form of an ArcGIS shapefile.
The dataset has to meet the following criteria in terms of directory structure:
```
dataset (name of the dataset)
├── train (samples used during training)
│   ├── data_descriptor.csv (names / ids of the files)
│   ├── LC (land cover data folder)
│   │   ├── LC_1.tif
│   │   ├── LC_2.tif
│   │   ├── ...
│   │   └── LC_n.tif
│   └── S (satellite image data folder)
│       ├── S_1.tif
│       ├── S_2.tif
│       ├── ...
│       └── S_n.tif
├── plot (samples used for intermediate result plotting after each epoch)
│   ├── data_descriptor.csv (names / ids of the files)
│   ├── LC (land cover data folder)
│   │   ├── LC_1.tif
│   │   ├── ...
│   │   └── LC_n.tif
│   └── S (satellite image data folder)
│       ├── S_1.tif
│       ├── ...
│       └── S_n.tif
└── test (samples used during the predict phase)
    ├── data_descriptor.csv (names / ids of the files)
    ├── LC (land cover data folder)
    │   ├── LC_1.tif
    │   ├── ...
    │   └── LC_n.tif
    └── S (satellite image data folder)
        ├── S_1.tif
        ├── ...
        └── S_n.tif
```
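A quick way to confirm a dataset matches this layout is to check that each split has its descriptor and that every `LC_*.tif` has a matching `S_*.tif`. The following sketch is not part of the repository; the function name and error messages are our own:

```python
from pathlib import Path

# Hypothetical validation helper for the directory layout above: each split
# must contain data_descriptor.csv plus LC/S GeoTIFF pairs sharing an id.
SPLITS = ("train", "plot", "test")

def validate_dataset(root):
    problems = []
    for split in SPLITS:
        split_dir = Path(root) / split
        if not (split_dir / "data_descriptor.csv").is_file():
            problems.append(f"{split}: missing data_descriptor.csv")
        lc_ids = {p.stem[len("LC_"):] for p in (split_dir / "LC").glob("LC_*.tif")}
        s_ids = {p.stem[len("S_"):] for p in (split_dir / "S").glob("S_*.tif")}
        for sample in sorted(lc_ids ^ s_ids):  # ids present in only one folder
            problems.append(f"{split}: unpaired sample {sample}")
    return problems
```

An empty result means the layout is consistent; each reported string names the offending split and sample id.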
### Train
The default training configuration can be run from `scgan/train.py`; the default dataset is called `bdot`. Please note that the chosen hyperparameters were tuned to perform best on the sample dataset, which covers central Poland and uses Sentinel-2 imagery.
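The overall training scheme follows the Pix2Pix recipe referenced below: alternate discriminator and generator updates per batch, then plot intermediate results after each epoch. This skeleton is only illustrative; `update_d`, `update_g`, and `plot_fn` are hypothetical stand-ins for the real Keras `train_on_batch` and plotting calls in `scgan/train.py`:

```python
# Schematic sketch of a Pix2Pix-style cGAN training loop (not the actual
# scgan/train.py code). Each batch pairs a land-cover mask with its
# corresponding satellite image.
def train_gan(batches, epochs, update_d, update_g, plot_fn=None):
    history = []
    for epoch in range(epochs):
        for land_cover, satellite in batches:
            # discriminator: distinguish real (LC, S) pairs from (LC, G(LC))
            d_loss = update_d(land_cover, satellite)
            # generator: fool the discriminator while staying L1-close to S
            g_loss = update_g(land_cover, satellite)
            history.append((d_loss, g_loss))
        if plot_fn is not None:
            plot_fn(epoch)  # intermediate results rendered from the `plot` split
    return history
```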
### Predict
To generate artificial satellite images from a predefined mask, use `scgan/predict.py`. If you did not train a model yourself, you can download one of ours from [S3](https://sml-ml-data-sets.s3.eu-central-1.amazonaws.com/sentinel-cgan/generator.h5). Masks have to be placed in the relevant dataset's `test` subdirectory.
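The predict phase amounts to iterating over the masks in `test/LC` and writing one generated image per mask. This sketch is hypothetical: `generate` stands in for the Keras generator loaded from `generator.h5`, and the output folder name is our own choice:

```python
from pathlib import Path

# Hypothetical outline of the predict phase (not the actual scgan/predict
# code): feed every LC_*.tif mask in the test split through a generator
# callable and save the result under a sibling output folder.
def predict_all(dataset_root, generate, out_name="generated"):
    test_dir = Path(dataset_root) / "test"
    out_dir = test_dir / out_name
    out_dir.mkdir(exist_ok=True)
    written = []
    for mask in sorted((test_dir / "LC").glob("LC_*.tif")):
        fake_image = generate(mask.read_bytes())  # stand-in for model.predict
        target = out_dir / mask.name.replace("LC_", "S_", 1)
        target.write_bytes(fake_image)
        written.append(target)
    return written
```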

# References
1. [Original Pix2Pix paper](https://arxiv.org/abs/1611.07004)
1. [Tips on training GANs](https://medium.com/@utk.is.here/keep-calm-and-train-a-gan-pitfalls-and-tips-on-training-generative-adversarial-networks-edd529764aa9)
1. [Reference implementation using Keras](https://machinelearningmastery.com/how-to-develop-a-pix2pix-gan-for-image-to-image-translation/)