Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mosessoh/iconcolor
Automatic icon colorization using deep convolutional neural networks. "Towards Icon Design Using Machine Learning." In Stanford CS229, Fall 2017.
- Host: GitHub
- URL: https://github.com/mosessoh/iconcolor
- Owner: mosessoh
- Created: 2017-12-06T08:12:25.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2017-12-13T09:19:11.000Z (almost 7 years ago)
- Last Synced: 2024-08-02T15:06:30.511Z (4 months ago)
- Topics: cnn, colorization, deep-learning, icons, machine-learning, pytorch, unet
- Language: Python
- Homepage:
- Size: 1.16 MB
- Stars: 37
- Watchers: 5
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Towards Icon Design Using Machine Learning
![icon conversion](/assets/conversion.png)
[[video]](https://youtu.be/rFnMdFkjpAE) [[medium post]](https://medium.com/@moses.soh/towards-automatic-icon-design-using-machine-learning-423cbe6710fe) [[poster]](#poster)
## Introduction
I created a model that learns to turn outlines into stylish, colorful icons. Icon and logo design is difficult: expert human designers choose from line weights, colors, textures and shapes to create beautiful icons such as [these (I'm a big fan)](https://dribbble.com/yoga). But there seems to be a pattern to how each designer makes these choices. So I decided it would be interesting to train a model that learns a designer's style, takes any freely available icon outline (e.g. from [the Noun Project](https://thenounproject.com/)), and colors and styles it exactly as that designer would, completely automatically.
The icon generator is a convolutional neural network, a U-Net, trained on an icon set from [Smashicons](https://smashicons.com). I optimized the generator against an L1 loss and an adversarial loss under a Conditional Generative Adversarial Network (cGAN) setup. A quick view of the results:
![before and after](/assets/before_after.gif)
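To make the cGAN objective concrete, here is a minimal NumPy sketch of the combined generator loss: an adversarial term (the generator wants the discriminator to label its output as real) plus a weighted L1 term. The λ weighting and its value of 100 are assumptions in the style of pix2pix-type setups, not values taken from this repository:

```python
import numpy as np

def bce_with_logits(logits, targets):
    # Numerically stable binary cross-entropy on raw logits,
    # mirroring the behavior of PyTorch's BCEWithLogitsLoss.
    return np.mean(np.maximum(logits, 0) - logits * targets
                   + np.log1p(np.exp(-np.abs(logits))))

def generator_loss(fake_logits, generated, target, lam=100.0):
    # Adversarial term: generator wants D to output "real" (label 1).
    adv = bce_with_logits(fake_logits, np.ones_like(fake_logits))
    # L1 term keeps the colorization close to the ground-truth icon.
    l1 = np.mean(np.abs(generated - target))
    return adv + lam * l1

# Toy example: a 2x2 "icon" and a single discriminator logit.
gen = np.array([[0.2, 0.8], [0.5, 0.1]])
gt = np.array([[0.0, 1.0], [0.5, 0.0]])
loss = generator_loss(np.array([0.0]), gen, gt)
```

The L1 term pulls colors toward the ground truth, while the adversarial term pushes outputs toward the "real icon" distribution, which is what encourages the more vibrant colorizations mentioned below.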
![poster](/assets/poster.svg)
The poster above summarizes the technical approach of this project. The image below showcases the performance of the best model on test-set images from the Smashicons dataset.
![test](/assets/test.png)
## How to use
### Clone the repo
```
git clone https://github.com/mosessoh/iconcolor
```

### Download pre-trained models
```
cd iconcolor
cd models
python fetch_models.py
```

### Inference
```
python color_icon.py assets/demo.png
```

The `color_icon.py` script loads the pre-trained generator stored in `model/outline2yellow_generator_gan.pth` and uses it to colorize an input icon. This is the generator trained against the L1 and adversarial losses. If you're getting funky colorizations (the adversarial loss encourages more vibrant colors), the weights for the L1-only generator are at `model/outline2yellow_generator.pth`. Note that the model expects a 1 x 1 x 128 x 128 input and saves the output at `assets/output.png`. If your setup is correct, you should get the icon at the start of the "Introduction" (the outline icon is from [IconBros](https://www.iconbros.com/) and the colored icon is produced by the model).
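As a rough illustration of the 1 x 1 x 128 x 128 input shape the model expects, preprocessing a grayscale outline could look like the NumPy sketch below. The nearest-neighbor resize and the [0, 1] scaling are assumptions for illustration; the actual script may load and normalize images differently:

```python
import numpy as np

def prepare_icon(gray, size=128):
    """Turn a 2-D grayscale image (values 0-255) into a
    batch x channel x height x width float array in [0, 1]."""
    h, w = gray.shape
    # Nearest-neighbor resize to size x size (a stand-in for a real resizer).
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = gray[rows][:, cols].astype(np.float32) / 255.0
    # Add batch and channel dimensions: (1, 1, size, size).
    return resized[None, None, :, :]

icon = np.random.randint(0, 256, size=(200, 180))
x = prepare_icon(icon)  # shape (1, 1, 128, 128)
```

The leading two singleton dimensions are the batch and channel axes that convolutional frameworks such as PyTorch expect.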
### Training
The `train_model.py` file contains the training script, which trains the discriminator using `BCEWithLogitsLoss` and the generator against the L1 and adversarial losses. The `model/outline2yellow_discriminator.pth` and `model/outline2yellow_generator_gan.pth` files contain useful checkpoints, so your discriminator and generator do not need to start from scratch.
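For intuition, the discriminator objective optimized with `BCEWithLogitsLoss` can be sketched in NumPy as below: real outline/icon pairs are labeled 1 and generated pairs 0. The 0.5 averaging of the two terms is an assumption for illustration, not necessarily how `train_model.py` weighs them:

```python
import numpy as np

def bce_with_logits(logits, targets):
    # Stable binary cross-entropy on raw logits (mirrors BCEWithLogitsLoss).
    return np.mean(np.maximum(logits, 0) - logits * targets
                   + np.log1p(np.exp(-np.abs(logits))))

def discriminator_loss(real_logits, fake_logits):
    # Real icon pairs should be classified as 1, generated pairs as 0.
    real_term = bce_with_logits(real_logits, np.ones_like(real_logits))
    fake_term = bce_with_logits(fake_logits, np.zeros_like(fake_logits))
    return 0.5 * (real_term + fake_term)

# Toy logits: D is fairly confident on two real and two fake samples.
loss = discriminator_loss(np.array([2.0, 1.5]), np.array([-1.0, -2.5]))
```

In a cGAN training loop, this loss updates the discriminator, after which the generator is updated against its own L1-plus-adversarial objective, alternating each batch.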