# fast-neural-style :city_sunrise: :rocket:
Jingchang, this repo is forked from the original pytorch/examples.

You may need to install PyTorch.

Just tested: the code works on my computer.

Requirement: PyTorch 0.4.1 (or higher).
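
To confirm that the installed version meets this requirement, you can print it from a shell (a quick sanity check, not part of the original steps):

```bash
# print the installed PyTorch version; it should report 0.4.1 or higher
python -c "import torch; print(torch.__version__)"
```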

1) git clone https://github.com/enkiwang/style-transfer-JC.git

2) cd style-transfer-JC/neural_style/

3) Transfer the style from a **style image** (e.g., candy, mosaic, etc.) to a **content image** (e.g., your beautiful gf!).

Suppose the content image is amber.jpg and the style is candy; the corresponding pretrained model is candy.pth. Save the transferred image to output-images/ with a name such as amber_candy.jpg (or your sweet gf.jpg!), then check the generated image at output-images/amber_candy.jpg.

Enjoy!!

```
python neural_style.py eval --content-image ../images/content-images/amber.jpg --model ../saved_models/candy.pth --output-image ../images/output-images/amber_candy.jpg
```

---

Here is the original link: https://github.com/pytorch/examples/tree/master/fast_neural_style

---
This repository contains a PyTorch implementation of an algorithm for artistic style transfer. The algorithm can be used to mix the content of one image with the style of another; for example, a photograph of a door arch can be rendered in the style of a stained-glass painting.

The model uses the method described in [Perceptual Losses for Real-Time Style Transfer and Super-Resolution](https://arxiv.org/abs/1603.08155) along with [Instance Normalization](https://arxiv.org/pdf/1607.08022.pdf). The saved models for the examples shown in the README can be downloaded from [here](https://www.dropbox.com/s/lrvwfehqdcxoza8/saved_models.zip?dl=0).

## Requirements
The program is written in Python and uses [pytorch](http://pytorch.org/) and [scipy](https://www.scipy.org). A GPU is not necessary, but it can provide a significant speed-up, especially when training a new model. Regular-sized images can be styled on a laptop or desktop using the saved models.
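
A minimal install sketch, assuming pip; the exact package name and version for your platform may differ, so check the PyTorch site for platform-specific instructions:

```bash
# install the two runtime dependencies; pick the torch build that matches your CUDA setup
pip install torch scipy
```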

## Usage
Stylize image
```
python neural_style/neural_style.py eval --content-image </path/to/content/image> --model </path/to/saved/model> --output-image </path/to/output/image> --cuda 0
```
* `--content-image`: path to the content image you want to stylize.
* `--model`: saved model to be used for stylizing the image (e.g., `mosaic.pth`).
* `--output-image`: path for saving the output image.
* `--content-scale`: factor for scaling down the content image if memory is an issue (e.g., a value of 2 will halve the height and width of the content image); see the combined example after this list.
* `--cuda`: set it to 1 to run on GPU, 0 for CPU.
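
For instance, the flags can be combined as below; this assumes you run from the repository root and have downloaded the pretrained models into `saved_models/` (the file names mirror the earlier example and are only illustrative):

```bash
# style amber.jpg with the candy model, halving the input resolution to save memory, on CPU
python neural_style/neural_style.py eval --content-image images/content-images/amber.jpg --model saved_models/candy.pth --output-image images/output-images/amber_candy.jpg --content-scale 2 --cuda 0
```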

Train model
```bash
python neural_style/neural_style.py train --dataset </path/to/train-dataset> --style-image </path/to/style/image.jpg> --save-model-dir </path/to/save-model/folder> --epochs 2 --cuda 1
```

There are several command-line arguments; the important ones are listed below:
* `--dataset`: path to the training dataset; the path should point to a folder containing another folder with all the training images (see the layout sketch after this list). I used the COCO 2014 Training images dataset [80K/13GB] [(download)](http://mscoco.org/dataset/#download).
* `--style-image`: path to the style image.
* `--save-model-dir`: path to the folder where the trained model will be saved.
* `--cuda`: set it to 1 to run on GPU, 0 for CPU.
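
A sketch of the dataset layout that `--dataset` expects, using hypothetical folder and file names; the flag points at the parent folder, and the images themselves sit in a single subfolder one level down:

```
coco/                 # pass this parent folder to --dataset
└── train2014/        # all training images live in this subfolder
    ├── COCO_train2014_000000000009.jpg
    └── ...
```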

Refer to ``neural_style/neural_style.py`` for the other command-line arguments. For training new models you might have to tune the values of `--content-weight` and `--style-weight`. The mosaic style model was trained with `--content-weight 1e5` and `--style-weight 1e10`. The remaining three models were trained with weights of a similar order of magnitude, with slight variation in `--style-weight` (`5e10` or `1e11`).
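
For example, a training invocation with the weights reported above for the mosaic style (all paths are placeholders; substitute your own):

```bash
# train with the content/style weights used for the mosaic model
python neural_style/neural_style.py train --dataset </path/to/train-dataset> --style-image </path/to/mosaic-style-image.jpg> --save-model-dir </path/to/save-model/folder> --epochs 2 --content-weight 1e5 --style-weight 1e10 --cuda 1
```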

## Models

Models for the examples in this README can be downloaded from [here](https://www.dropbox.com/s/lrvwfehqdcxoza8/saved_models.zip?dl=0) or by running the script ``download_saved_models.py``.
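
For instance, run the script from the repository root; where it places the downloaded models may vary, so check the script if the files do not appear in `saved_models/`:

```bash
# fetch the pretrained models referenced in this README
python download_saved_models.py
```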