https://github.com/ryushinn/two-stream-dyntex-syn
An unofficial JAX implementation of "Two-Stream Convolutional Networks for Dynamic Texture Synthesis (CVPR'18)".
- Host: GitHub
- URL: https://github.com/ryushinn/two-stream-dyntex-syn
- Owner: ryushinn
- Created: 2024-08-03T10:09:14.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-08-03T17:59:27.000Z (5 months ago)
- Last Synced: 2024-11-10T16:38:50.062Z (about 2 months ago)
- Topics: dynamic-texture, jax, motion, texture-synthesis
- Language: Python
- Homepage:
- Size: 48.9 MB
- Stars: 0
- Watchers: 2
- Forks: 1
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# Two-Stream Convolutional Networks for Dynamic Texture Synthesis
> This is an unofficial [**JAX**](https://github.com/google/jax) implementation of "Two-Stream Convolutional Networks for Dynamic Texture Synthesis (CVPR'18)".
Please see the author's repo [here](https://github.com/tesfaldet/two-stream-dyntex-synth) and cite them:
```bib
@inproceedings{tesfaldet2018,
  author    = {Matthew Tesfaldet and Marcus A. Brubaker and Konstantinos G. Derpanis},
  title     = {Two-Stream Convolutional Networks for Dynamic Texture Synthesis},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2018}
}
```

## Notes
We require these libraries:
```bash
pip install -U "jax[cuda]" equinox optax tqdm pillow
```

Thus far, we have **not** been able to fully match the configurations used in the official repo, but it works anyway :smile:.
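As a quick sanity check (not part of the original instructions), you can confirm that the CUDA build of JAX actually sees your GPU:

```bash
python -c "import jax; print(jax.devices())"  # should list CUDA devices rather than only CPU
```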
We rewrite the appearance and motion stream networks, as well as the two-stream loss proposed in the paper, in JAX. Networks are built on top of [equinox](https://github.com/patrick-kidger/equinox).
Pre-trained weights are ported from [here (VGG)](https://github.com/tchambon/A-Sliced-Wasserstein-Loss-for-Neural-Texture-Synthesis) and [here (optical flow network)](https://github.com/IVRL/DyNCA).
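For reference, the sketch below shows the kind of Gram-matrix style loss the two streams are built on and how a synthesis step could optimize the video pixels against it. This is not the repository's actual code: `appearance_fn` and `motion_fn` are placeholders for the VGG and optical-flow feature extractors, the real loss aggregates statistics over several layers, and the optimizer setup is an assumption.

```python
import jax
import jax.numpy as jnp
import optax


def gram_matrix(feats):
    # feats: (H, W, C) feature map -> normalized (C, C) Gram matrix.
    h, w, c = feats.shape
    f = feats.reshape(h * w, c)
    return (f.T @ f) / (h * w)


def style_loss(feats, target_feats):
    # Squared difference between Gram matrices of synthesized and exemplar features.
    return jnp.mean((gram_matrix(feats) - gram_matrix(target_feats)) ** 2)


def two_stream_loss(video, exemplar, appearance_fn, motion_fn, w_app=1.0, w_dyn=1.0):
    # video, exemplar: (T, H, W, 3) frame stacks.
    # Appearance term: per-frame feature statistics (vmapped over frames).
    app = jnp.mean(
        jax.vmap(lambda x, y: style_loss(appearance_fn(x), appearance_fn(y)))(video, exemplar)
    )
    # Dynamics term: statistics of flow-network features over consecutive frame pairs.
    dyn = jnp.mean(
        jax.vmap(lambda a, b, c, d: style_loss(motion_fn(a, b), motion_fn(c, d)))(
            video[:-1], video[1:], exemplar[:-1], exemplar[1:]
        )
    )
    return w_app * app + w_dyn * dyn


def synthesis_step(video, exemplar, appearance_fn, motion_fn, opt, opt_state):
    # One gradient step on the video pixels; `opt` is any optax optimizer
    # (which optimizer and settings the repo uses is an assumption).
    loss, grads = jax.value_and_grad(two_stream_loss)(video, exemplar, appearance_fn, motion_fn)
    updates, opt_state = opt.update(grads, opt_state)
    return optax.apply_updates(video, updates), opt_state, loss
```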
## Run
```bash
python two_stream_dyntex_syn.py --exemplar_path data/fish
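# The other exemplars shown in the results table presumably run the same way (untested assumption):
# python two_stream_dyntex_syn.py --exemplar_path data/flames
# python two_stream_dyntex_syn.py --exemplar_path data/escalator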
```

## Results
|        | fish | flames | escalator |
| ------ | ------------------------------------------- | --------------------------------------------- | ------------------------------------------------ |
| Input  | ![fish exemplar](data/fish/fish.gif) | ![flames exemplar](data/flames/flames.gif) | ![escalator exemplar](data/escalator/escalator.gif) |
| Output | ![fish result](data/fish/output/animation.gif) | ![flames result](data/flames/output/animation.gif) | ![escalator result](data/escalator/output/animation.gif) |

## Last words
Thanks to everyone who put in the effort to make the repositories mentioned above public.
Bug reports are appreciated; we will fix issues as time allows.