Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/funnyplanter/CuNNy
Convolutional upscaling Neural Network, yeah!
- Host: GitHub
- URL: https://github.com/funnyplanter/CuNNy
- Owner: funnyplanter
- License: lgpl-3.0
- Created: 2024-06-17T15:00:18.000Z (5 months ago)
- Default Branch: master
- Last Pushed: 2024-08-01T22:27:31.000Z (4 months ago)
- Last Synced: 2024-08-02T06:11:41.201Z (4 months ago)
- Topics: anime, magpie, mpv, upscaling
- Language: GLSL
- Homepage:
- Size: 4.33 MB
- Stars: 13
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-mpv - CuNNy - Cute and funny CNN-based upscaler optimized for anime. (Other)
README
# CuNNy - Convolutional upscaling Neural Network, yeah!
Nice, small, and fast realtime CNN-based upscaler. Trained on visual novel
screenshots/CG. Currently very new and immature.
Supports exporting to an mpv ~~meme~~shader!
And now a Magpie effect!
# Usage
mpv shaders are found inside the `mpv/` directory.
Metric-focused variants are found inside the `results/` directory. Magpie
effects are found inside the `magpie/` directory.
The order of quality is 8x32 > 4x32 > 4x24 > 4x16 > 4x12 > 3x12 > 2x12 > fast >
faster > veryfast; the order of speed is the reverse. The `CuNNy-fast` shader
is recommended if you're on a slow machine.
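To load one of the shaders in mpv, a `mpv.conf` entry like the following should work (a minimal sketch: the `~~/shaders/` location and exact filename are assumptions, so adjust them to wherever you copied the file from `mpv/`):

```ini
# Load a CuNNy shader from mpv's config directory (path assumed)
glsl-shaders="~~/shaders/CuNNy-fast.glsl"
```

`~~/` expands to mpv's configuration directory; shaders can also be supplied on the command line via `--glsl-shaders=`.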
Variants:
- (No-suffix): Trained to be neutral and do no denoising or sharpening.
- `SOFT`: Trained to anti-alias & produce a soft output. Is probably the most
'artifact-free' variant if you don't count blur.
- `DS`: Trained to denoise & sharpen.
- `NVL`: Trained to upscale visual novel games & high-quality illustrations.

There are versions of the mpv shaders that use 8-bit `dp4a` instructions. They
can be many times faster than the standard shaders if your hardware supports
accelerated `dp4a` instructions. They require `vo=gpu-next` with
`gpu-api=vulkan` and can be found inside the `dp4a/` subdirectories.
# Training
Tested with PyTorch nightly; if any errors arise, try using nightly.

Prepare data by running `sh scripts/build.sh`, then `sh scripts/split.sh`,
then `py scripts/proc.py <128-grids>`.

To train, run `py train.py <N> <D>` where `N` is the number of internal
convolutions and `D` is the number of feature layers.

Convert the resulting model to an mpv shader by running `py mpv.py`.
Convert the resulting model to a Magpie effect by running `py magpie.py`.

Trains very fast on my machine.
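The `N`x`D` naming maps directly to model size, which is why 8x32 sits at the top of the quality ordering and 2x12 near the bottom. As a rough illustration only (the 3x3 kernels, single luma input channel, and 4-channel output for 2x pixel-shuffle are all assumptions, not the repo's actual architecture), the parameter counts can be estimated like this:

```python
# Rough parameter-count sketch for a CuNNy-style "NxD" variant.
# Assumptions (not from the repo): 3x3 kernels throughout, 1 luma input
# channel, and a final conv producing 4 channels for a 2x pixel shuffle.

def conv_params(cin, cout, k=3):
    """Weights plus biases for a single k x k convolution."""
    return cin * cout * k * k + cout

def variant_params(n, d):
    """Input conv + n internal DxD convs + output conv to 4 subpixels."""
    total = conv_params(1, d)       # input: luma -> D features
    total += n * conv_params(d, d)  # N internal convolutions
    total += conv_params(d, 4)      # output: D -> 2x2 pixel shuffle
    return total

print(variant_params(2, 12))  # -> 3172
print(variant_params(8, 32))  # -> 75460
```

Under these assumptions an 8x32 variant carries over 20x the weights of a 2x12 one, which matches the quality/speed ordering above.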
# Quality
See `results/`.
# License
LGPL v3