BlenderImageLUT
---
Real-time image-based color grading in Blender's compositor.
# WARNING
Only LDR sRGB LUTs are supported. There is NO correctness guarantee for this implementation, and it is currently *NOT* color correct. Use at your own risk.

# Usage
## Loading the Nodes
- [Download `BlenderImageLut.blend`](https://github.com/mos9527/blender-image-lut/raw/refs/heads/main/BlenderImageLUT.blend)
- In your Blender project, go to `File > Append...`
*(screenshot)*

- Open the downloaded `.blend` file and go to `NodeTree`, where you will find the node group **`BlenderImageLUT`**. Click `Append`.
*(screenshot)*

- In the `Compositor` tab, enable `Use Nodes`, search for `BlenderImageLUT` with F3, and add it to your node tree.
*(screenshot)*

## Loading the LUT image
- Add an `Image` node with F3, then load the LUT image of your choice **and set the `LUT Dimension` ($D$) properly**.
- **ATTENTION:** Please refer to the [Notes](#notes) section for what kind of LUT image you should be using.
- Ensure your image transform is linear. Again, **read the [Notes](#notes) section** for more information.
- Connect the nodes. Your final node setup should look like this (a scripted `bpy` equivalent is sketched after these steps):
*(screenshot)*

- To view the result, you can enable the compositor under the `Viewport Shading` tab and set the compositor option to `Always`, as shown below.
*(screenshot)*
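
For scripting-heavy workflows, roughly the same setup can be assembled with `bpy`. This is a hedged sketch only: the node-group socket indices, the placeholder image path, and the choice of `Non-Color` colorspace are assumptions, and the compositor Python API differs slightly across Blender versions.

```python
# Hedged sketch: building the node setup above from Python (bpy).
# Socket indices, file paths, and the colorspace choice are assumptions.
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
nodes, links = tree.nodes, tree.links

# Image node holding the LUT strip; keep the data free of a display transform
# (see the Notes section on colorspace).
lut_img = bpy.data.images.load("/path/to/your_lut.png")  # placeholder path
lut_img.colorspace_settings.name = "Non-Color"
img_node = nodes.new("CompositorNodeImage")
img_node.image = lut_img

# The BlenderImageLUT node group must already be appended, as in the steps above.
lut_group = nodes.new("CompositorNodeGroup")
lut_group.node_tree = bpy.data.node_groups["BlenderImageLUT"]

# Wire Render Layers -> BlenderImageLUT -> Composite.
rl = nodes.new("CompositorNodeRLayers")
comp = nodes.new("CompositorNodeComposite")
links.new(rl.outputs["Image"], lut_group.inputs[0])        # assumed: first input = image
links.new(img_node.outputs["Image"], lut_group.inputs[1])  # assumed: second input = LUT
links.new(lut_group.outputs[0], comp.inputs["Image"])
```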

# Notes
## LUT Dimension
Your 3D LUT transform should be a **2D image** with pixel dimensions $(D^2, D)$, where $D$ is the **uniform size in pixels of your LUT**.

The 3D texture should be swizzled onto the 2D plane like this:
```
Numbers indicate the index of the Z-slice
┌───────┬───────┬───────┬───────┐
│       │       │       │       │
│   1   │   2   │   3   │  ..n  │
│       │       │       │       │
└───────┴───────┴───────┴───────┘
```
For example, here's a *neutral* one with $D=16$, generated by [lutgen.py](https://github.com/mos9527/BlenderImageLUT/blob/main/lutgen.py):

![image](https://github.com/user-attachments/assets/60dac6c8-bfb6-45dd-8017-da5003cbc777)
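
If you want to build such a neutral strip yourself, the sketch below shows one way to do it with NumPy. The channel ordering used here ($R$ advancing across each slice, $G$ down, $B$ selecting the slice) is my reading of the diagrams below; treat lutgen.py as the authoritative generator.

```python
# Illustrative sketch: a neutral (identity) LUT strip of dimension D.
# The channel layout is an assumption; consult lutgen.py for the real one.
import numpy as np

D = 16
lut = np.zeros((D, D * D, 3), dtype=np.float32)  # height D, width D*D (D slices side by side)

for b in range(D):              # Z-slice index (blue)
    for g in range(D):          # row inside a slice (green)
        for r in range(D):      # column inside a slice (red)
            lut[g, b * D + r] = (r / (D - 1), g / (D - 1), b / (D - 1))

# Save as an 8-bit PNG, e.g. with Pillow:
# from PIL import Image
# Image.fromarray((lut * 255).round().astype(np.uint8)).save("neutral_lut_16.png")
```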

The $R,G,B$ channels should advance in value in the UV (pixel) direction shown in the following diagrams:
- $R$ channel
*(diagram)*

- $G$ channel
*(diagram)*

- $B$ channel
*(diagram)*

## Colorspace (with sRGB LUTs)
LUTs **MUST map from Linear (pixel coords) to Linear (colors)**, since Blender's compositor works in linear colorspace at all times.

[With PNG in sRGB colorspace, the transfer function is approximately a power function with gamma 2.2](http://www.libpng.org/pub/png/spec/1.2/png-1.2-pdg.html#C.Anc-color). The following node setup converts the transformed (sRGB -> Linear) colors back to sRGB (encoded, linear in UV) colors:
*(screenshot)*
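
For reference, that conversion boils down to the gamma-2.2 power functions below (a simplification of the exact sRGB transfer function):

```python
# Gamma-2.2 approximation of the sRGB transfer functions discussed above.
def srgb_to_linear(c):
    # Decode: applied when an sRGB-encoded value is brought into linear space.
    return c ** 2.2

def linear_to_srgb(c):
    # Encode: the conversion back to sRGB performed by the node setup above.
    return c ** (1.0 / 2.2)
```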

Another setup like this also works; it skips the color conversions and uses the sample as-is:
*(screenshot)*

## Transposing LUTs of different dimensions
(todo)

# How it works
It works by implementing bilinear filtering and 3D texture sampling from scratch with compositor nodes, performing a LERPed color look-up at runtime.

Generally there will be 8 texture lookups for each pixel, which is expensive. Use this node setup sparingly, or only at render time!
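
In plain Python, the per-pixel lookup the node group reproduces looks roughly like the sketch below. It assumes the same $(D^2, D)$ strip layout as in the Notes section and is purely illustrative, not code from this repository.

```python
# Sketch of a LERPed (trilinear) 3D LUT lookup over a (D, D*D, 3) strip image:
# 8 texture lookups plus 7 linear interpolations per pixel.
import numpy as np

def lerp(a, b, t):
    return a + (b - a) * t

def sample_lut(lut, rgb, D):
    """lut: (D, D*D, 3) float array, rgb: linear color in [0, 1]."""
    f = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (D - 1)
    i0 = np.floor(f).astype(int)      # lower cell corner
    i1 = np.minimum(i0 + 1, D - 1)    # upper cell corner, clamped
    t = f - i0                        # fractional position inside the cell

    def texel(r, g, b):               # one of the 8 texture lookups
        return lut[g, b * D + r]      # assumed strip layout (see Notes)

    # Interpolate along R, then G, then B.
    c00 = lerp(texel(i0[0], i0[1], i0[2]), texel(i1[0], i0[1], i0[2]), t[0])
    c10 = lerp(texel(i0[0], i1[1], i0[2]), texel(i1[0], i1[1], i0[2]), t[0])
    c01 = lerp(texel(i0[0], i0[1], i1[2]), texel(i1[0], i0[1], i1[2]), t[0])
    c11 = lerp(texel(i0[0], i1[1], i1[2]), texel(i1[0], i1[1], i1[2]), t[0])
    c0 = lerp(c00, c10, t[1])
    c1 = lerp(c01, c11, t[1])
    return lerp(c0, c1, t[2])
```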

# References
- Real Time Rendering 4th Edition
- https://github.com/mos9527/sssekai_blender_io
- https://docs.blender.org/manual/en/latest/render/color_management.html