https://github.com/huchenlei/comfyui_densediffusion
DenseDiffusion custom node for ComfyUI
- Host: GitHub
- URL: https://github.com/huchenlei/comfyui_densediffusion
- Owner: huchenlei
- License: apache-2.0
- Created: 2024-06-05T01:39:47.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-25T16:34:32.000Z (4 months ago)
- Last Synced: 2025-03-29T12:08:52.682Z (3 months ago)
- Language: Python
- Size: 51.8 KB
- Stars: 127
- Watchers: 1
- Forks: 12
- Open Issues: 7
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Update
- 2024/12/27: Multiple masks can now be entered as a batch.
- 2024/12/26: If no mask is provided, a solid (full-coverage) mask is used instead.
# ComfyUI_densediffusion
DenseDiffusion custom node for ComfyUI. Implements the [DenseDiffusion](https://github.com/naver-ai/DenseDiffusion)-like regional-prompting method used in the [Omost](https://github.com/lllyasviel/Omost) project.

## What this repo implements
Standard attention can be written as `y = softmax(q @ k) @ v`. DenseDiffusion manipulates the attention scores `q @ k` before the softmax, so the expression becomes `y = softmax(modify(q @ k)) @ v`.
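As a rough sketch (not this node's actual code), the modification can be thought of as an additive bias on the attention scores; `region_bias` below is a hypothetical stand-in for whatever bias the region masks produce:

```python
import torch
import torch.nn.functional as F

def modified_attention(q, k, v, region_bias):
    # q: (batch, heads, queries, dim); k, v: (batch, heads, keys, dim)
    # region_bias: (batch, heads, queries, keys) additive bias built from
    # the region masks (name and shape are illustrative assumptions)
    scale = q.shape[-1] ** -0.5
    scores = q @ k.transpose(-2, -1) * scale   # plain q @ k
    scores = scores + region_bias              # modify(q @ k)
    return F.softmax(scores, dim=-1) @ v       # softmax(...) @ v
```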
The original DenseDiffusion implementation did not perform well in my testing, so only the variant used in the Omost repo is implemented here. Refer to https://github.com/lllyasviel/Omost#regional-prompter for other regional prompting methods.

## How to use
## Limitation [IMPORTANT]
ComfyUI's attention replacements currently do not compose with each other, so this regional prompting method cannot be combined with IPAdapter. I am working on a universal model patcher to solve this issue.