Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Official implementations for paper: Zero-shot Image Editing with Reference Imitation
- Host: GitHub
- URL: https://github.com/ali-vilab/MimicBrush
- Owner: ali-vilab
- License: apache-2.0
- Created: 2024-06-07T03:24:20.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-06-15T16:17:10.000Z (5 months ago)
- Last Synced: 2024-08-01T18:33:11.193Z (4 months ago)
- Topics: aigc, customization, image-composition, image-editing, texture-transfer
- Language: Python
- Homepage: https://xavierchen34.github.io/MimicBrush-Page/
- Size: 15.5 MB
- Stars: 974
- Watchers: 13
- Forks: 70
- Open Issues: 3
- Metadata Files:
  - Readme: readme.md
  - License: LICENSE
Awesome Lists containing this project
- awesome-diffusion-categorized - Code
- ai-game-devtools - MimicBrush - Zero-shot Image Editing with Reference Imitation. [arXiv](https://arxiv.org/abs/2406.07547) (Image / Tool (AI LLM))
- awesome-llm-projects - MimicBrush - Zero-shot Image Editing with Reference Imitation (Projects / 🌄 Image)
README
Zero-shot Image Editing with Reference Imitation
Xi Chen · Yutong Feng · Mengting Chen · Yiyang Wang · Shilong Zhang · Yu Liu · Yujun Shen · Hengshuang Zhao
The University of Hong Kong | Alibaba Group | Ant Group
## News
* **[2024.06.12]** Released the inference code, the local Gradio demo, and the online demo.
* **[Todo]** Release our benchmark.

## Community Contributions
[ComfyUI version](https://github.com/AIFSH/ComfyUI-MimicBrush) by [@AIFSH](https://github.com/AIFSH)

## Installation
Install with `conda`:
```bash
conda env create -f environment.yaml
conda activate mimicbrush
```
or `pip`:
```bash
# Python==3.8.5
pip install -r requirements.txt
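# Optional sanity check (not from the official instructions): confirm that
# PyTorch imports and sees a GPU; this assumes torch is listed in requirements.txt.
python -c "import torch; print(torch.__version__, 'CUDA available:', torch.cuda.is_available())"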
```

## Download Checkpoints
Download the SD-1.5 and SD-1.5-inpainting checkpoints:
* You can download them from Hugging Face: [stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) and [stable-diffusion-inpainting](https://huggingface.co/runwayml/stable-diffusion-inpainting/)
* However, those repos contain many models that are not used here, so we provide a clean version at [cleansd](https://modelscope.cn/models/xichen/cleansd/)

Download the MimicBrush checkpoint, along with a VAE, a CLIP encoder, and a depth model:
* Download the weights on ModelScope [xichen/MimicBrush](https://www.modelscope.cn/models/xichen/MimicBrush)
* The model is large because it contains two U-Nets.

You can use the following code to download them from ModelScope:
```python
from modelscope.hub.snapshot_download import snapshot_download as ms_snapshot_download

sd_dir = ms_snapshot_download('xichen/cleansd', cache_dir='./modelscope')
print('=== Pretrained SD weights downloaded ===')
model_dir = ms_snapshot_download('xichen/MimicBrush', cache_dir='./modelscope')
print('=== MimicBrush weights downloaded ===')
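# snapshot_download returns the local snapshot directory; printing the paths
# makes it easy to copy them into the inference config later.
print('SD weights directory:', sd_dir)
print('MimicBrush weights directory:', model_dir)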
```
or from Hugging Face:

```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id="xichenhku/cleansd", local_dir="./cleansd")
print('=== Pretrained SD weights downloaded ===')
snapshot_download(repo_id="xichenhku/MimicBrush", local_dir="./MimicBrush")
print('=== MimicBrush weights downloaded ===')
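# With local_dir set, huggingface_hub places the files directly under
# ./cleansd and ./MimicBrush; these are the folders to reference in the config.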
```

## Gradio Demo
First, modify `./configs/inference.yaml` to set the paths of the model weights (a small path-checking sketch follows the tutorial below). Afterwards, run the script:
```bash
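# Starts a local Gradio app; by default Gradio serves at http://127.0.0.1:7860
# unless the script overrides the host/port.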
python run_gradio3_demo.py
```

The Gradio demo looks like the UI shown below.
*Please do not forget to click "keep the original shape" if you want to conduct texture transfer, as in the third case.*
A brief tutorial:
* Upload/select a source image to edit.
* Draw the to-edit region on the source image.
* Upload/select a reference image.
* Run.
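Since the demo depends on the weight paths written into `./configs/inference.yaml`, a quick way to catch typos is to walk the config and check that every path-like entry exists on disk. The helper below is a hypothetical sketch, not part of the MimicBrush repo; it assumes PyYAML is installed and that the config stores paths as plain strings:

```python
# Hypothetical path checker for ./configs/inference.yaml (not part of the repo).
# Assumes PyYAML is available and that paths appear as plain string values.
import os
import yaml

with open('./configs/inference.yaml') as f:
    cfg = yaml.safe_load(f)

def check_paths(node, prefix=''):
    """Recursively report whether path-like string values exist on disk."""
    if isinstance(node, dict):
        for key, value in node.items():
            check_paths(value, f'{prefix}{key}.')
    elif isinstance(node, (list, tuple)):
        for i, value in enumerate(node):
            check_paths(value, f'{prefix}{i}.')
    elif isinstance(node, str) and not node.startswith('http') and (
        '/' in node or node.endswith(('.ckpt', '.bin', '.safetensors'))
    ):
        status = 'OK' if os.path.exists(node) else 'MISSING'
        print(f'{prefix.rstrip(".")}: {node} -> {status}')

check_paths(cfg)
```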
## Inference
1. Download our evaluation benchmark from Google Drive:
   * URL: [to be released]
2. Set the paths to each dataset and the checkpoints in `./config/inference.yaml`.
3. Run inference with
```bash
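# Reads the dataset and checkpoint paths configured in ./config/inference.yaml (step 2).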
python run_inference_benchmark.py
```

## Acknowledgements
This project is developed on the codebases of [IP-Adapter](https://github.com/tencent-ailab/IP-Adapter) and [MagicAnimate](https://github.com/magic-research/magic-animate). We appreciate these great works!

## Citation
If you find this codebase useful for your research, please use the following entry.
```BibTeX
@article{chen2024mimicbrush,
title={Zero-shot Image Editing with Reference Imitation},
author={Chen, Xi and Feng, Yutong and Chen, Mengting and Wang, Yiyang and Zhang, Shilong and Liu, Yu and Shen, Yujun and Zhao, Hengshuang},
journal={arXiv preprint arXiv:2406.07547},
year={2024}
}
```
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=ali-vilab/MimicBrush&type=Date)](https://star-history.com/#ali-vilab/MimicBrush&Date)