{"id":13564911,"url":"https://github.com/junyanz/interactive-deep-colorization","last_synced_at":"2025-05-15T07:04:12.107Z","repository":{"id":37276437,"uuid":"90598753","full_name":"junyanz/interactive-deep-colorization","owner":"junyanz","description":"Deep learning software for colorizing black and white images with a few clicks.","archived":false,"fork":false,"pushed_at":"2022-07-29T10:15:32.000Z","size":12125,"stargazers_count":2705,"open_issues_count":32,"forks_count":448,"subscribers_count":121,"default_branch":"master","last_synced_at":"2025-04-14T10:42:48.537Z","etag":null,"topics":["automatic-colorization","caffe","colorization","computer-vision","deep-learning","deep-learning-algorithms","interactive"],"latest_commit_sha":null,"homepage":"https://richzhang.github.io/ideepcolor/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/junyanz.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-05-08T07:27:47.000Z","updated_at":"2025-04-13T03:50:47.000Z","dependencies_parsed_at":"2022-07-20T11:47:43.637Z","dependency_job_id":null,"html_url":"https://github.com/junyanz/interactive-deep-colorization","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/junyanz%2Finteractive-deep-colorization","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/junyanz%2Finteractive-deep-colorization/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/junyanz%2Finteractive-deep-colorization/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/host
s/GitHub/repositories/junyanz%2Finteractive-deep-colorization/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/junyanz","download_url":"https://codeload.github.com/junyanz/interactive-deep-colorization/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254292039,"owners_count":22046426,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["automatic-colorization","caffe","colorization","computer-vision","deep-learning","deep-learning-algorithms","interactive"],"created_at":"2024-08-01T13:01:37.843Z","updated_at":"2025-05-15T07:04:12.087Z","avatar_url":"https://github.com/junyanz.png","language":"Python","readme":"\n# Interactive Deep Colorization\n\n[Project Page](https://richzhang.github.io/InteractiveColorization) | [Paper](https://arxiv.org/abs/1705.02999) | [Demo Video](https://youtu.be/eL5ilZgM89Q) | [SIGGRAPH Talk](https://www.youtube.com/watch?v=rp5LUSbdsys)\n\u003cimg src='imgs/demo.gif' width=600\u003e  \n\n\n\u003cb\u003e04/10/2020 Update\u003c/b\u003e: [@mabdelhack](https://github.com/mabdelhack) provided a Windows installation guide for the PyTorch model in Python 3.6. Check out the Windows [branch](https://github.com/junyanz/interactive-deep-colorization/tree/windows) for the guide.\n\n\u003cb\u003e10/3/2019 Update\u003c/b\u003e: Our technology is also now available in Adobe Photoshop Elements 2020. 
See this [blog](https://helpx.adobe.com/photoshop-elements/using/colorize-photo.html) and [video](https://www.youtube.com/watch?v=tmXg4N4YlJg) for more details.\n\n\n\u003cb\u003e9/3/2018 Update\u003c/b\u003e: The code now supports a backend PyTorch model (with PyTorch 0.5.0+). Please find the Local Hints Network training code in the [colorization-pytorch](https://github.com/richzhang/colorization-pytorch) repository.\n\nReal-Time User-Guided Image Colorization with Learned Deep Priors.  \n[Richard Zhang](https://richzhang.github.io/)\\*, [Jun-Yan Zhu](https://www.cs.cmu.edu/~junyanz/)\\*, [Phillip Isola](http://people.eecs.berkeley.edu/~isola/), [Xinyang Geng](http://young-geng.xyz/), Angela S. Lin, Tianhe Yu, and [Alexei A. Efros](https://people.eecs.berkeley.edu/~efros/).  \nIn ACM Transactions on Graphics (SIGGRAPH 2017).  \n(\\*indicates equal contribution)\n\nWe first describe the system \u003cb\u003e(0) Prerequisites\u003c/b\u003e and steps for \u003cb\u003e(1) Getting Started\u003c/b\u003e. We then describe the interactive colorization demo \u003cb\u003e(2) Interactive Colorization (Local Hints Network)\u003c/b\u003e. There are two demos: (a) a \"barebones\" version in an iPython notebook and (b) the full GUI we used in our paper. 
We then provide an example of the \u003cb\u003e(3) Global Hints Network\u003c/b\u003e.\n\n\n\u003cimg src='https://richzhang.github.io/InteractiveColorization/index_files/imagenet_showcase_small.jpg' width=800\u003e\n\n### (0) Prerequisites\n- Linux or OSX\n- [Caffe](http://caffe.berkeleyvision.org/installation.html) or PyTorch\n- CPU or NVIDIA GPU + CUDA + cuDNN.\n\n### (1) Getting Started\n- Clone this repo:\n```bash\ngit clone https://github.com/junyanz/interactive-deep-colorization ideepcolor\ncd ideepcolor\n```\n\n- Download the reference model:\n```\nbash ./models/fetch_models.sh\n```\n\n- Install [Caffe](http://caffe.berkeleyvision.org/installation.html) or [PyTorch](https://pytorch.org/) and 3rd-party Python libraries ([OpenCV](http://opencv.org/), [scikit-learn](http://scikit-learn.org/stable/install.html), and [scikit-image](https://github.com/scikit-image/scikit-image)). See the [Requirements](#Requirements) for more details.\n\n### (2) Interactive Colorization (Local Hints Network)\n\u003cimg src='imgs/teaser_v3.jpg' width=800\u003e\n\nWe provide a \"barebones\" demo in an iPython notebook, which does not require Qt. We also provide our full GUI demo.\n\n#### (2a) Barebones Interactive Colorization Demo\n\n- Run `ipython notebook` and click on [`DemoInteractiveColorization.ipynb`](./DemoInteractiveColorization.ipynb).\n\nIf you need to convert the notebook to an older version, use `jupyter nbconvert --to notebook --nbformat 3 ./DemoInteractiveColorization.ipynb`.\n\n#### (2b) Full Demo GUI\n\n- Install [Qt5](https://doc.qt.io/qt-5/gettingstarted.html) and [QDarkStyle](https://github.com/ColinDuquesnoy/QDarkStyleSheet). (See [Installation](https://github.com/junyanz/interactive-deep-colorization#installation).)\n\n- Run the UI: `python ideepcolor.py --gpu [GPU_ID] --backend [CAFFE OR PYTORCH]`. 
Arguments are described below:\n```\n--win_size    [512] GUI window size\n--gpu         [0] GPU number\n--image_file  ['./test_imgs/mortar_pestle.jpg'] path to the image file\n--backend     ['caffe'] either 'caffe' or 'pytorch'; 'caffe' is the official model from SIGGRAPH 2017, and 'pytorch' uses the same weights, converted\n```\n\n- User interactions\n\n\u003cimg src='./imgs/pad.jpg' width=800\u003e\n\n- \u003cb\u003eAdding points\u003c/b\u003e: Left-click somewhere on the input pad.\n- \u003cb\u003eMoving points\u003c/b\u003e: Left-click and hold on a point on the input pad, drag it to the desired location, and release.\n- \u003cb\u003eChanging colors\u003c/b\u003e: For the currently selected point, choose a recommended color (middle-left) or a color on the ab color gamut (top-left).\n- \u003cb\u003eRemoving points\u003c/b\u003e: Right-click on a point on the input pad.\n- \u003cb\u003eChanging patch size\u003c/b\u003e: The mouse wheel changes the patch size from 1x1 to 9x9.\n- \u003cb\u003eLoad image\u003c/b\u003e: Click the load image button and choose the desired image.\n- \u003cb\u003eRestart\u003c/b\u003e: Click on the restart button. All points on the pad will be removed.\n- \u003cb\u003eSave result\u003c/b\u003e: Click on the save button. This saves the resulting colorization in the directory containing the `image_file`, along with the user-input ab values.\n- \u003cb\u003eQuit\u003c/b\u003e: Click on the quit button.\n\n### (3) Global Hints Network\n\u003cimg src='https://richzhang.github.io/InteractiveColorization/index_files/lab_all_figures45k_small.jpg' width=800\u003e\n\nWe include an example usage of our Global Hints Network, applied to global histogram transfer, in an iPython notebook.\n\n- Add `./caffe_files` to your `PYTHONPATH`.\n\n- Run `ipython notebook`. Click on [`./DemoGlobalHistogramTransfer.ipynb`](./DemoGlobalHistogramTransfer.ipynb).\n\n### Installation\n- Install Caffe or PyTorch. The Caffe model is official. 
PyTorch is a reimplementation.\n\n  - Install Caffe: see the Caffe [installation](http://caffe.berkeleyvision.org/installation.html) and Ubuntu installation [document](http://caffe.berkeleyvision.org/install_apt.html). Please compile Caffe with Python layer [support](https://chrischoy.github.io/research/caffe-python-layer/) (set `WITH_PYTHON_LAYER=1` in `Makefile.config`) and build the Caffe Python library with `make pycaffe`.\n\n  You also need to add `pycaffe` to your `PYTHONPATH`. Use `vi ~/.bashrc` to edit the environment variables.\n  ```bash\n  PYTHONPATH=/path/to/caffe/python:$PYTHONPATH\n  LD_LIBRARY_PATH=/path/to/caffe/build/lib:$LD_LIBRARY_PATH\n  ```\n\n  - Install PyTorch: see the PyTorch [installation](https://pytorch.org/) guide.\n\n- Install the scikit-image, scikit-learn, opencv, Qt5, and QDarkStyle packages:\n```bash\n# ./install/install_deps.sh\nsudo pip install scikit-image\nsudo pip install scikit-learn\nsudo apt-get install python-opencv\nsudo apt-get install qt5-default\nsudo pip install qdarkstyle\n```\nFor Conda users, type the following commands (this may work for full Anaconda but not Miniconda):\n```bash\n# ./install/install_conda.sh\nconda install -c anaconda protobuf  ## protobuf\nconda install -c anaconda scikit-learn=0.19.1 ## scikit-learn\nconda install -c anaconda scikit-image=0.13.0  ## scikit-image\nconda install -c menpo opencv=2.4.11   ## opencv\nconda install -c anaconda qt ## qt5\nconda install -c auto qdarkstyle  ## qdarkstyle\n```\n\nFor Docker users, please follow the Docker [document](https://github.com/junyanz/interactive-deep-colorization/tree/master/docker). 
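The color hints the GUI collects are ab values in CIELAB space: L carries the lightness of the grayscale input, while a and b carry the chroma chosen on the ab gamut. As a rough illustrative sketch of what a hint encodes, and not the repository's model or API, the snippet below uses scikit-image (one of the dependencies listed above) to set ab values on a patch of a gray image; the image, patch location, and ab values are all arbitrary examples:

```python
import numpy as np
from skimage import color

# A synthetic mid-gray 64x64 RGB image in [0, 1] stands in for the grayscale input.
gray = np.full((64, 64, 3), 0.5, dtype=np.float64)

lab = color.rgb2lab(gray)      # L in [0, 100]; a and b are ~0 for a gray image

# A user "hint" on a patch: keep L (lightness) and set the ab chroma values.
# In the GUI these come from clicks on the ab gamut; the values here are arbitrary.
lab[20:29, 20:29, 1] = 40.0    # a: push toward red
lab[20:29, 20:29, 2] = 30.0    # b: push toward yellow

rgb = color.lab2rgb(lab)       # back to RGB; the hinted patch is now colored
```

The network's job, of course, is to propagate such sparse hints plausibly across the whole image rather than color a hard-edged patch as this sketch does.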
\n\n- **Docker**: [[OSX Docker file](https://hub.docker.com/r/vbisbest/ideepcolor_osx/)] and [[OSX Installation video](https://www.youtube.com/watch?v=IORcb4lQlxQ)] by @vbisbest, [[Docker file 2](https://hub.docker.com/r/swallner/ideepcolor/)] (by @sabrinawallner) based on [DL Docker](https://github.com/floydhub/dl-docker).\n\n- More installation [help](https://github.com/junyanz/interactive-deep-colorization/issues/10) (by @SleepProgger).\n\n### Training\n\nPlease find a PyTorch reimplementation of the Local Hints Network training code in the [colorization-pytorch](https://github.com/richzhang/colorization-pytorch) repository.\n\n### Citation\nIf you use this code for your research, please cite our paper:\n```\n@article{zhang2017real,\n  title={Real-Time User-Guided Image Colorization with Learned Deep Priors},\n  author={Zhang, Richard and Zhu, Jun-Yan and Isola, Phillip and Geng, Xinyang and Lin, Angela S and Yu, Tianhe and Efros, Alexei A},\n  journal={ACM Transactions on Graphics (TOG)},\n  volume={36},\n  number={4},\n  year={2017},\n  publisher={ACM}\n}\n```\n\n### Cat Paper Collection\nOne of the authors objects to the inclusion of this list, due to an allergy. Another author objects on the basis that cats are silly creatures and this is a serious, scientific paper. However, if you love cats, and love reading cool graphics, vision, and learning papers, please check out the Cat Paper Collection: [[Github]](https://github.com/junyanz/CatPapers) [[Webpage]](https://www.cs.cmu.edu/~junyanz/cat/cat_papers.html)\n","funding_links":[],"categories":["Python","Jupyter Notebook","Colorization"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fjunyanz%2Finteractive-deep-colorization","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fjunyanz%2Finteractive-deep-colorization","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fjunyanz%2Finteractive-deep-colorization/lists"}