{"id":13488125,"url":"https://github.com/astra-vision/MaterialPalette","last_synced_at":"2025-03-27T23:32:54.943Z","repository":{"id":197872892,"uuid":"698860224","full_name":"astra-vision/MaterialPalette","owner":"astra-vision","description":"[CVPR 2024] Official repository of \"Material Palette: Extraction of Materials from a Single Real-world Image\"","archived":false,"fork":false,"pushed_at":"2024-06-18T07:27:53.000Z","size":531636,"stargazers_count":229,"open_issues_count":1,"forks_count":10,"subscribers_count":22,"default_branch":"master","last_synced_at":"2024-10-30T23:36:25.722Z","etag":null,"topics":["albedo","computer-vision","cvpr","cvpr2024","generative-ai","material","normal","roughness","stable-diffusion"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/astra-vision.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-10-01T07:35:37.000Z","updated_at":"2024-10-26T06:35:21.000Z","dependencies_parsed_at":"2023-11-29T10:26:13.235Z","dependency_job_id":"7d7d6665-f2db-472b-b7f2-9139fe46fcfb","html_url":"https://github.com/astra-vision/MaterialPalette","commit_stats":null,"previous_names":["astra-vision/materialpalette"],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astra-vision%2FMaterialPalette","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astra-vision%2FMaterialPalette/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/a
stra-vision%2FMaterialPalette/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/astra-vision%2FMaterialPalette/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/astra-vision","download_url":"https://codeload.github.com/astra-vision/MaterialPalette/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":245944037,"owners_count":20697946,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["albedo","computer-vision","cvpr","cvpr2024","generative-ai","material","normal","roughness","stable-diffusion"],"created_at":"2024-07-31T18:01:10.023Z","updated_at":"2025-03-27T23:32:49.905Z","avatar_url":"https://github.com/astra-vision.png","language":"Python","readme":"\u003cdiv align='center'\u003e\n\n## Material Palette: Extraction of Materials from a Single Image (CVPR 2024)\n\n\u003cdiv\u003e\n  \u003ca href=\"https://wonjunior.github.io/\"\u003eIvan Lopes\u003c/a\u003e\u003csup\u003e1\u003c/sup\u003e\u0026nbsp;\u0026nbsp;\n  \u003ca href=\"https://fabvio.github.io/\"\u003eFabio Pizzati\u003c/a\u003e\u003csup\u003e2\u003c/sup\u003e\u0026nbsp;\u0026nbsp;\n  \u003ca href=\"https://team.inria.fr/rits/membres/raoul-de-charette/\"\u003eRaoul de Charette\u003c/a\u003e\u003csup\u003e1\u003c/sup\u003e\n  \u003cbr\u003e\n  \u003csup\u003e1\u003c/sup\u003e Inria,\n  \u003csup\u003e2\u003c/sup\u003e Oxford Uni.\n\u003c/div\u003e\n\u003cbr\u003e\n\n[![Project 
page](https://img.shields.io/badge/🚀_Project_Page-_-darkgreen?style=flat-square)](https://astra-vision.github.io/MaterialPalette/)\n[![paper](https://img.shields.io/badge/paper-_-darkgreen?style=flat-square)](https://github.com/astra-vision/MaterialPalette/releases/download/preprint/material_palette.pdf)\n[![cvf](https://img.shields.io/badge/CVF-_-darkgreen?style=flat-square)](https://openaccess.thecvf.com/content/CVPR2024/html/Lopes_Material_Palette_Extraction_of_Materials_from_a_Single_Image_CVPR_2024_paper.html)\n[![dataset](https://img.shields.io/badge/🤗_dataset--darkgreen?style=flat-square)](https://huggingface.co/datasets/ilopes/texsd)\n[![star](https://img.shields.io/badge/⭐_star--darkgreen?style=flat-square)](https://github.com/astra-vision/MaterialPalette/stargazers)\n\u003c!--[![arXiv](https://img.shields.io/badge/arXiv-_-darkgreen?style=flat-square\u0026logo=arxiv)](https://arxiv.org/abs/2311.17060)--\u003e\n\n\n\u003cb\u003eTL;DR:\u003c/b\u003e Material Palette extracts a palette of PBR materials - \u003cbr\u003ealbedo, normals, and roughness - from a single real-world image.\n\n\u003c/div\u003e\n\nhttps://github.com/astra-vision/MaterialPalette/assets/30524163/44e45e58-7c7d-49a3-8b6e-ec6b99cf9c62\n\n\n\u003c!--ts--\u003e\n* [Overview](#overview)\n* [1. Installation](#1-installation)\n* [2. Quick Start](#2-quick-start)\n  * [Generation](#-generation)\n  * [Complete Pipeline](#-complete-pipeline)\n* [3. Project Structure](#3-project-structure)\n* [4. (optional) Training](#4-optional-training)\n* [Acknowledgments](#acknowledgments)\n* [License](#license)\n\u003c!--te--\u003e\n\n## 🚨 Todo\n\n- 3D rendering script.\n\n## Overview\n\nThis is the official repository of [**Material Palette**](https://astra-vision.github.io/MaterialPalette/). 
In a nutshell, the method works in three stages: first, concepts are extracted from an input image based on a user-provided mask; then, those concepts are used to generate texture images; finally, the generations are decomposed into SVBRDF maps (albedo, normals, and roughness). Visit our project page or consult our paper for more details!\n\n![pipeline](https://github.com/astra-vision/MaterialPalette/assets/30524163/be03b0ca-bee2-4fc7-bebd-9519c3c4947d)\n\n**Content**: This repository supports extracting texture concepts from an input image and a set of region masks. It also supports generation at multiple resolutions. Finally, it provides a decomposition step based on our decomposition model, for which we share pre-trained weights.\n\n\u003e [!TIP]\n\u003e We provide a [\"Quick Start\"](#2-quick-start) section: before diving straight into the full pipeline, we share four pretrained concepts ⚡ so you can go ahead and experiment with the texture generation step of the method: see [\"§ Generation\"](#-generation). Then you can try out the full method with your own image and masks (concept learning + generation + decomposition): see [\"§ Complete Pipeline\"](#-complete-pipeline).\n\n\n## 1. Installation\n\n 1. Download the source code with git:\n    ```\n    git clone https://github.com/astra-vision/MaterialPalette.git\n    ```\n    The repo can also be downloaded as a zip [here](https://github.com/astra-vision/MaterialPalette/archive/refs/heads/master.zip).\n\n 2. Create a conda environment with the dependencies:\n    ```\n    conda env create --verbose -f deps.yml\n    ```\n    This repo was tested with [**Python**](https://www.python.org/doc/versions/) 3.10.8, [**PyTorch**](https://pytorch.org/get-started/previous-versions/) 1.13, [**diffusers**](https://huggingface.co/docs/diffusers/installation) 0.19.3, [**peft**](https://huggingface.co/docs/peft/en/install) 0.5, and [**PyTorch Lightning**](https://lightning.ai/docs/pytorch/stable/past_versions.html) 1.8.3.\n\n 3. 
Load the conda environment:\n    ```\n    conda activate matpal\n    ```\n\n 4. If you are looking to perform decomposition, download our pre-trained model and untar the archive:\n    ```\n    wget https://github.com/astra-vision/MaterialPalette/releases/download/weights/model.tar.gz\n    tar -xzvf model.tar.gz\n    ```\n    \u003csup\u003eThis is not required if you are only looking to perform texture extraction.\u003c/sup\u003e\n\n\u003c!--\nIn case you want to retrain the source model, you can download the AmbientCG samples using the following command (`outdir` is the directory where the dataset will be downloaded to):\n```\npython capture/data/download.py outdir\n```--\u003e\n\n## 2. Quick Start\n\nHere are instructions to get you started using **Material Palette**. First, we provide some optimized concepts so you can experiment with the generation pipeline. We also show how to run the method on user-selected images and masks (concept learning + generation + decomposition).\n\n### § Generation\n\n| Input image | 1K | 2K | 4K | 8K | ⬇️ LoRA ~8Kb\n| :-: | :-: | :-: | :-: | :-: | :-: |\n| \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/ba3126d7-ce54-4895-8d59-93f1fd22e7d6\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/e1ec9c9e-d618-4314-82a3-2ac2432af668\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/d960a216-5558-4375-9bf2-5a648221aa55\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/45ad2ca9-8be7-48ba-b368-5528ae021627\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/c9140b16-a59f-4898-b49f-5c3635a3ea85\" alt=\"J\" width=\"100\"/\u003e | [![x](https://img.shields.io/badge/-⚡blue_tiles.zip-black)](https://github.com/astra-vision/MaterialPalette/files/14601640/blue_tiles.zip)\n| \u003cimg 
src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/f5838959-aeeb-417a-8030-0fab5e39443b\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/4b756fae-3ea6-4d40-b4e6-0a8c50674e14\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/91aefd19-0985-4b84-81a2-152eb16b87e0\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/c9547e54-7bac-4f3d-8d94-acafd61847d9\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/069d639b-71bc-4f67-a735-a3b44d7bc683\" alt=\"J\" width=\"100\"/\u003e | [![x](https://img.shields.io/badge/-⚡cat_fur.zip-black)](https://github.com/astra-vision/MaterialPalette/files/14601641/cat_fur.zip)\n| \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/b16bc25f-e5c5-45ad-bf3b-ef28cb57ed30\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/0ae31915-7bc5-4177-8b84-6988cccc2c24\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/e501c66d-a5b7-42e4-9ec2-0a12898280ed\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/290b685a-554c-4c62-ab0d-9d66a2945f09\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/378be48d-61e5-4a8a-b2cd-1002aec541bf\" alt=\"J\" width=\"100\"/\u003e | [![x](https://img.shields.io/badge/-⚡damaged.zip-black)](https://github.com/astra-vision/MaterialPalette/files/14601642/damaged.zip)\n| \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/3c69d0c0-d91a-4d19-b0c0-b9dceb4477cf\" alt=\"J\" width=\"100\"/\u003e | \u003cimg 
src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/ec6c62ea-00f7-4284-8cc3-6604159a3b5f\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/26c6ad3d-2306-4ad3-97a7-6713d5f4e5ee\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/94f7caa1-3ade-4b62-b0c6-b758a3a05d3f\" alt=\"J\" width=\"100\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/36630e65-9a2f-4a77-bb1b-0214d5f1b6f9\" alt=\"J\" width=\"100\"/\u003e | [![x](https://img.shields.io/badge/-⚡ivy_bricks.zip-black)](https://github.com/astra-vision/MaterialPalette/files/14601643/ivy_bricks.zip)\n\n\u003csup\u003eAll generations were downscaled for memory constraints.\u003c/sup\u003e\n\n\nGo ahead and download one of the above LoRA concept checkpoints, for example \"blue_tiles\":\n\n```\nwget https://github.com/astra-vision/MaterialPalette/files/14601640/blue_tiles.zip\nunzip blue_tiles.zip\n```\nTo generate from a checkpoint, use the [`concept`](./concept/) module either via the command line interface or the functional interface in Python:\n- ![](https://img.shields.io/badge/$-command_line-white?style=flat-square)\n  ```\n  python concept/infer.py path/to/LoRA/checkpoint\n  ```\n- ![](https://img.shields.io/badge/-python-white?style=flat-square\u0026logo=python)\n  ```\n  import concept\n  concept.infer(path_to_LoRA_checkpoint)\n  ```\n\nResults will be placed relative to the checkpoint directory in an `outputs` folder.\n\nYou have control over the following parameters:\n- `stitch_mode`: concatenation, average, or weighted average (*default*);\n- `resolution`: the output resolution of the generated texture;\n- `prompt`: one of the four prompt templates:\n  - `\"p1\"`: `\"top view realistic texture of S*\"`,\n  - `\"p2\"`: `\"top view realistic S* texture\"`,\n  - `\"p3\"`: `\"high resolution realistic S* texture in top view\"`,\n  - 
`\"p4\"`: `\"realistic S* texture in top view\"`;\n- `seed`: inference seed when sampling noise;\n- `renorm`: whether to renormalize the generated samples based on the input image (this option can only be used when called from inside the pipeline, *i.e.* when the input image is available);\n- `num_inference_steps`: number of denoising steps.\n\n\u003csup\u003eA complete list of parameters can be viewed with `python concept/infer.py --help`.\u003c/sup\u003e\n\n\n### § Complete Pipeline\n\nWe provide an example (the input image with user masks used for the pipeline figure). You can download it here: [**mansion.zip**](https://github.com/astra-vision/MaterialPalette/files/14619163/mansion.zip) (photograph credit: [Max Rahubovskiy](https://www.pexels.com/@heyho/)).\n\nTo help you get started with your own images, follow this simple data structure: one folder per inverted image; inside it, place the input image (`.jpg`, `.jpeg`, or `.png`) and a subdirectory named `masks` containing the different region masks as `.png` files (these **must all have the same aspect ratio** as the RGB image). 
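Before launching the pipeline on your own data, it can help to sanity-check a folder against this layout. The following is a minimal sketch (the `check_layout` helper is hypothetical, not part of the repository); it only verifies the file layout and skips the aspect-ratio comparison, which would additionally need an image library such as Pillow:

```python
from pathlib import Path

def check_layout(folder):
    # Hypothetical helper: verify there is exactly one input image at the
    # root and a masks/ subdirectory holding at least one .png region mask.
    folder = Path(folder)
    images = [p for p in folder.iterdir()
              if p.suffix.lower() in ('.jpg', '.jpeg', '.png')]
    if len(images) != 1:
        raise ValueError('expected exactly one input image, found %d' % len(images))
    masks = sorted((folder / 'masks').glob('*.png'))
    if not masks:
        raise ValueError('masks/ must contain at least one .png mask')
    return images[0], masks
```

For the mansion example, this would return the `mansion.jpg` path together with the three mask paths.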
Here is an overview of our mansion example:\n```\n├── masks/\n│ ├── wood.png\n│ ├── grass.png\n│ └── stone.png\n└── mansion.jpg\n```\n\n|region|mask|overlay|generation|albedo|normals|roughness|\n|:--:|:--:|:--:|:--:|:--:|:--:|:--:|\n|![#6C8EBF](https://placehold.co/15x15/6C8EBF/6C8EBF.png) | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/23d422e4-6d69-4dd5-a823-44b284b1589d\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/84601e24-74d3-4da0-96e2-a3554f3481b4\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/918b2f5a-e975-444c-8a1b-523df9492eab\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/fa367b0c-5e22-4148-b785-23d147faead0\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/444a63b4-3eea-47de-9dac-1e9b122453a7\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/3a551001-249d-41ab-9e7a-a990950a8632\" alt=\"J\" height=\"85\"/\u003e|\n|![#EDB01A](https://placehold.co/15x15/EDB01A/EDB01A.png) | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/712887ef-d235-433b-9e95-5bb58c5d96ee\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/56b8d10f-041f-414b-ba2c-6ea08cbdb2c2\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/f8f9e7b3-e2f9-4603-823a-9f79cfe8d2a9\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/ede74224-80ae-4fe3-8eee-aae53935cc0e\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/3378dd1f-8801-47e8-a570-57e908f21e4d\" alt=\"J\" 
height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/77b31a23-5430-4221-a28a-c03e9099d45c\" alt=\"J\" height=\"85\"/\u003e|\n|![#AA4A44](https://placehold.co/15x15/AA4A44/AA4A44.png) | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/ee22ad46-3f63-460a-ab8a-8b071cfd2b75\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/90a9904c-db25-4fec-a60c-46dcadf8de59\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/fa0bad8d-1d9e-4019-9a99-14be732612b3\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/c1569bee-d387-4b60-b12b-9aadaf693dcc\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/d10bb074-a305-44c7-8536-a29b761ad14d\" alt=\"J\" height=\"85\"/\u003e|\u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/9ebed2ac-408d-4ad8-8ec4-99344e9ad85f\" alt=\"J\" height=\"85\"/\u003e|\n\n\u003c!-- | Input image | mask 1 | mask 2 | mask 3 | mask 4 |\n| :-: | :-: | :-: | :-: | :-: |\n| `bricks.jpg` | `runningbond.png` | `herringbone.png` | `basketweave.png` | `stonewall.png` |\n| \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/55e89f01-81b5-4916-a817-c430eb70b12c\" alt=\"J\" width=\"150\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/0afa522a-207e-4762-b9fb-823736776458\" alt=\"J\" width=\"150\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/a90f5e0f-966f-475e-8e0d-dbd331960a5e\" alt=\"J\" width=\"150\"/\u003e | \u003cimg src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/182f4f15-b6aa-4888-87a7-59dc941d4688\" alt=\"J\" width=\"150\"/\u003e | \u003cimg 
src=\"https://github.com/astra-vision/MaterialPalette/assets/30524163/f68419e6-62ae-433d-9178-5e3e41417893\" alt=\"J\" width=\"150\"/\u003e --\u003e\n\n\nTo invert and generate textures from a folder, use [`pipeline.py`](./pipeline.py):\n\n- ![](https://img.shields.io/badge/$-command_line-white?style=flat-square)\n  ```\n  python pipeline.py path/to/folder\n  ```\n\nUnder the hood, it uses two modules:\n1. [`concept`](./concept), to extract and generate the texture ([`concept.crop`](./concept/crop.py), [`concept.invert`](./concept/invert.py), and [`concept.infer`](./concept/infer.py));\n2. [`capture`](./capture/), to perform the BRDF decomposition.\n\nA minimal example is provided here:\n\n- ![](https://img.shields.io/badge/-python-white?style=flat-square\u0026logo=python)\n  ```\n  import concept\n  import capture\n  from pytorch_lightning import Trainer\n\n  ## Extract square crops from image for each of the binary masks located in \u003cpath\u003e/masks\n  regions = concept.crop(args.path)\n\n  ## Iterate through regions to invert the concept and generate texture views\n  for region in regions.iterdir():\n      lora = concept.invert(region)\n      concept.infer(lora, renorm=True)\n\n  ## Construct a dataset with all generations and load pretrained decomposition model\n  data = capture.get_data(predict_dir=args.path, predict_ds='sd')\n  module = capture.get_inference_module(pt='model.ckpt')\n\n  ## Proceed with inference on decomposition model\n  decomp = Trainer(default_root_dir=args.path, accelerator='gpu', devices=1, precision=16)\n  decomp.predict(module, data)\n  ```\n\u003csup\u003eTo view options available for the concept learning, use ``PYTHONPATH=. python concept/invert.py --help``\u003c/sup\u003e\n\n\u003e [!IMPORTANT]\n\u003e By default, both `train_text_encoder` and `gradient_checkpointing` are set to `True`. Also, this implementation does not include the `LPIPS` filter/ranking of the generations. The code will only output a single sample per region. 
You may experiment with different prompts and parameters (see the [\"Generation\"](#-generation) section).\n\n## 3. Project Structure\n\nThe [`pipeline.py`](./pipeline.py) file is the entry point for running the whole pipeline on a folder containing the input image at its root and a `masks/` sub-directory containing all user-defined masks. The [`train.py`](./train.py) file is used to train the decomposition model. The most important files are shown here:\n```\n.\n├── capture/        % Module for decomposition\n│ ├── callbacks/    % Lightning trainer callbacks\n│ ├── data/         % Dataset, subsets, Lightning datamodules\n│ ├── render/       % 2D physics-based renderer\n│ ├── utils/        % Utility functions\n│ └── source/       % Network, loss, and LightningModule\n│   └── routine.py  % Training loop\n│\n└── concept/        % Module for inversion and texture generation\n  ├── crop.py       % Square crop extraction from image and masks\n  ├── invert.py     % Optimization code to learn the concept S*\n  └── infer.py      % Inference code to generate texture from S*\n```\nIf you have any questions, post via the [*issues tracker*](https://github.com/astra-vision/MaterialPalette/issues) or contact the corresponding author.\n\n## 4. (optional) Training\n\nWe provide the pre-trained decomposition weights (see [\"Installation\"](#1-installation)). However, if you are looking to retrain the domain-adaptive model for your own purposes, we provide the code to do so. Our method relies on jointly training a multi-task network on labeled (real) and unlabeled (synthetic) images. 
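The joint training just described boils down to pairing a labeled batch with an unlabeled one at every optimization step. Here is a toy sketch of such batch interleaving (illustrative only; the repository's own mixing logic lives in its Lightning datamodules under `capture/data/`):

```python
from itertools import cycle

def joint_batches(labeled, unlabeled):
    # Pair each labeled batch with an unlabeled one, cycling the
    # unlabeled stream so both domains appear at every step.
    for lab, unlab in zip(labeled, cycle(unlabeled)):
        yield lab, unlab

# Toy streams standing in for the two dataloaders.
steps = list(joint_batches(['real_0', 'real_1', 'real_2'],
                           ['synth_0', 'synth_1']))
```

Each step then computes the supervised loss on the labeled half and an unsupervised (adaptation) objective on the unlabeled half.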
In case you wish to retrain on the same datasets, you will have to download both the ***AmbientCG*** and ***TexSD*** datasets.\n\nFirst, download the PBR materials (source) dataset from [AmbientCG](https://ambientcg.com/):\n```\npython capture/data/download.py path/to/target/directory\n```\n\nTo run the training script, use:\n```\npython train.py --config=path/to/yml/config\n```\n\n\u003csup\u003eAdditional options can be found with `python train.py --help`.\u003c/sup\u003e\n\n\u003e [!NOTE]\n\u003e The decomposition model estimates the pixel-wise BRDF maps from a single texture image input.\n\n## Acknowledgments\nThis research project was mainly funded by the French Agence Nationale de la Recherche (ANR) as part of project SIGHT (ANR-20-CE23-0016). Fabio Pizzati was partially funded by KAUST (Grant DFR07910). Results were obtained using HPC resources from GENCI-IDRIS (Grant 2023-AD011014389).\n\nThe repository contains code taken from [`PEFT`](https://github.com/huggingface/peft), [`SVBRDF-Estimation`](https://github.com/mworchel/svbrdf-estimation/tree/master), and [`DenseMTL`](https://github.com/astra-vision/DenseMTL). For visualization, we used [`DeepBump`](https://github.com/HugoTini/DeepBump) and [**Blender**](https://www.blender.org/). Credit to Runway for providing the [`stable-diffusion-v1-5`](https://huggingface.co/runwayml/stable-diffusion-v1-5) model weights. All images and 3D scenes used in this work have permissive licenses. 
Special credit to [**AmbientCG**](https://ambientcg.com/list) for their extensive work.\n\nThe authors would also like to thank all members of the [Astra-Vision](https://astra-vision.github.io/) team for their valuable feedback.\n\n## License\nIf you find this code useful, please cite our paper:\n```\n@inproceedings{lopes2024material,\n    author = {Lopes, Ivan and Pizzati, Fabio and de Charette, Raoul},\n    title = {Material Palette: Extraction of Materials from a Single Image},\n    booktitle = {CVPR},\n    year = {2024},\n    project = {https://astra-vision.github.io/MaterialPalette/}\n}\n```\n**Material Palette** is released under the [MIT License](./LICENSE).\n\n---\n\n[🢁 jump to top](#material-palette-extraction-of-materials-from-a-single-image-cvpr-2024)\n","funding_links":[],"categories":["New Concept Learning"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fastra-vision%2FMaterialPalette","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fastra-vision%2FMaterialPalette","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fastra-vision%2FMaterialPalette/lists"}