{"id":13513826,"url":"https://threedle.github.io/text2mesh/","last_synced_at":"2025-03-31T02:33:05.323Z","repository":{"id":39535187,"uuid":"430211159","full_name":"threedle/text2mesh","owner":"threedle","description":"3D mesh stylization driven by a text input in PyTorch","archived":false,"fork":false,"pushed_at":"2024-05-19T16:19:52.000Z","size":2018913,"stargazers_count":951,"open_issues_count":22,"forks_count":135,"subscribers_count":22,"default_branch":"main","last_synced_at":"2025-03-20T09:39:55.936Z","etag":null,"topics":["3d","computer-graphics","differentiable-rendering","geometry-processing","mesh-generation","meshes","neural-fields","neural-style","pytorch"],"latest_commit_sha":null,"homepage":"https://threedle.github.io/text2mesh/","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/threedle.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-11-20T21:17:52.000Z","updated_at":"2025-03-18T11:45:24.000Z","dependencies_parsed_at":"2024-10-27T23:56:42.208Z","dependency_job_id":null,"html_url":"https://github.com/threedle/text2mesh","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/threedle%2Ftext2mesh","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/threedle%2Ftext2mesh/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/threedle%2Ftext2mesh/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositori
es/threedle%2Ftext2mesh/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/threedle","download_url":"https://codeload.github.com/threedle/text2mesh/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246407399,"owners_count":20772126,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["3d","computer-graphics","differentiable-rendering","geometry-processing","mesh-generation","meshes","neural-fields","neural-style","pytorch"],"created_at":"2024-08-01T05:00:38.359Z","updated_at":"2025-03-31T02:33:00.314Z","avatar_url":"https://github.com/threedle.png","language":"Jupyter Notebook","readme":"# Text2Mesh [[Project Page](https://threedle.github.io/text2mesh/)]\n[![arXiv](https://img.shields.io/badge/arXiv-Text2Mesh-b31b1b.svg)](https://arxiv.org/abs/2112.03221)\n![Pytorch](https://img.shields.io/badge/PyTorch-\u003e=1.9.0-Red?logo=pytorch)\n![crochet candle](images/vases.gif)\n**Text2Mesh** is a method for text-driven stylization of a 3D mesh, as described in \"Text2Mesh: Text-Driven Neural Stylization for Meshes\" (CVPR 2022).\n\n## Getting Started\n### Installation\n\n**Note:** The installation below will fail if run on a machine without a CUDA GPU.\n```\nconda env create --file text2mesh.yml\nconda activate text2mesh\n```\nIf installing kaolin fails with an error such as `nvcc not found`, you may need to set your `CUDA_HOME` environment variable to the CUDA 11.3 folder, i.e. `export CUDA_HOME=/usr/local/cuda-11.3`, and then rerun the installation.
\n\n### System Requirements\n- Python 3.7\n- CUDA 11\n- GPU with a minimum of 8 GB RAM\n\n### Run examples\nRun the shell scripts below to generate example styles.\n```bash\n# cobblestone alien\n./demo/run_alien_cobble.sh\n# shoe made of cactus \n./demo/run_shoe.sh\n# lamp made of brick\n./demo/run_lamp.sh\n# ...\n```\nThe outputs will be saved to `results/demo`, including the stylized .obj files, colored and uncolored render views, and screenshots taken during training.\n\n#### Outputs\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"alien\" height=\"135\" src=\"images/alien.png\" width=\"240\"/\u003e\n\u003cimg alt=\"alien geometry\" height=\"135\" src=\"images/alien_cobble_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"alien style\" height=\"135\" src=\"images/alien_cobble_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"alien\" height=\"135\" src=\"images/alien.png\" width=\"240\"/\u003e\n\u003cimg alt=\"alien geometry\" height=\"135\" src=\"images/alien_wood_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"alien style\" height=\"135\" src=\"images/alien_wood_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"candle\" height=\"135\" src=\"images/candle.png\" width=\"240\"/\u003e\n\u003cimg alt=\"candle geometry\" height=\"135\" src=\"images/candle_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"candle style\" height=\"135\" src=\"images/candle_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"person\" height=\"135\" src=\"images/person.png\" width=\"240\"/\u003e\n\u003cimg alt=\"ninja geometry\" height=\"135\" src=\"images/ninja_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"ninja style\" height=\"135\" src=\"images/ninja_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"shoe\" height=\"135\" src=\"images/shoe.png\" width=\"240\"/\u003e\n\u003cimg alt=\"shoe geometry\" 
height=\"135\" src=\"images/shoe_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"shoe style\" height=\"135\" src=\"images/shoe_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"vase\" height=\"135\" src=\"images/vase.png\" width=\"240\"/\u003e\n\u003cimg alt=\"vase geometry\" height=\"135\" src=\"images/vase_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"vase style\" height=\"135\" src=\"images/vase_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"lamp\" height=\"135\" src=\"images/lamp.png\" width=\"240\"/\u003e\n\u003cimg alt=\"lamp geometry\" height=\"135\" src=\"images/lamp_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"lamp style\" height=\"135\" src=\"images/lamp_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n\u003cp float=\"center\"\u003e\n\u003cimg alt=\"horse\" height=\"135\" src=\"images/horse.png\" width=\"240\"/\u003e\n\u003cimg alt=\"horse geometry\" height=\"135\" src=\"images/horse_init.png\" width=\"240\"/\u003e\n\u003cimg alt=\"horse style\" height=\"135\" src=\"images/horse_final.png\" width=\"240\"/\u003e\n\u003c/p\u003e\n\n## Important tips for running on your own meshes\nText2Mesh learns to produce color and displacements over the input mesh vertices. The mesh triangulation effectively defines the resolution for the stylization. Therefore, it is important that the mesh triangles are small enough to accurately portray the color and displacement. If a mesh contains large triangles, the stylization will not have sufficient resolution, leading to low-quality results. For example, the triangles on the seat of the chair below are too large.\n\n\u003cp align=\"center\"\u003e\n\u003cimg alt=\"large-triangles\" src=\"images/large-triangles.png\" height=\"25%\" width=\"25%\" /\u003e\n\u003c/p\u003e\n\nYou should remesh such shapes as a pre-processing step, producing smaller triangles that are uniformly distributed over the surface. 
Our example remeshing script can be run with the following command (the remeshed shape can then be used with Text2Mesh):\n\n```\npython3 remesh.py --obj_path [the mesh's path] --output_path [the full output path]\n```\n\nFor example, to remesh a file named `chair.obj`, run:\n\n```\npython3 remesh.py --obj_path chair.obj --output_path chair-remesh.obj\n```\n\n\n## Other implementations\n[Kaggle Notebook](https://www.kaggle.com/neverix/text2mesh/) (by [neverix](https://www.kaggle.com/neverix))\n\n## External projects using Text2Mesh\n- [Endava 3D Asset Tool](https://www.endava.com/en/blog/Engineering/2022/An-R-D-Project-on-AI-in-3D-Asset-Creation-for-Games) integrates Text2Mesh into their modeling software to create 3D assets for games.\n\n- [Psychedelic Trips Art Gallery](https://www.flickr.com/photos/mcanet/sets/72177720299890759/) uses Text2Mesh to generate AI art and fabricate (3D print) the results.\n\n## Citation\n```\n@InProceedings{Michel_2022_CVPR,\n    author    = {Michel, Oscar and Bar-On, Roi and Liu, Richard and Benaim, Sagie and Hanocka, Rana},\n    title     = {Text2Mesh: Text-Driven Neural Stylization for Meshes},\n    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},\n    month     = {June},\n    year      = {2022},\n    pages     = {13492-13502}\n}\n```\n","funding_links":[],"categories":["Text"],"sub_categories":["Text to Mesh"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/threedle.github.io%2Ftext2mesh%2F","html_url":"https://awesome.ecosyste.ms/projects/threedle.github.io%2Ftext2mesh%2F","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/threedle.github.io%2Ftext2mesh%2F/lists"}