{"id":13745703,"url":"https://github.com/DLR-RM/BlenderProc","last_synced_at":"2025-05-09T06:31:10.526Z","repository":{"id":37735363,"uuid":"214165016","full_name":"DLR-RM/BlenderProc","owner":"DLR-RM","description":"A procedural Blender pipeline for photorealistic training image generation","archived":false,"fork":false,"pushed_at":"2025-04-16T19:06:05.000Z","size":100689,"stargazers_count":3043,"open_issues_count":90,"forks_count":464,"subscribers_count":43,"default_branch":"main","last_synced_at":"2025-05-03T06:01:48.492Z","etag":null,"topics":["3d-engines","3d-front-dataset","3d-graphics","3d-reconstruction","blender","blender-installation","blender-pipeline","camera-positions","camera-sampling","computer-graphics","depth-images","pose-estimation","python","rendering","segmentation","suncg-scene","synthetic","synthetic-data"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/DLR-RM.png","metadata":{"files":{"readme":"README.md","changelog":"change_log.md","contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":"CITATION.cff","codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2019-10-10T11:29:14.000Z","updated_at":"2025-05-02T21:14:35.000Z","dependencies_parsed_at":"2023-02-17T09:00:57.273Z","dependency_job_id":"df3b6292-5bb5-4f3e-9f27-6c71ad96cdf5","html_url":"https://github.com/DLR-RM/BlenderProc","commit_stats":{"total_commits":3647,"total_committers":67,"mean_commits":54.43283582089552,"dds":0.8091582122292296,"last_synced_commit":"0786b980b719e272f736fb32ed27d4112349e5e1"},"previous_names":[],"tags_count":33,"template":false,"template_full_name":null,"repository_
url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DLR-RM%2FBlenderProc","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DLR-RM%2FBlenderProc/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DLR-RM%2FBlenderProc/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DLR-RM%2FBlenderProc/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/DLR-RM","download_url":"https://codeload.github.com/DLR-RM/BlenderProc/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253206033,"owners_count":21871158,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["3d-engines","3d-front-dataset","3d-graphics","3d-reconstruction","blender","blender-installation","blender-pipeline","camera-positions","camera-sampling","computer-graphics","depth-images","pose-estimation","python","rendering","segmentation","suncg-scene","synthetic","synthetic-data"],"created_at":"2024-08-03T06:00:34.963Z","updated_at":"2025-05-09T06:31:05.508Z","avatar_url":"https://github.com/DLR-RM.png","language":"Python","readme":"# BlenderProc2\n\n[![Documentation](https://img.shields.io/badge/documentation-passing-brightgreen.svg)](https://dlr-rm.github.io/BlenderProc/)\n[![Open In Collab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/DLR-RM/BlenderProc/blob/main/examples/basics/basic/basic_example.ipynb)\n[![License: GPL 
v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)\n\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"https://user-images.githubusercontent.com/6104887/137109535-275a2aa3-f5fd-4173-9d16-a9a9b86f66e7.gif\" alt=\"Front readme image\" width=100%\u003e\n\u003c/p\u003e\n\nA procedural Blender pipeline for photorealistic rendering.\n\n[Documentation](https://dlr-rm.github.io/BlenderProc) | [Tutorials](#tutorials) | [Examples](#examples) | [ArXiv paper](https://arxiv.org/abs/1911.01911) | [Workshop paper](https://sim2real.github.io/assets/papers/2020/denninger.pdf) | [JOSS article](https://joss.theoj.org/papers/10.21105/joss.04901)\n\n## Features\n\n* Loading: `*.obj`, `*.ply`, `*.blend`, `*.fbx`, BOP, ShapeNet, Haven, 3D-FRONT, etc.\n* Objects: Set or sample object poses, apply physics and collision checking.\n* Materials: Set or sample physically-based materials and textures.\n* Lighting: Set or sample lights, automatic lighting of 3D-FRONT scenes.\n* Cameras: Set, sample or load camera poses from file.\n* Rendering: RGB, stereo, depth, normal and segmentation images/sequences.\n* Writing: .hdf5 containers, COCO \u0026 BOP annotations.\n\n\n## Installation\n\n### Via pip\n\nThe simplest way to install blenderproc is via pip:\n\n```bash\npip install blenderproc\n```\n\n### Via git\n\nAlternatively, if you need to make changes to blenderproc or want to use the most recent version on the main branch, clone the repository:\n\n```bash\ngit clone https://github.com/DLR-RM/BlenderProc\n```\n\nTo still be able to use the blenderproc command anywhere on your system, make a local pip installation:\n\n```bash\ncd BlenderProc\npip install -e .\n```\n\n## Usage\n\nBlenderProc has to be run inside the blender python environment, as the blender API is only accessible there. 
\nTherefore, instead of running your script with the usual python interpreter, the command line interface of BlenderProc has to be used.\n\n```bash\nblenderproc run \u003cyour_python_script\u003e\n```\n\nIn general, one run of your script first loads or constructs a 3D scene, then sets some camera poses inside this scene and renders different types of images (RGB, distance, semantic segmentation, etc.) for each of those camera poses.\nUsually, you will run your script multiple times, each time producing a new scene and rendering e.g. 5-20 images from it.\nWith a little more experience, it is also possible to change scenes during a single script call; read [here](docs/tutorials/key_frames.md#render-multiple-times) how this is done.\n\n## Quickstart\n\nYou can test your BlenderProc pip installation by running\n\n```bash\nblenderproc quickstart\n```\n\nThis is an alias for `blenderproc run quickstart.py`, where `quickstart.py` is:\n\n```python\nimport blenderproc as bproc\nimport numpy as np\n\nbproc.init()\n\n# Create a simple object:\nobj = bproc.object.create_primitive(\"MONKEY\")\n\n# Create a point light next to it\nlight = bproc.types.Light()\nlight.set_location([2, -2, 0])\nlight.set_energy(300)\n\n# Set the camera to be in front of the object\ncam_pose = bproc.math.build_transformation_mat([0, -5, 0], [np.pi / 2, 0, 0])\nbproc.camera.add_camera_pose(cam_pose)\n\n# Render the scene\ndata = bproc.renderer.render()\n\n# Write the rendering into an hdf5 file\nbproc.writer.write_hdf5(\"output/\", data)\n```\n\nBlenderProc creates the specified scene and renders the image into `output/0.hdf5`.\nTo visualize that image, simply call:\n\n```bash\nblenderproc vis hdf5 output/0.hdf5\n```\n\nThat's it! 
You rendered your first image with BlenderProc!\n\n### Debugging in the Blender GUI\n\nTo understand what is actually going on, BlenderProc can visualize everything inside the blender UI.\nTo do so, simply call your script with the `debug` subcommand instead of `run`:\n```bash\nblenderproc debug quickstart.py\n```\n*Make sure that `quickstart.py` actually exists in your working directory.*\n\nNow the Blender UI opens up, the scripting tab is selected and the correct script is loaded.\nTo start the BlenderProc pipeline, you just have to press `Run BlenderProc` (see the red circle in the image).\nAs in the normal mode, print statements are still printed to the terminal.\n\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"images/debug.jpg\" alt=\"Front readme image\" width=500\u003e\n\u003c/p\u003e\n\nThe pipeline can be run multiple times, as the scene is cleared at the beginning of each run.\n\n### Breakpoint-Debugging in IDEs\n\nAs blenderproc runs in blender's separate python environment, debugging your blenderproc script cannot be done in the same way as with any other python script.\nTherefore, remote debugging is necessary, which is explained for vscode and PyCharm in the following:\n\n#### Debugging with vscode\n\nFirst, install the `debugpy` package in blender's python environment.\n\n```\nblenderproc pip install debugpy\n```\n\nNow add the following configuration to your vscode [launch.json](https://code.visualstudio.com/docs/python/debugging#_initialize-configurations).\n\n```json\n{\n    \"name\": \"Attach\",\n    \"type\": \"python\",\n    \"request\": \"attach\",\n    \"connect\": {\n        \"host\": \"localhost\",\n        \"port\": 5678\n    }\n}\n```\n\nFinally, add the following lines to the top (after the imports) of the blenderproc script you want to debug.\n\n```python\nimport debugpy\ndebugpy.listen(5678)\ndebugpy.wait_for_client()\n```\n\nNow run your blenderproc script as usual via the CLI and then 
start the added \"Attach\" configuration in vscode.\nYou are now able to add breakpoints and go through the execution step by step.\n\n#### Debugging with PyCharm Professional\n\nIn PyCharm, go to `Edit configurations...` and create a [new configuration](https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html#remote-debug-config) based on `Python Debug Server`.\nThe configuration will show you, specifically for your version, which pip package to install and which code to add into the script.\nThe following assumes PyCharm 2021.3:\n\nFirst, install the `pydevd-pycharm` package in blender's python environment.\n\n```\nblenderproc pip install pydevd-pycharm~=212.5457.59\n```\n\nNow, add the following code to the top (after the imports) of the blenderproc script you want to debug.\n\n```python\nimport pydevd_pycharm\npydevd_pycharm.settrace('localhost', port=12345, stdoutToServer=True, stderrToServer=True)\n```\n\nThen, first run your `Python Debug Server` configuration in PyCharm and then run your blenderproc script as usual via the CLI.\nPyCharm should then go into debug mode, stopping at the next code line.\nYou are now able to add breakpoints and go through the execution step by step.\n\n## What to do next?\n\nNow that you have run your first BlenderProc script, you're ready to learn the basics:\n\n### Tutorials\n\nRead through the tutorials to get to know the basic principles of how BlenderProc is used:\n\n1. [Loading and manipulating objects](docs/tutorials/loader.md)\n2. [Configuring the camera](docs/tutorials/camera.md)\n3. [Rendering the scene](docs/tutorials/renderer.md)\n4. [Writing the results to file](docs/tutorials/writer.md)\n5. [How key frames work](docs/tutorials/key_frames.md)\n6. [Positioning objects via the physics simulator](docs/tutorials/physics.md)\n\n### Examples\n\nWe provide a lot of [examples](examples/README.md) which explain all features in detail and should help you understand how BlenderProc works. 
Exploring our examples is the best way to learn about what you can do with BlenderProc. We also provide support for some datasets.\n\n* [Basic scene](examples/basics/basic/README.md): Basic example; the ideal place to start for beginners.\n* [Camera sampling](examples/basics/camera_sampling/README.md): Sampling of different camera positions inside of a shape with constraints for the rotation.\n* [Object manipulation](examples/basics/entity_manipulation/README.md): Changing various parameters of objects.\n* [Material manipulation](examples/basics/material_manipulation/README.md): Material selection and manipulation.\n* [Physics positioning](examples/basics/physics_positioning/README.md): Enabling simple simulated physical interactions between objects in the scene.\n* [Semantic segmentation](examples/basics/semantic_segmentation/README.md): Generating semantic segmentation labels for a given scene.\n* [BOP Challenge](README_BlenderProc4BOP.md): Generate the pose-annotated data used at the BOP Challenge 2020.\n* [COCO annotations](examples/advanced/coco_annotations/README.md): Write COCO annotations to a .json file for selected objects in the scene.\n\nand much more; see our [examples](examples/README.md) for more details.\n\n\n## Contributions\n\nFound a bug? Help us by reporting it. Want a new feature in the next BlenderProc release? Create an issue. Made something useful or fixed a bug? Start a PR. Check the [contribution guidelines](CONTRIBUTING.md).\n\n## Change log\n\nSee our [change log](change_log.md).\n\n## Citation\n\nIf you use BlenderProc in a research project, please cite as follows:\n\n```\n@article{Denninger2023,\n    doi = {10.21105/joss.04901},\n    url = {https://doi.org/10.21105/joss.04901},\n    year = {2023},\n    publisher = {The Open Journal},\n    volume = {8},\n    number = {82},\n    pages = {4901},\n    author = {Maximilian Denninger and Dominik Winkelbauer and Martin Sundermeyer and Wout Boerdijk and Markus Knauer and Klaus H. 
Strobl and Matthias Humt and Rudolph Triebel},\n    title = {BlenderProc2: A Procedural Pipeline for Photorealistic Rendering}, \n    journal = {Journal of Open Source Software}\n} \n```\n\n---\n\n\u003cdiv align=\"center\"\u003e\n  \u003ca href=\"https://www.dlr.de/EN/Home/home_node.html\"\u003e\u003cimg src=\"images/logo.svg\" hspace=\"3%\" vspace=\"60px\"\u003e\u003c/a\u003e\n\u003c/div\u003e\n","funding_links":[],"categories":["Datasets","Uncategorized","🔮Add-ons [^](#table)"],"sub_categories":["Sensor and Acuator Interfaces","Uncategorized","🪀Misc [^](#table)"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDLR-RM%2FBlenderProc","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FDLR-RM%2FBlenderProc","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDLR-RM%2FBlenderProc/lists"}