{"id":17222870,"url":"https://github.com/cheind/pytorch-blender","last_synced_at":"2025-04-04T18:06:33.512Z","repository":{"id":41040740,"uuid":"169447021","full_name":"cheind/pytorch-blender","owner":"cheind","description":":sweat_drops: Seamless, distributed, real-time integration of Blender into PyTorch data pipelines","archived":false,"fork":false,"pushed_at":"2023-10-28T04:32:11.000Z","size":29454,"stargazers_count":580,"open_issues_count":9,"forks_count":47,"subscribers_count":31,"default_branch":"develop","last_synced_at":"2025-03-28T17:09:19.236Z","etag":null,"topics":["blender","openai-gym","pytorch","reinforcement-learning","zmq"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/cheind.png","metadata":{"files":{"readme":"Readme.md","changelog":"Changelog.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-02-06T17:32:38.000Z","updated_at":"2025-03-24T21:23:35.000Z","dependencies_parsed_at":"2022-08-10T01:29:32.586Z","dependency_job_id":"f915302a-9b83-4016-9822-f95cb2180607","html_url":"https://github.com/cheind/pytorch-blender","commit_stats":null,"previous_names":[],"tags_count":4,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cheind%2Fpytorch-blender","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cheind%2Fpytorch-blender/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cheind%2Fpytorch-blender/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cheind%2Fpytorch-blender/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/cheind","download_url":"https://codeload.github.com/cheind/pytorch-blender/tar.gz/refs/heads/develop","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247226213,"owners_count":20904465,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["blender","openai-gym","pytorch","reinforcement-learning","zmq"],"created_at":"2024-10-15T04:06:35.102Z","updated_at":"2025-04-04T18:06:33.495Z","avatar_url":"https://github.com/cheind.png","language":"Python","readme":"[![Build Status](https://github.com/cheind/pytorch-blender/actions/workflows/python-package.yml/badge.svg)](https://github.com/cheind/pytorch-blender/actions/workflows/python-package.yml)\n\n# blendtorch\n\n**blendtorch** is a Python framework to seamlessly integrate Blender into PyTorch for deep learning from artificial visual data. We utilize Eevee, a new physically based real-time renderer, to synthesize images and annotations in real-time and thus avoid stalling model training in many cases. 
\n\nIf you find the project helpful, you consider [citing](#cite_anchor) it.\n\nFeature summary\n - ***Data Generation***: Stream distributed Blender renderings directly into PyTorch data pipelines in real-time for supervised learning and domain randomization applications. Supports arbitrary pickle-able objects to be send alongside images/videos. Built-in recording capability to replay data without Blender. Bi-directional communication channels allow Blender simulations to adapt during network training. \u003c/br\u003eMore info [\\[examples/datagen\\]](examples/datagen), [\\[examples/compositor_normals_depth\\]](examples/compositor_normals_depth),  [\\[examples/densityopt\\]](examples/densityopt)\n - ***OpenAI Gym Support***: Create and run remotely controlled Blender gyms to train reinforcement agents. Blender serves as simulation, visualization, and interactive live manipulation environment.\n \u003c/br\u003eMore info [\\[examples/control\\]](examples/control)\n\nThe figure below visualizes the basic concept of **blendtorch** used in the context of generating artificial training data for a real-world detection task.\n\n\u003cdiv align=\"center\"\u003e\n\u003cimg src=\"etc/blendtorch_intro_v3.svg\" width=\"90%\"\u003e\u003cbr\u003e\n\u003csup\u003e\u003cstrong\u003eFig 1:\u003c/strong\u003e With Blendtorch, you are able to train your PyTorch modules on massively randomized artificial data generated by Blender simulations.\u003c/sup\u003e\n\u003c/div\u003e\n\n## Getting started\n 1. Read the installation instructions below\n 1. To get started with **blendtorch** for training data training read [\\[examples/datagen\\]](examples/datagen). \n 1. To learn about using **blendtorch** for creating reinforcement training environments read [\\[examples/control\\]](examples/control).\n\n## Prerequisites\nThis package has been tested with\n - [Blender](https://www.blender.org/) \u003e= 2.83/2.91/3.0/3.1 (Python \u003e= 3.7)\n - [PyTorch](http://pytorch.org) \u003e= 1.5/1.10 (Python \u003e= 3.7)\n\nrunning Windows 10 and Linux. Other versions might work as well, but have not been tested. \n\n## Installation\n\n**blendtorch** is composed of two distinct sub-packages: \n - `bendtorch.btt` located in [pkg_pytorch](./pkg_pytorch) and \n - `blendtorch.btb` located in [pkg_blender](./pkg_blender),\n\nproviding the PyTorch and Blender views on **blendtorch**. `bendtorch.btt` will be installed to your local Python environment, while `blendtorch.btb` will be installed to the Python environment that ships with Blender.\n\n1. Clone this repository\n    \n    ```\n    git clone https://github.com/cheind/pytorch-blender.git \u003cDST\u003e\n    ```\n1. Extend `PATH`    \n    \n    Ensure Blender executable is in your environments lookup `PATH`. On Windows this can be accomplished by\n    ```\n    set PATH=c:\\Program Files\\Blender Foundation\\Blender 2.91;%PATH%\n    ```\n    On Ubuntu when blender is [installed using snap](https://snapcraft.io/install/blender/ubuntu), the path may be included by adding the following line to your ~/.bashrc,\n    ```\n    export PATH=/snap/blender/current/${PATH:+:${PATH}}\n    ```\n1. Complete Blender settings\n    \n    Open Blender at least once, and complete the initial settings. If this step is missed, some of the tests (especially the tests relating RL) will fail (Blender 2.91).\n1. 
## Troubleshooting
Run
```
blender --version
```
and check whether the correct Blender version (>=2.83) is written to the console. Next, ensure that `blendtorch-btb` installed correctly
```
blender --background --python-use-system-env --python-expr "import blendtorch.btb as btb; print(btb.__version__)"
```
which should print the **blendtorch** version number on success. Finally, ensure that `blendtorch-btt` installed correctly
```
python -c "import blendtorch.btt as btt; print(btt.__version__)"
```
which should print the **blendtorch** version number on success.

## Architecture
Please see [\[examples/datagen\]](examples/datagen) and [\[examples/control\]](examples/control) for an in-depth architectural discussion. Bi-directional communication is explained in [\[examples/densityopt\]](examples/densityopt).

## Runtimes

The following table shows the mean runtimes per batch (batch size 8) and per image for a simple Cube scene (640x480xRGBA). See [benchmarks/benchmark.py](./benchmarks/benchmark.py) for details. The timings include rendering, transfer, decoding and batch collating. Reported timings are for Blender 2.8. Blender 2.9 performs equally well on this scene, but is usually faster for more complex renderings.

| Blender Instances  | Runtime sec/batch | Runtime sec/image | Arguments|
|:-:|:-:|:-:|:-:|
| 1  | 0.236 | 0.030 | UI refresh|
| 2  | 0.140 | 0.018 | UI refresh|
| 4  | 0.099 | 0.012 | UI refresh|
| 5  | 0.085 | 0.011 | no UI refresh|

Note: If no image transfer is needed, e.g. in reinforcement learning of physical simulations, 2000 Hz is easily achieved.
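To get comparable numbers in your own setup, a plain wall-clock measurement over the `DataLoader` suffices. The helper below is not the project's benchmark script (see [benchmarks/benchmark.py](./benchmarks/benchmark.py) for that); it is a minimal sketch reusing the `dl` loader from the data-generation example above.

```python
import time

def measure(dl, batch_size, n_batches=50):
    """Print mean sec/batch and sec/image over n_batches of DataLoader dl."""
    it = iter(dl)
    next(it)  # warm-up: first batch includes Blender startup and pipeline fill
    t0 = time.time()
    for _ in range(n_batches):
        next(it)
    dt = time.time() - t0
    print(f"{dt / n_batches:.3f} sec/batch, "
          f"{dt / (n_batches * batch_size):.3f} sec/image")
```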
<a name="cite_anchor"></a>
## Cite
The code accompanies our academic work [[1]](https://arxiv.org/abs/1907.01879), [[2]](https://arxiv.org/abs/2010.11696) in the field of machine learning from artificial images. Please consider the following publications when citing **blendtorch**
```
@inproceedings{blendtorch_icpr2020_cheind,
    author = {Christoph Heindl and Lukas Brunner and Sebastian Zambal and Josef Scharinger},
    title = {BlendTorch: A Real-Time, Adaptive Domain Randomization Library},
    booktitle = {
        1st Workshop on Industrial Machine Learning
        at International Conference on Pattern Recognition (ICPR2020)
    },
    year = {2020},
}

@inproceedings{robotpose_etfa2019_cheind,
    author = {Christoph Heindl and Sebastian Zambal and Josef Scharinger},
    title = {Learning to Predict Robot Keypoints Using Artificially Generated Images},
    booktitle = {
        24th IEEE International Conference on
        Emerging Technologies and Factory Automation (ETFA)
    },
    year = {2019}
}
```

## Caveats
- Although offscreen rendering is supported in Blender 2.8x, it requires a UI frontend and thus cannot run in `--background` mode. If your application does not require offscreen renderings, you may enable background usage (see [tests/](tests/) for examples).
- The renderings produced by Blender are by default in linear color space and thus will appear darker than expected when displayed.
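To preview such linear renderings at the expected brightness, apply the standard linear-to-sRGB transfer function before display. This helper is an illustration added here, not part of **blendtorch**.

```python
import numpy as np

def linear_to_srgb(img: np.ndarray) -> np.ndarray:
    """Convert a linear-light float image in [0, 1] to sRGB for display."""
    img = np.clip(img, 0.0, 1.0)
    # Piecewise sRGB transfer: linear segment near black, gamma 1/2.4 above.
    return np.where(
        img <= 0.0031308,
        img * 12.92,
        1.055 * np.power(img, 1.0 / 2.4) - 0.055,
    )
```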