https://github.com/diyer22/bpycv
Computer vision utils for Blender (generate instance annotation, depth and 6D pose with one line of code)
6dof-pose blender blender-cv blender-python computer-vision data-synthesis dataset-generation deep-learning depth instance-segmentation synthetic-datasets ycb
- Host: GitHub
- URL: https://github.com/diyer22/bpycv
- Owner: DIYer22
- License: MIT
- Created: 2019-12-29T12:46:59.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2025-08-05T08:04:58.000Z (2 months ago)
- Last Synced: 2025-09-04T20:03:17.488Z (about 1 month ago)
- Topics: 6dof-pose, blender, blender-cv, blender-python, computer-vision, data-synthesis, dataset-generation, deep-learning, depth, instance-segmentation, synthetic-datasets, ycb
- Language: Python
- Homepage:
- Size: 327 KB
- Stars: 489
- Watchers: 18
- Forks: 59
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# `bpycv`: Computer Vision and Deep Learning Utils for Blender
### Contents: [Features](#-features) | [Install](#-install) | [Demo](#-demo) | [Tips](#-tips)

*Figure.1 Rendered instance annotation, RGB image and depth*

## ▮ Features
- [x] Render annotations for semantic segmentation, instance segmentation and panoptic segmentation
- [x] Generate 6DoF pose ground truth
- [x] Render depth ground truth
- [x] Pre-defined domain randomization:
  - [Light and background](https://github.com/DIYer22/bpycv_example_data/tree/main/background_and_light), automatically downloaded from [HDRI Haven](https://hdrihaven.com/hdris/)
  - [Distractors](https://arxiv.org/pdf/1804.06516) from [ShapeNet](https://shapenet.org/) (e.g. the vase, painting and pallet in Figure.1)
  - Textures from [Texture Haven](https://texturehaven.com/textures/)
- [x] Easy installation and demo running
- [x] Docker support: `docker run -v /tmp:/tmp diyer22/bpycv` (see [Dockerfile](Dockerfile))
- [x] A Python codebase for building synthetic datasets (see [example/ycb_demo.py](example/ycb_demo.py))
- [x] Conversion to [Cityscapes annotation format](https://github.com/DIYer22/bpycv/issues/38)
- [x] Easy development and debugging due to:
  - No complicated packaging
  - Use of Blender's native API and calling conventions

**Our Projects that rely on `bpycv`:**
- 🔬 Research: [Building Five Irregular Shape Instance Segmentation Datasets](https://github.com/iShape/build_synthetic_ishape)
- 🎮 Demo: [3D Reconstruction + Blender Data Synthesis + AutoML == Free Detection and Segmentation Model](https://www.youtube.com/watch?v=qTKNQu30rhM)
- 🥈 Achievement: [We won 2nd place in the IROS 2020 Open Cloud Robot Table Organization Challenge (OCRTOC)](https://github.com/DIYer22/bpycv/issues/15)

## ▮ Install
`bpycv` supports Blender 2.9, 3.x and 4.x.

1. Download and install Blender [here](https://www.blender.org/download/).
2. Open the Blender directory in a terminal and run the install script:
```bash
# For Windows users: ensure PowerShell has administrator permission
# For MacOS users: replace ./blender with /Applications/Blender.app/Contents/MacOS/Blender

# Ensure pip: equal to /blender-path/3.xx/python/bin/python3.10 -m ensurepip
./blender -b --python-expr "from subprocess import sys,call;call([sys.executable,'-m','ensurepip'])"
# Update pip toolchain
./blender -b --python-expr "from subprocess import sys,call;call([sys.executable]+'-m pip install -U pip setuptools wheel'.split())"
# pip install bpycv
./blender -b --python-expr "from subprocess import sys,call;call([sys.executable]+'-m pip install -U bpycv'.split())"
# Check bpycv ready
./blender -b -E CYCLES --python-expr "import bpycv,cv2;d=bpycv.render_data();bpycv.tree(d);cv2.imwrite('/tmp/try_bpycv_vis(inst-rgb-depth).jpg', d.vis()[...,::-1])"
```

## ▮ Demo
#### 1. Fast Instance Segmentation and Depth Demo
Copy and paste this code into `Scripting/Text Editor` and click the `Run Script` button (or press `Alt+P`):
```python
import cv2
import bpy
import bpycv
import random
import numpy as np

# remove all MESH objects
[bpy.data.objects.remove(obj) for obj in bpy.data.objects if obj.type == "MESH"]

for index in range(1, 20):
    # create a cube or sphere instance at a random location
    location = [random.uniform(-2, 2) for _ in range(3)]
    if index % 2:
        bpy.ops.mesh.primitive_cube_add(size=0.5, location=location)
        categories_id = 1
    else:
        bpy.ops.mesh.primitive_uv_sphere_add(radius=0.5, location=location)
        categories_id = 2
    obj = bpy.context.active_object
    # give each instance a unique inst_id, which is used to generate the instance annotation
    obj["inst_id"] = categories_id * 1000 + index

# render image, instance annotation and depth in one line of code
result = bpycv.render_data()
# result["ycb_meta"] is the 6DoF pose ground truth

# save the RGB image (converted to OpenCV's BGR channel order)
cv2.imwrite("demo-rgb.jpg", result["image"][..., ::-1])

# save the instance map as a 16-bit png;
# the value of each pixel is the inst_id of the object
cv2.imwrite("demo-inst.png", np.uint16(result["inst"]))

# convert depth units from meters to millimeters and save as a 16-bit png
depth_in_mm = result["depth"] * 1000
cv2.imwrite("demo-depth.png", np.uint16(depth_in_mm))

# visualize instance mask, RGB and depth in one image for humans
cv2.imwrite("demo-vis(inst_rgb_depth).jpg", result.vis()[..., ::-1])
print("Saving vis image to: 'demo-vis(inst_rgb_depth).jpg'")
```
Open `./demo-vis(inst_rgb_depth).jpg`:
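To consume these files in a training pipeline, the 16-bit PNGs can be read back with OpenCV. A minimal sketch, assuming the file names and the `inst_id = categories_id * 1000 + index` encoding from the demo above:

```python
import cv2
import numpy as np

# read the 16-bit instance map and depth without 8-bit truncation
inst = cv2.imread("demo-inst.png", cv2.IMREAD_UNCHANGED).astype(np.int32)
depth_mm = cv2.imread("demo-depth.png", cv2.IMREAD_UNCHANGED)

# recover category and per-instance index from the inst_id encoding
categories = inst // 1000  # here: 1 = cube, 2 = sphere (0 assumed to be background)
indices = inst % 1000

# convert depth back from millimeters to meters
depth_m = depth_mm.astype(np.float32) / 1000.0

print("categories present:", np.unique(categories))
```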
#### 2. YCB Demo
```shell
mkdir ycb_demo
cd ycb_demo/

# prepare code and example data
git clone https://github.com/DIYer22/bpycv
git clone https://github.com/DIYer22/bpycv_example_data

cd bpycv/example/
blender -b -P ycb_demo.py
cd dataset/vis/
ls . # visualize result here
# 0.jpg
```
Open the visualized result `ycb_demo/bpycv/example/dataset/vis/0.jpg`:

*instance_map | RGB | depth*

[example/ycb_demo.py](example/ycb_demo.py) includes:
- Domain randomization for background, light and distractor (from ShapeNet)
- Codebase for building synthetic datasets based on the YCB dataset

#### 3. 6DoF Pose Demo
Generate and visualize 6DoF pose GT: [example/6d_pose_demo.py](example/6d_pose_demo.py)
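For a quick look at the pose annotation without running the full demo, here is a minimal sketch. It assumes a scene whose mesh objects already carry an `inst_id`, as in the demo above; the exact layout of `ycb_meta` follows the YCB meta format:

```python
import bpycv

# render once and grab the annotations
result = bpycv.render_data()

# result["ycb_meta"] holds the 6DoF pose ground truth;
# bpycv.tree() pretty-prints its nested structure for inspection
bpycv.tree(result["ycb_meta"])
```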
## ▮ Tips
> Blender may not be able to directly load `.obj` or `.dae` files from the YCB and ShapeNet datasets.
> It's better to convert them first with [`meshlabserver`](https://github.com/cnr-isti-vclab/meshlab/releases) by running `meshlabserver -i raw.obj -o for_blender.obj -m wt`
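After conversion, the mesh can be imported from a script. A minimal sketch (the importer operator name depends on the Blender version, and `for_blender.obj` is the file produced by the command above):

```python
import bpy

# Blender 4.x ships only the new OBJ importer (wm.obj_import);
# older versions use import_scene.obj instead
if hasattr(bpy.ops.wm, "obj_import"):
    bpy.ops.wm.obj_import(filepath="for_blender.obj")
else:
    bpy.ops.import_scene.obj(filepath="for_blender.obj")
```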
**[Suggestions](https://github.com/DIYer22/bpycv/issues) and pull requests are welcome** 😊