# Intel RealSense for Processing [![Build](https://github.com/cansik/realsense-processing/actions/workflows/build.yml/badge.svg)](https://github.com/cansik/realsense-processing/actions/workflows/build.yml)
Intel RealSense 2 support for the [Processing](https://processing.org/) framework.

![Example](readme/rs-examples.jpg)

## Introduction

**Intel RealSense for Processing** is a port of the **[Intel RealSense](https://github.com/IntelRealSense/librealsense)** library for Processing. With this library it is possible to use the Intel RealSense T200 / D400 / D500 camera series within Processing. The idea is **not** to expose the full API to Processing, but to provide a simple and convenient way to work with RealSense devices.
For full API support, switching over to the underlying [Java wrapper](https://github.com/cansik/librealsense-java) is recommended.

Supported Intel RealSense version: [2.53.1](https://github.com/IntelRealSense/librealsense/releases/tag/v2.53.1)

#### Important ⚠️

- If you were using **the old API** (pre-releases, dated Feb 2019) and do not want to update your sketch, download the [1.1.0 library](https://github.com/cansik/realsense-processing/releases/tag/1.1.0) from the releases and [install it manually](https://github.com/processing/processing/wiki/How-to-Install-a-Contributed-Library#manual-install) into your Processing library folder.
- It is not recommended to kill a sketch in Processing without closing the camera.
- The library is still under development.
- `Linux` (x86 / x64 / armhf / arm64), `macOS` (x64) and `Windows` (x86 / x64) binaries are already bundled into the jar file.

#### Supported Configurations
Here are some configurations I have tested and which work with the Intel RealSense D435.
Please make sure you are using a **USB 3.0 or 3.1** cable!

| width | height | fps                         | depth stream | color stream |
|-------|--------|-----------------------------|--------------|--------------|
| 424   | 240    | `6`, `15`, `30`, `60`       | ✅            | ✅            |
| 480   | 270    | `6`, `15`, `30`, `60`, `90` | ✅            | ❌            |
| 640   | 480    | `6`, `15`, `30`, `60`       | ✅            | ✅            |
| 640   | 480    | `90`                        | ✅            | ❌            |
| 848   | 480    | `6`, `15`, `30`, `60`       | ✅            | ✅            |
| 848   | 480    | `90`                        | ✅            | ❌            |
| 960   | 540    | `6`, `15`, `30`, `60`       | ❌            | ✅            |
| 1280  | 720    | `30`                        | ✅            | ✅            |
| 1280  | 800    | `6`, `15`, `30`, `60`, `90` | ❌            | ❌            |
| 1920  | 1080   | `6`, `15`, `30`             | ❌            | ✅            |

## Installation
There are multiple ways to install this library into your project.

### Contribution Manager
Use the Contribution Manager inside Processing to install the library directly into your local Processing instance.

![Contribution Manager](readme/contribution.png)

### Gradle / Maven
Include the library directly in your Gradle / Maven build by using [jitpack](https://jitpack.io/#cansik/realsense-processing/latest).

```groovy
repositories {
    maven { url 'https://jitpack.io' }
}

dependencies {
    implementation 'com.github.cansik:realsense-processing:2.5.0'
}
```

### Manual

Download the [latest build](https://github.com/cansik/realsense-processing/releases/tag/contributed) and extract the files into your [Processing library](https://github.com/processing/processing/wiki/How-to-Install-a-Contributed-Library) folder.

## Example

Here are some examples which show how to use the library.
You will find more [examples here](https://github.com/cansik/realsense-processing/tree/master/examples).
(*The examples have been tested with a RealSense D430.*)

### Camera

To use a RealSense camera within Processing, you have to create a new instance of `RealSenseCamera`. This object gives you access to the whole API.

```processing
import ch.bildspur.realsense.*;

RealSenseCamera camera = new RealSenseCamera(this);

void setup() {
    // check if a camera is available
    boolean a = camera.isDeviceAvailable();

    // check how many cameras are available
    int c = camera.getDeviceCount();
}
```

To start a specific camera device (or multiple of them), check out the [Multi Camera Color Stream example](https://github.com/cansik/realsense-processing/blob/master/examples/MultiCameraColorStream/MultiCameraColorStream.pde). To control an arbitrary number of cameras, check out the [Advanced Device Handling example](https://github.com/cansik/realsense-processing/blob/master/examples/AdvancedDeviceHandling/AdvancedDeviceHandling.pde).

### Streams

RealSense cameras are usually equipped with multiple sensors: mainly video, but also depth and position sensors. To use the data streams of these sensors, you have to enable them before starting the camera. It is possible to use the default values (`640x480 30 FPS`) or set your own settings. A complete list of valid settings can be found in the [RealSense Viewer](https://github.com/IntelRealSense/librealsense/tree/master/tools/realsense-viewer) app.

After enabling the streams, you have to call `readFrames()` every time you want new frames from the camera. If you do not call this method, your streams will stay black or not be updated.

This example activates the color and infrared streams and reads their frame data. The frames provided by video streams are `PImage`s in RGB format.
```processing
void setup()
{
    size(1280, 480);

    camera.enableColorStream(640, 480, 30);
    camera.enableIRStream(640, 480, 30);
    camera.start();
}

void draw()
{
    background(0);

    // read frames
    camera.readFrames();

    // show images
    image(camera.getColorImage(), 0, 0);
    image(camera.getIRImage(), 640, 0);
}
```

#### Infrared

It is important to note that the **D415** and **D430** cameras both support multiple infrared streams. To read both of them, you can tell the camera which one to enable and get.

```processing
import ch.bildspur.realsense.type.*;

void setup() {
    //...
    camera.enableIRStream(640, 480, 30, IRStream.Second);
}

void draw() {
    //...
    image(camera.getIRImage(IRStream.Second), 0, 0);
}
```

#### Depth
The depth stream of a camera is not pixel based. It usually comes as a raw 16-bit stream which can be interpreted as depth data. To view this data, it is possible to enable the `Colorizer` filter, which colorizes the depth data using a color scheme. The scheme can even be changed to eight different presets.

```processing
import ch.bildspur.realsense.*;
import ch.bildspur.realsense.type.*;

RealSenseCamera camera = new RealSenseCamera(this);

void setup()
{
    size(640, 480);

    camera.enableDepthStream(640, 480);
    camera.enableColorizer(ColorScheme.Cold);

    camera.start();
}

void draw()
{
    background(0);

    camera.readFrames();
    image(camera.getDepthImage(), 0, 0);
}
```

### Measure Distance
It is possible to measure the distance on a depth frame by using `getDistance(int x, int y)`.
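Single-pixel readings are often noisy, and invalid pixels come back as `0.0`. A common workaround (not part of the library, just a sketch) is to sample a small window of readings around the pixel of interest and average the valid ones. The averaging logic in plain Java, with hypothetical sample values:

```java
public class DepthWindow {
    // Average the non-zero distances of a window of depth readings (meters).
    // Zero means "no depth available" for that pixel, so it is skipped.
    // Returns 0 if the window contains no valid reading.
    static float averageDistance(float[] readings) {
        float sum = 0;
        int valid = 0;
        for (float d : readings) {
            if (d > 0) {
                sum += d;
                valid++;
            }
        }
        return valid == 0 ? 0 : sum / valid;
    }

    public static void main(String[] args) {
        // hypothetical 3x3 window of getDistance() results in meters
        float[] window = {0.52f, 0.50f, 0.0f, 0.51f, 0.49f, 0.0f, 0.50f, 0.48f, 0.50f};
        System.out.println(averageDistance(window));
    }
}
```

In a sketch, the window would be filled by calling `getDistance` for the neighboring pixel coordinates.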
`getDistance` returns a float which represents the distance from the camera to the selected pixel in *meters*.

```processing
void draw() {
    //...
    float distance = camera.getDistance(mouseX, mouseY);
}
```

It is important to note that depth and color streams are usually not aligned, which makes it impossible to measure depth on a color image. To solve this you have to [align](#alignment) the two streams.

### Projected Point
To project a point into camera space using the depth intrinsics, use the method `getProjectedPoint`, which returns a `PVector` containing the coordinates.

```processing
void draw() {
    //...
    PVector vertex = camera.getProjectedPoint(mouseX, mouseY);
}
```

For more information have a look at the [ProjectedPoint example](https://github.com/cansik/realsense-processing/blob/master/examples/ProjectedPoint/ProjectedPoint.pde).

### Depth Data
To work with the raw depth data, enable the depth stream without the colorizer filter and read the depth data by using `getDepthData()`. This returns a two-dimensional array of `short` in `Y / X` order.

```processing
short[][] data = camera.getDepthData();

for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        int intensity = data[y][x];
    }
}
```

### Alignment
To align all incoming frames to one specific frame (by default `depth` to `color`), enable the alignment as a preprocessor.

```processing
// enable color & depth stream
camera.enableColorStream();
camera.enableDepthStream();

// align the depth to the color stream
camera.enableAlign();

camera.start();
```

### Filters
It is possible to use all the filters offered by the RealSense API inside Processing. Just **add** a filter by using its add method. Some of the filters let you set configuration settings while adding them.
All of them can also be added without arguments (for example `addThresholdFilter()`) to use the default configuration.

```processing
// include the following package for the types
import ch.bildspur.realsense.type.*;

// list of all supported filters
camera.addThresholdFilter(0.0f, 1.0f);
camera.addSpatialFilter(2, 0.5f, 20, 0);
camera.addDecimationFilter(2);
camera.addDisparityTransform(true);
camera.addHoleFillingFilter(HoleFillingType.FarestFromAround);
camera.addTemporalFilter(0.4f, 20, PersistencyIndex.ValidIn2_Last4);

// The following filters have not been tested yet:
camera.addUnitsTransform();
camera.addZeroOrderInvalidationFilter();
```

#### Filter Options

To change the initial filter options, keep the filter block returned by the add method and use the methods provided there. Here is an example of how to use the threshold filter.

```processing
// include the processing package
import ch.bildspur.realsense.processing.*;

// in setup
RSThresholdFilter thresholdFilter = camera.addThresholdFilter();

// in draw
thresholdFilter.setMinDistance(5.0);
thresholdFilter.setMaxDistance(8.0);
```

Check out the [ControlThresholdFilter](examples/ControlThresholdFilter/ControlThresholdFilter.pde) example as well.

### Sensor Options
A RealSense camera usually contains multiple sensors, each with its own options and settings. Currently only the `Depth` and `RGB` sensors are supported. Here is an example of how to set the `Enable Auto Exposure` option on the RGB sensor.

```processing
import org.intel.rs.types.Option;

camera.start();
camera.getRGBSensor().setOption(Option.EnableAutoExposure, 1.0f);
```

RealSense options are always of type **float**, ranging from `min` to `max`, and have a `default` value.
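Because each option only accepts values inside its `min` / `max` range, it can be useful to clamp a requested value before setting it. A minimal sketch of that logic in plain Java (the helper name and the range values are hypothetical; the actual range would come from the sensor's option metadata):

```java
public class OptionRange {
    // Clamp a requested option value into the sensor's valid [min, max] range.
    static float clampOption(float value, float min, float max) {
        return Math.max(min, Math.min(max, value));
    }

    public static void main(String[] args) {
        // e.g. a hypothetical exposure range of 1..10000
        System.out.println(clampOption(20000f, 1f, 10000f));  // clamped to 10000.0
    }
}
```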
Please be aware that setting sensor options is **only possible** after the camera has been started!

### Configuration
It is possible to load a predefined `JSON` file which contains a custom configuration. These configurations can be created in the RealSense Viewer app provided by Intel. To apply a `JSON` configuration, the camera already has to be running:

```processing
// load json config from file
String jsonConfig = String.join("\n", loadStrings("RawStereoConfig.json"));

// enable an example stream and start camera
camera.enableColorStream();
camera.start();

// load a json configuration as a string
camera.setJsonConfiguration(jsonConfig);
```

### Advanced
For more advanced topics, the wrapper allows you to use the underlying Java API through the following getter methods.

```processing
// getters for interacting with the java API
Context context = camera.getContext();
Config config = camera.getConfig();
Pipeline pipeline = camera.getPipeline();
PipelineProfile profile = camera.getPipelineProfile();
FrameList frames = camera.getFrames();
```

Also check out the following example, which uses these API getters to display a point cloud.

- [Pointcloud Example](https://github.com/cansik/realsense-processing/blob/master/examples/PointCloudViewer/PointCloudViewer.pde) (Advanced API)

## FAQ
We try to gather the most frequent questions and answer them here, so we do not have to answer them in every issue.

> The method start(Device) in the type RealSenseCamera is not applicable for the arguments (int, int, int, boolean, boolean)

You are still using the deprecated API. Please update your code to the 2.0 API structure or install the deprecated API as described in the section [Important](#important).

> The image from the RealSense looks distorted and glitchy.

When shutting down the camera without using the `stop()` method, the camera can fall into a bricked state.
Just unplug the camera and plug it back in to reset it.

> The camera directly starts with an error that the device could not be opened.

Either the camera is already in use by another application (RealSense Viewer?) or it is in a bricked state. Just unplug the camera and plug it back in to reset it.

## Build

To build the library yourself, just use the predefined Gradle command. The zipped Processing library will be in the `release` folder.

```bash
# windows
gradlew.bat releaseProcessingLib

# mac / unix
./gradlew releaseProcessingLib
```

## About

The Processing library is maintained by [cansik](https://github.com/cansik) and based on the Intel RealSense [Java wrapper](https://github.com/cansik/librealsense-java).