{"id":13529093,"url":"https://github.com/googlesamples/arcore-depth-lab","last_synced_at":"2025-05-16T16:01:48.731Z","repository":{"id":40411250,"uuid":"272831349","full_name":"googlesamples/arcore-depth-lab","owner":"googlesamples","description":"ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. (UIST 2020)","archived":false,"fork":false,"pushed_at":"2024-06-05T20:28:30.000Z","size":80334,"stargazers_count":819,"open_issues_count":5,"forks_count":154,"subscribers_count":33,"default_branch":"master","last_synced_at":"2025-04-12T14:17:27.587Z","etag":null,"topics":["ar","arcore","arcore-unity","depth","depth-api","depthlab","interaction","mobile"],"latest_commit_sha":null,"homepage":"https://augmentedperception.github.io/depthlab/","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/googlesamples.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-06-16T23:16:30.000Z","updated_at":"2025-04-09T23:19:00.000Z","dependencies_parsed_at":"2024-11-02T15:42:16.599Z","dependency_job_id":null,"html_url":"https://github.com/googlesamples/arcore-depth-lab","commit_stats":null,"previous_names":[],"tags_count":3,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/googlesamples%2Farcore-depth-lab","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/googlesamples%2Farcore-depth-lab/tags","releases_url":"https://repos.ecosyste.ms/api
/v1/hosts/GitHub/repositories/googlesamples%2Farcore-depth-lab/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/googlesamples%2Farcore-depth-lab/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/googlesamples","download_url":"https://codeload.github.com/googlesamples/arcore-depth-lab/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248578876,"owners_count":21127714,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ar","arcore","arcore-unity","depth","depth-api","depthlab","interaction","mobile"],"created_at":"2024-08-01T07:00:33.019Z","updated_at":"2025-04-12T14:17:31.614Z","avatar_url":"https://github.com/googlesamples.png","language":"C#","readme":"# ARCore Depth Lab - Depth API Samples for Unity\n\nCopyright 2020 Google LLC\n\n**Depth Lab** is a set of ARCore Depth API samples that provides assets using\ndepth for advanced geometry-aware features in AR interaction and rendering. Some\nof these features have been used in this\n[Depth API overview](https://www.youtube.com/watch?v=VOVhCTb-1io) video.\n\n[![DepthLab examples](depthlab.gif)](https://augmentedperception.github.io/depthlab)\n\n[**ARCore Depth API**](https://developers.google.com/ar/develop/unity/depth/overview)\nis enabled on a subset of ARCore-certified Android devices. **iOS devices\n(iPhone, iPad) are not supported**. 
Find the list of devices with Depth API\nsupport (marked with **Supports Depth API**) here:\n[https://developers.google.com/ar/devices](https://developers.google.com/ar/discover/supported-devices).\nSee the [ARCore developer documentation](https://developers.google.com/ar) for\nmore information.\n\nDownload the pre-built ARCore Depth Lab app on\n[Google Play Store](https://play.google.com/store/apps/details?id=com.google.ar.unity.arcore_depth_lab)\ntoday.\n\n[\u003cimg alt=\"Get ARCore Depth Lab on Google Play\" height=\"50px\" src=\"https://play.google.com/intl/en_us/badges/images/apps/en-play-badge-border.png\" /\u003e](https://play.google.com/store/apps/details?id=com.google.ar.unity.arcore_depth_lab)\n\n## Branches\n\nARCore Depth Lab has two branches: `master` and `arcore_unity_sdk`.\n\nThe `master` branch contains a subset of Depth Lab features in v1.1.0 and is\nbuilt upon the recommended\n[AR Foundation 4.2.0 (preview 7)](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/manual/index.html)\nor newer. The `master` branch supports features including oriented 3D reticles,\ndepth map visualization, collider with depth mesh, avatar locomotion, raw point\ncloud visualization, recording and playback.\n\nThe `arcore_unity_sdk` branch contains the full features of Depth Lab and is\nbuilt upon\n[ARCore SDK for Unity v1.24.0](https://github.com/google-ar/arcore-unity-sdk/releases)\nor newer. We recommend using the `master` branch to build new projects with the\nAR Foundation SDK and refer to this branch when necessary.\n\n## Getting started\n\nThese samples target\n[**Unity 2020.3.6f1**](https://unity3d.com/get-unity/download/archive) and\nrequire\n[**AR Foundation 4.2.0-pre.7**](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.2/manual/index.html)\nor newer,\n[ARCore Extensions](https://developers.google.com/ar/develop/unity-arf) **1.24**\nor newer. 
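\n\nAs a rough sketch, the corresponding `Packages/manifest.json` dependencies could look like the following (the package names and versions here are assumptions for illustration; the Unity and ARCore Extensions installation guides are authoritative):\n\n```json\n{\n  \"dependencies\": {\n    \"com.unity.xr.arfoundation\": \"4.2.0-pre.7\",\n    \"com.google.ar.core.arfoundation.extensions\": \"https://github.com/google-ar/arcore-unity-extensions.git\"\n  }\n}\n```\n\n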
The [ARCore Extensions sources](https://github.com/google-ar/arcore-unity-extensions) \nare automatically included via the Unity package manager.\n\nThis project only builds with the Build Platform **Android**. Build the project\nto an Android device instead of using the **Play** button in the Unity editor.\n\n## Sample features\n\nThe sample scenes demonstrate three different ways to access depth. Supported\nfeatures in the `master` branch are labeled with :star:, while the remaining\nfeatures can be found in the `arcore_unity_sdk` branch.\n\n1.  **Localized depth**: Sample single depth values at certain texture\n    coordinates (CPU).\n    *   Oriented 3D reticles :star:\n    *   Character locomotion on uneven terrain :star:\n    *   Collision checking for AR object placement\n    *   Laser beam reflections\n    *   Rain and snow particle collision\n2.  **Surface depth**: Create a connected mesh representation of the depth data\n    (CPU/GPU).\n    *   Point cloud fusion :star:\n    *   AR shadow receiver\n    *   Paint splat\n    *   Physics simulation\n    *   Surface retexturing\n3.  **Dense depth**: Process depth data at every screen pixel (GPU).\n    *   False-color depth map :star:\n    *   AR fog\n    *   Occlusions\n    *   Depth-of-field blur\n    *   Environment relighting\n    *   3D photo\n\n## Building samples\n\nIndividual scenes can be built and run by enabling a particular scene (e.g.,\n`OrientedReticle` to try out the oriented 3D reticle) and the\n`ARFDepthComponents` object in the scene. Remember to disable the\n`ARFDepthComponents` object in individual scenes when building all demos with\nthe `DemoCarousel` scene.\n\nWe also provide a demo user interface that allows users to seamlessly switch\nbetween examples. Please make sure to set the **Build Platform** to **Android**\nand verify that the main `DemoCarousel` scene is the first enabled scene in the\n**Scenes In Build** list under **Build Settings**. 
Enable all scenes that are\npart of the demo user interface.\n\n`Assets/ARRealismDemos/DemoCarousel/Scenes/DemoCarousel.unity\nAssets/ARRealismDemos/OrientedReticle/Scenes/OrientedReticle.unity\nAssets/ARRealismDemos/DepthEffects/Scenes/DepthEffects.unity\nAssets/ARRealismDemos/Collider/Scenes/Collider.unity\nAssets/ARRealismDemos/AvatarLocomotion/Scenes/AvatarLocomotion.unity\nAssets/ARRealismDemos/PointCloud/Scenes/RawPointClouds.unity`\n\nThe following scenes can be found in the `arcore_unity_sdk` branch, but are not\nyet available with the AR Foundation SDK.\n\n`Assets/ARRealismDemos/MaterialWrap/Scenes/MaterialWrap.unity\nAssets/ARRealismDemos/Splat/Scenes/OrientedSplat.unity\nAssets/ARRealismDemos/LaserBeam/Scenes/LaserBeam.unity\nAssets/ARRealismDemos/Relighting/Scenes/PointsRelighting.unity\nAssets/ARRealismDemos/DepthEffects/Scenes/FogEffect.unity\nAssets/ARRealismDemos/SnowParticles/Scenes/ArCoreSnowParticles.unity\nAssets/ARRealismDemos/RainParticles/Scenes/RainParticlesScene.unity\nAssets/ARRealismDemos/DepthEffects/Scenes/DepthOfFieldEffect.unity\nAssets/ARRealismDemos/Water/Scenes/Water.unity\nAssets/ARRealismDemos/CollisionDetection/Scenes/CollisionAwareObjectPlacement.unity\nAssets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/ScreenSpaceDepthMesh.unity\nAssets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/StereoPhoto.unity`\n\n## Sample project structure\n\nThe main sample assets are placed inside the `Assets/ARRealismDemos` folder.\nEach subfolder contains sample features or helper components.\n\n### `AvatarLocomotion`\n\nThe AR character in this scene follows user-set waypoints while staying close to\nthe surface of an uneven terrain. 
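\n\nA hedged sketch of the terrain-following idea (the component below is illustrative only; `DepthSource.GetVertexInWorldSpaceFromScreenXY` is a real helper described under `LaserBeam`, though its exact signature may differ):\n\n```csharp\nusing UnityEngine;\n\n// Illustrative only: move an avatar toward a waypoint while snapping its\n// height to the depth-estimated surface under its screen position.\npublic class TerrainFollowerSketch : MonoBehaviour\n{\n    public Transform waypoint;\n    public float speed = 0.5f;  // meters per second\n\n    void Update()\n    {\n        // Advance toward the current waypoint.\n        Vector3 target = Vector3.MoveTowards(\n            transform.position, waypoint.position, speed * Time.deltaTime);\n\n        // Project the target to screen space and look up the depth-based\n        // world vertex there to find the terrain height at that pixel.\n        Vector3 screen = Camera.main.WorldToScreenPoint(target);\n        Vector3 surface = DepthSource.GetVertexInWorldSpaceFromScreenXY(\n            (int)screen.x, (int)screen.y);\n        target.y = surface.y;\n\n        transform.position = target;\n    }\n}\n```\n\n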
This scene uses raycasting and depth lookups\non the CPU to calculate a 3D point on the surface of the terrain.\n\n### `Collider`\n\nThis physics simulation playground uses screen-space depth meshes to enable\ncollisions between Unity's rigid-body objects and the physical environment.\n\nAfter pressing an on-screen button, a `Mesh` object is procedurally generated\nfrom the latest depth map. This is used to update the `sharedMesh` parameter of\nthe `MeshCollider` object. A randomly selected primitive rigid-body object is\nthen thrown into the environment.\n\n### `CollisionDetection`\n\nThis AR object placement scene uses depth lookups on the CPU to test collisions\nbetween the vertices of virtual objects and the physical environment.\n\n### `Common`\n\nThis folder contains scripts and prefabs that are shared between the feature\nsamples. For more details, see the [`Helper Classes`](#helper-classes) section\nbelow.\n\n### `DemoCarousel`\n\nThis folder contains the main scene, which provides a carousel user interface.\nThis scene allows the user to seamlessly switch between different features. A\nscene can be selected by directly touching a preview thumbnail or dragging the\ncarousel UI to the desired position.\n\n### `DepthEffects`\n\nThis folder contains three dense depth shader processing examples.\n\nThe `DepthEffects` scene contains a fragment-shader effect that can transition\nfrom the AR camera view to a false-color depth map. Warm colors indicate closer\nregions in the depth map. Cold colors indicate regions farther away.\n\nThe `DepthOfFieldEffect` scene contains a simulated **Bokeh** fragment-shader\neffect. This blurs the regions of the AR view that are not at the user-defined\nfocus distance. The focus anchor is set in the physical environment by touching\nthe screen. 
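\n\nA hedged sketch of how a touch could set the focus anchor (the component, the material property name, and the exact helper signature below are assumptions for illustration, not the scene's actual code):\n\n```csharp\nusing UnityEngine;\n\n// Illustrative only: anchor the focus distance of a depth-of-field material\n// to the physical point under the user's most recent touch.\npublic class FocusAnchorSketch : MonoBehaviour\n{\n    public Material depthOfFieldMaterial;  // hypothetical Bokeh material\n    private Vector3 _focusAnchor;          // world-space point kept in focus\n\n    void Update()\n    {\n        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)\n        {\n            // Look up the depth-based world point under the touch.\n            Vector2 touch = Input.GetTouch(0).position;\n            _focusAnchor = DepthSource.GetVertexInWorldSpaceFromScreenXY(\n                (int)touch.x, (int)touch.y);\n        }\n\n        // Re-derive the focus distance every frame so the anchored point\n        // stays sharp as the camera moves.\n        float focusDistance = Vector3.Distance(\n            Camera.main.transform.position, _focusAnchor);\n        depthOfFieldMaterial.SetFloat(\"_FocusDistance\", focusDistance);\n    }\n}\n```\n\n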
The focus anchor is a 3D point that is locked to the environment and\nalways in focus.\n\nThe `FogEffect` scene contains a fragment-shader effect that adds a virtual fog\nlayer on the physical environment. Close objects will be more visible than\nobjects farther away. A slider controls the density of the fog.\n\n### `LaserBeam`\n\nThis laser reflection scene allows the user to shoot a slowly moving laser beam\nby touching anywhere on the screen.\n\nThis uses:\n\n*   The `DepthSource.GetVertexInWorldSpaceFromScreenXY(..)` function to look up\n    a raycasted 3D point, and\n*   The `ComputeNormalMapFromDepthWeightedMeanGradient(..)` function to look up\n    the surface normal based on a provided 2D screen position.\n\n### `MaterialWrap`\n\nThis experience allows the user to change the material of real-world surfaces\nthrough touch. This uses depth meshes.\n\n### `OrientedReticle`\n\nThis sample uses depth hit testing to obtain the 3D position and\nsurface normal of a raycasted screen point.\n\n### `PointCloud`\n\nThis sample computes a point cloud on the CPU using the depth array. Press the\n**Update** button to compute a point cloud based on the latest depth data.\n\n### `RawPointClouds`\n\nThis sample fuses point clouds with the raw depth maps on the CPU using the\ndepth array. Drag the **confidence** slider to change the visibility of each\npoint based on the confidence value of the corresponding raw depth.\n\n### `RainParticles`\n\nThis sample uses the GPU depth texture to compute collisions between rain\nparticles and the physical environment.\n\n### `Relighting`\n\nThis sample uses the GPU depth texture to computationally re-light the physical\nenvironment through the AR camera. Areas of the physical environment close to\nthe artificial light sources are lit, while areas farther away are darkened.\n\n### `ScreenSpaceDepthMesh`\n\nThis sample uses depth meshes. A template mesh containing a regular grid of\ntriangles is created once on the CPU. 
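\n\nCreating such a template grid might look like the following sketch (sizes and names are illustrative, not the project's actual implementation):\n\n```csharp\nusing UnityEngine;\n\n// Illustrative only: build a regular width x height grid with two triangles\n// per cell. A vertex shader later displaces each vertex using the\n// reprojected GPU depth texture.\npublic static class DepthMeshSketch\n{\n    public static Mesh CreateTemplateGrid(int width, int height)\n    {\n        var vertices = new Vector3[width * height];\n        for (int y = 0; y < height; ++y)\n            for (int x = 0; x < width; ++x)\n                vertices[y * width + x] = new Vector3(x, y, 0);\n\n        var indices = new int[(width - 1) * (height - 1) * 6];\n        int i = 0;\n        for (int y = 0; y < height - 1; ++y)\n        {\n            for (int x = 0; x < width - 1; ++x)\n            {\n                int v = y * width + x;\n                indices[i++] = v;              // first triangle of the cell\n                indices[i++] = v + width;\n                indices[i++] = v + 1;\n                indices[i++] = v + 1;          // second triangle of the cell\n                indices[i++] = v + width;\n                indices[i++] = v + width + 1;\n            }\n        }\n\n        return new Mesh { vertices = vertices, triangles = indices };\n    }\n}\n```\n\n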
The GPU shader displaces each vertex of\nthe regular grid based on the reprojection of the depth values provided by the\nGPU depth texture. Press **Freeze** to take a snapshot of the mesh and press\n**Unfreeze** to revert to the live updating mesh.\n\n### `StereoPhoto`\n\nThis sample uses depth meshes and\n[`ScreenSpaceDepthMesh`](#screenspacedepthmesh). After freezing the mesh, we\ncache the current camera's projection and view matrices, move the camera\nalong a circular path, and perform projection mapping onto the depth mesh with\nthe cached camera image. Press **Capture** to create the animated 3D photo and\npress **Preview** to go back to camera preview mode.\n\n### `SnowParticles`\n\nThis sample uses the GPU depth texture to compute collisions between snow\nparticles and the physical environment, as well as the orientation of each\nsnowflake.\n\n### `Splat`\n\nThis sample uses the [`Oriented Reticle`](#orientedreticle) and the depth mesh\nto place a surface-aligned texture decal within the physical environment.\n\n### `Water`\n\nThis sample uses a modified GPU occlusion shader to create a flooding effect\nwith artificial water in the physical environment.\n\n## Helper classes\n\n### `DepthSource`\n\nA singleton instance of this class contains references to the CPU array and GPU\ntexture of the depth map, camera intrinsics, and many other depth-lookup and\ncoordinate-transformation utilities. This class acts as a high-level wrapper for\nthe [`MotionStereoDepthDataSource`](#motionstereodepthdatasource) class.\n\n### `DepthTarget`\n\nEach `GameObject` containing a `DepthTarget` becomes a subscriber to the GPU\ndepth data. `DepthSource` will automatically update the depth data for each\n`DepthTarget`. At least one instance of `DepthTarget` must be present in the\nscene for `DepthSource` to provide depth data.\n\n### `MotionStereoDepthDataSource`\n\nThis class contains low-level operations and direct access to the depth data. 
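\n\nFor a flavor of what that low-level access involves, here is a hedged sketch of sampling a single depth value from a CPU depth array (assumes 16-bit depth values in millimeters, as in ARCore's DEPTH16 images; the names are illustrative, not this class's actual API):\n\n```csharp\nusing UnityEngine;\n\n// Illustrative only: fetch the depth, in meters, at normalized texture\n// coordinates (u, v) from a row-major array of 16-bit millimeter depths.\npublic static class DepthSamplingSketch\n{\n    public static float SampleDepthMeters(\n        short[] depth, int width, int height, float u, float v)\n    {\n        int x = Mathf.Clamp((int)(u * width), 0, width - 1);\n        int y = Mathf.Clamp((int)(v * height), 0, height - 1);\n        short millimeters = depth[y * width + x];\n        return millimeters * 0.001f;  // 0 means \"no depth estimate\"\n    }\n}\n```\n\n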
This low-level API\nshould only be used by advanced developers.\n\n## User privacy requirements\n\nYou must prominently disclose the use of Google Play Services for AR (ARCore)\nand how it collects and processes data in your application. This information\nmust be easily accessible to end users. You can do this by adding the following\ntext on your main menu or notice screen: \"This application runs on\n[Google Play Services for AR](//play.google.com/store/apps/details?id=com.google.ar.core)\n(ARCore), which is provided by Google LLC and governed by the\n[Google Privacy Policy](//policies.google.com/privacy)\".\n\n## Related publication\n\nPlease refer to https://augmentedperception.github.io/depthlab/ for our paper,\nsupplementary material, and presentation published in ACM UIST 2020: \"DepthLab:\nReal-Time 3D Interaction With Depth Maps for Mobile Augmented Reality\".\n\n## References\n\nIf you use ARCore Depth Lab in your research, please reference it as:\n\n```bibtex\n@inproceedings{Du2020DepthLab,\n  title = {{DepthLab: Real-time 3D Interaction with Depth Maps for Mobile Augmented Reality}},\n  author = {Du, Ruofei and Turner, Eric and Dzitsiuk, Maksym and Prasso, Luca and Duarte, Ivo and Dourgarian, Jason and Afonso, Joao and Pascoal, Jose and Gladstone, Josh and Cruces, Nuno and Izadi, Shahram and Kowdle, Adarsh and Tsotsos, Konstantine and Kim, David},\n  booktitle = {Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology},\n  year = {2020},\n  publisher = {ACM},\n  pages = {829--843},\n  series = {UIST '20},\n  doi = {10.1145/3379337.3415881}\n}\n```\n\nor\n\n```\nRuofei Du, Eric Turner, Maksym Dzitsiuk, Luca Prasso, Ivo Duarte, Jason Dourgarian, Joao Afonso, Jose Pascoal, Josh Gladstone, Nuno Cruces, Shahram Izadi, Adarsh Kowdle, Konstantine Tsotsos, and David Kim. 2020. DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality. 
Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST '20), 829-843. DOI: http://dx.doi.org/10.1145/3379337.3415881.\n```\n\nWe would also like to thank Levana Chen, Xinyun Huang, and Ted Bisson for\nintegrating DepthLab with AR Foundation.\n\n## Additional information\n\nYou may use this software under the\n[Apache 2.0 License](https://github.com/googlesamples/arcore-depth-lab/blob/master/LICENSE).\n","funding_links":[],"categories":["6. Mobile End SLAM"],"sub_categories":["6.3 Augmented Reality"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgooglesamples%2Farcore-depth-lab","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgooglesamples%2Farcore-depth-lab","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgooglesamples%2Farcore-depth-lab/lists"}