{"id":13531338,"url":"https://github.com/natar-io/PapARt","last_synced_at":"2025-04-01T19:32:04.830Z","repository":{"id":62884754,"uuid":"66556565","full_name":"natar-io/PapARt","owner":"natar-io","description":"Paper Augmented Reality Toolkit - interactive projection for Processing","archived":false,"fork":false,"pushed_at":"2023-08-02T15:53:47.000Z","size":492300,"stargazers_count":97,"open_issues_count":0,"forks_count":20,"subscribers_count":11,"default_branch":"master","last_synced_at":"2024-05-21T13:53:52.645Z","etag":null,"topics":["ar","augmeted","camera","camera-systems","depth-camera","inria","processing","projector","reality"],"latest_commit_sha":null,"homepage":null,"language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"lgpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/natar-io.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2016-08-25T12:36:53.000Z","updated_at":"2024-03-31T14:18:17.000Z","dependencies_parsed_at":"2023-01-23T08:31:08.374Z","dependency_job_id":"f0acac7d-1fc9-4c3f-b934-277896b776b4","html_url":"https://github.com/natar-io/PapARt","commit_stats":null,"previous_names":[],"tags_count":8,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/natar-io%2FPapARt","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/natar-io%2FPapARt/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/natar-io%2FPapARt/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/natar-io%2FPapARt/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/natar-io","download_url":"https://codeload.github.com/natar-io/PapARt/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246700580,"owners_count":20819899,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ar","augmeted","camera","camera-systems","depth-camera","inria","processing","projector","reality"],"created_at":"2024-08-01T07:01:02.190Z","updated_at":"2025-04-01T19:31:59.816Z","avatar_url":"https://github.com/natar-io.png","language":"Java","readme":"## PapARt Library\n\nPapARt is a software development kit (SDK) that enables the creation of interactive projection mapping.\nIt is a long running project by Jeremy Laviole, created by Inria, Bordeaux University and the lastest updates\nare from CATIE and some personal time.\n\nIt comes from the augmented physical drawing tools created by Jeremy Laviole, which are documented in his PhD thesis (free to read).\n\n### Main features \n\n#### Unified rendering - Projection and SeeThrough \n\nIt is possible to switch easily between AR on top of video (SeeThrough) and \nAR using Projection just by changing a few lines of code. 
#### Object tracking

The native tracking in PapARt is based on ARToolkitPlus; nowadays it is possible to
use higher-quality marker tracking with ArUco.
There is built-in support for color detection in the RGB, HSV and CIE XYZ color spaces.
The latest examples use custom circular tracking of colored stickers, which provides
position and orientation.

#### Arm, hand and finger tracking using depth cameras

Many depth cameras are supported: Kinect, Orbbec Astra, and Intel RealSense depth cameras (older models).
On these depth images we provide two kinds of object detection: a simple one that detects and tracks
objects above a plane, and a hierarchical one that detects and tracks arms, hands and fingers.
The latter achieves high-quality results for finger tracking, but it is harder to tune
and requires more hardware resources.

#### Integrated UI kit

We use a fork of ControlP5 called Skatolo, which is updated to handle multiple "touch" events
instead of a single cursor and click. The elements detected and tracked can be used to activate
widgets: buttons, toggles, sliders.

#### API at millimeter scale

We follow the Processing APIs, but use millimeters instead of pixels.
The interesting consequence is that rendering can be adjusted to the
hardware capabilities and projector location.
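As a small illustration of the millimeter-scale API (adapted from the public examples; the stock A4 marker board is assumed to ship with the library), a PaperScreen declares its physical size and draws in millimeters:

```java
// A PaperScreen drawn in millimeters: 297 x 210 mm is an A4 sheet in landscape.
// Adapted from the public examples; the marker-board file is the stock A4 board.
import fr.inria.papart.procam.*;

public class MyDrawing extends PaperScreen {

  void settings() {
    setDrawingSize(297, 210);  // drawing surface size, in millimeters
    loadMarkerBoard(Papart.markerFolder + "A4-default.svg", 297, 210);
  }

  void drawOnPaper() {
    background(100);
    fill(200, 100, 20);
    rect(10, 10, 100, 30);     // a 100 x 30 mm rectangle, 10 mm from the corner
  }
}
```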
#### Open source ecosystem

PapARt is built on top of Processing, OpenCV, JavaCV and JavaCPP. Our calibration
boards use the SVG format and are created with Inkscape.

The latest updates rely on Redis, ArUco, and other open source projects.

This library is an outcome enabled by many open source communities.

#### Advanced examples

We include advanced examples of use:

* 3D rendering inside a secondary screen (in the initial PapARt article).
* Compatibility with Unity3D through Natar.
* 3D scanning using Gray codes.
* A teaching application for color blending.

## PapARt 1.6 - Back to monolith

After a two-year break, a new version comes to life. It integrates the Natar developments back in
and builds a new structure from the micro-service creation experience.

The main update is the support of a modern version of OpenCV (4.5.4), and of modern operating systems and machines.
This new support is made possible by the Processing community.

* Windows 10/11 support.
* Arch Linux support.
* macOS, M1 architecture.

Support for other architectures and devices should be possible, notably Raspberry Pi or Android.

### *New* - Hardware production

PapARt hardware from RealityTech will soon be distributed under a free license (Creative Commons).
The 3D models and sample calibrations for known hardware will be released.

The bill of materials will also be included, with projector, camera, screen support and
a recommended configuration.

Although RealityTech is no longer in operation, we can help with the creation of such devices;
for research or industrial projects with PapARt you can contact us at CATIE: j.laviole@catie.fr.

### New features

#### Update to Processing 4 and Java 17

Processing 4 is currently in beta; it brings support for Java 17.
We again support all major operating systems for this release.

#### Integration of Natar

Initially, Natar was the follow-up project of PapARt for larger projects.
Natar is a communication protocol for images based on Redis. It also supports
storing calibration files within Redis.

* There are now programs to load calibration files from PapARt into Redis, for use in
other languages and SDKs, notably Unity3D.
* PapARt can publish camera feeds to Natar: RGB, IR and depth videos.
* Natar video feeds can be loaded into PapARt.
* Natar pose estimators using ArUco can be used in PapARt.

The full support and update are in progress, and full tutorials are still to be created.

Updated support and a revival of Natar will be the goal of 1.7.
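Because Natar stores its data in Redis, any Redis client can peek at it. The sketch below only illustrates the idea, not Natar's actual API: the key names and layout are assumptions, and the real schema is defined by the Natar projects.

```java
// Illustrative only: reading an image published in Redis, in the spirit of Natar.
// The key names ("camera0", "camera0:width", ...) are hypothetical, not Natar's real schema.
import redis.clients.jedis.Jedis;

public class NatarPeek {
  public static void main(String[] args) {
    try (Jedis redis = new Jedis("localhost", 6379)) {
      // Image metadata stored alongside the raw pixel buffer (assumed layout).
      String width  = redis.get("camera0:width");
      String height = redis.get("camera0:height");
      byte[] pixels = redis.get("camera0".getBytes()); // raw frame bytes

      System.out.println("Frame " + width + "x" + height + ", "
          + (pixels == null ? 0 : pixels.length) + " bytes");
    }
  }
}
```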
## PapARt 1.4.2 - Christmas 2021 Release

### Everything is open source

The last bits are now open, as RealityTech stopped its AR activity two years ago.
Most notably, the calibration tools used to create the hardware are now
available and will be documented.

### 10 years of tabletop AR

The first public demonstrations took place in 2011 at the "Palais de la découverte" in Paris, a few months
before the first paper was published.

The first step was getting a working projection; then it snowballed:

- 2011: Rendering a 3D scene from another perspective into this projection.
- 2012: Shadows and lights added.
- 2012: Stereoscopic rendering, for stereo drawings.
- 2012-2013: Many tools were created to assist drawing (described in the PhD thesis).
- 2014-2015: The library got a better architecture, precision, performance and calibration.
- 2016: Touch precision was improved.
- 2017: The Natar idea; tests on marker detectors and camera precision.
- 2017: Natar experiments, Unity version.
- 2017: Inclusion of any Linux app with projection and touch.
- 2018: Use of colored dots instead of markers.

In 2019 it slowed down to a few customer projects, then stopped dead for two years.

### What is next?

- Projection-based AR is still cool.
- Holograms are still cool.
- Cameras get better every year, and depth cameras too.
- Projector sizes and prices are low enough for student projects and research.

All of the basics are there. I got quite sick of this project after 8 years on it.
Now new people come to projection-based AR and want to give it a go.

You will suffer from calibration issues until the guides are perfect, or new hardware is created and sold.

However, the tools offered by PapARt are broad enough to create a wide variety of experiences.
A few developer devices are out there (at least 4) in universities; if the projectors and cameras were not salvaged,
the new guides could come in handy.

Aside from the research projects, two commercial applications are in use, and a few more should be created soon.
This project comes back to life from demand in research, from students, and from retail use.
I want to give it a push, and maybe also resurrect the devices as a kit to download, or to buy pre-built and assemble.

## PapARt 1.4 - Release Candidate (July 2018)

This new release brings many new features. It is now easier to place a PaperScreen on the table with
the new TableScreen class.

The color tracking, and particularly the circular tracking, is quite robust and enables the creation of
physical interfaces with a high detection rate. There will be a complete tutorial on how to create
a mixed-reality interface with touch and circle tracking.

We are working to improve the current API, as it will be part of the coming Nectar platform. The main
motivation for Nectar is to push further the possibilities of SAR with PapARt. Rendering will no longer
be limited to Processing, thanks to the Unity3D Nectar plugin. The plugin is in an
internal test/development phase and is already quite promising.

[More on the example repository, 1.4rc branch.](https://github.com/poqudrof/Papart-examples/tree/1.4rc)

#### New hosting

The 1.4 version and development versions are hosted on [GitLab](https://forge.pole-aquinetic.net/RealityTech/PapARt). You can request access if you collaborate with RealityTech, or use RealityTech hardware platforms.

The 1.3 version, sister of 1.4, will be free and publicly available on GitHub.

## Versions 1.1 and 1.2 (January 2018)

The first 2018 releases are 1.1 and 1.2. There are two major updates:

* (1.1) Color tracking: the library enables color tracking. The system learns to recognize and track five colors, which can be used to activate buttons or identify objects.
* (1.2) Hand recognition and tracking is improved to segment the arm, hands and fingers. The API is in progress and will evolve. This version is distributed with RealityTech's hardware.

Other features:

* Easier to compile thanks to the release of JavaCV/JavaCPP 1.4.
* Support for Intel RealSense cameras (SR300 and F200).
* Support for Orbbec cameras (Astra S).
* JRubyArt support is being extended.
* Community and commercial support has moved from the wiki to the [forum](http://forum.rea.lity.tech).

## Version 1.0

The first big release is ready. If you want to try it out, download our precompiled version from the [example repository](https://github.com/poqudrof/Papart-examples).

## Examples

This repository is for the development of the library.
You may want to go to the **[PapARt-Examples repository](https://github.com/poqudrof/Papart-examples)** to see how to use it or to discover the features and demos.

## Features

[![](https://github.com/poqudrof/PapARt/blob/master/video_screenshot.png?raw=true)](https://youtu.be/bMwKVOuZ9EA)

PapARt enables the creation of Augmented Reality (AR) applications in [Processing](https://processing.org/).
Like most AR toolkits, it is vision-based and detects the world using color cameras.
In addition to this, PapARt also uses a depth camera.

We generally use pre-calibrated cameras (intrinsic parameters), and PapARt handles the extrinsic calibration: how the cameras are positioned relative to one another. It also provides simple but imprecise tools to create an intrinsic calibration.

It uses tracking libraries such as ARToolkit and OpenCV, and can be extended.
The strength of this library is the creation of interactive projection (also called spatial augmented reality in research).
In addition to cameras, PapARt calibrates the projector's extrinsics to create projector/camera systems, also called ProCams.

Interactivity is increased thanks to the object and hand tracking enabled by the depth camera, as sketched below.
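For instance (a sketch in the spirit of the public examples; the Touch and touchList API details may differ between versions), a PaperTouchScreen receives the fingers detected by the depth camera in paper coordinates:

```java
// Fingers detected by the depth camera, reported in paper coordinates (mm).
// In the spirit of the public examples; API details may differ between versions.
import fr.inria.papart.procam.*;
import fr.inria.papart.multitouch.*;

public class TouchDraw extends PaperTouchScreen {

  void settings() {
    setDrawingSize(297, 210);                       // A4 sheet, in millimeters
    loadMarkerBoard(Papart.markerFolder + "A4-default.svg", 297, 210);
  }

  void drawOnPaper() {
    background(0);
    noStroke();
    fill(0, 255, 0);
    for (Touch t : touchList) {                     // one entry per tracked finger
      ellipse(t.position.x, t.position.y, 10, 10);  // 10 mm dot under each finger
    }
  }
}
```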
More information about the research project: https://project.inria.fr/papart/

## Supported systems

PapARt is a large library and works with many different systems:

- webcams and professional cameras ([PointGrey](https://www.ptgrey.com/) cameras).
- depth cameras: [Kinect Xbox 360](https://github.com/OpenKinect/libfreenect), [Kinect Xbox One](https://github.com/OpenKinect/libfreenect2), Intel [RealSense](https://github.com/IntelRealSense/librealsense).
- projector/camera/depth-camera systems (the main purpose of the library).

## How to contribute

The open source release is new (end of August 2016); feel free to fork, star, and file issues against the sources.
You can contribute your examples to the [example repository](https://github.com/poqudrof/Papart-examples) to
share your creations with the community.

## Next steps

The distribution got better, and the next steps would be to create versions for **Android** and/or the **Raspberry Pi**.

### Copyright note

PapARt is open source software owned by Inria, Bordeaux University and CNRS, distributed
under the LGPL license.