https://github.com/tub-rip/ES-PTAM
Official implementation of ECCVW 2024 SLAM paper "ES-PTAM: Event-based Stereo Parallel Tracking and Mapping"
- Host: GitHub
- URL: https://github.com/tub-rip/ES-PTAM
- Owner: tub-rip
- Created: 2024-08-13T12:36:15.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-05-27T21:30:36.000Z (8 months ago)
- Last Synced: 2025-05-27T22:30:40.295Z (8 months ago)
- Topics: 3d-reconstruction, asynchronous-sensor, camera-tracking, depth-estimation, ego-motion-estimation, event-camera, multi-view-stereo, robotics, simultaneous-localization-and-mapping, stereo-vision
- Language: C++
- Homepage:
- Size: 17.5 MB
- Stars: 36
- Watchers: 4
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- event-vision-index - ES-PTAM: **E**vent-based **S**tereo **P**arallel **T**racking **A**nd **M**apping (ECCVW 2024) (Visual Odometry and SLAM / Arbitrary motion (6-DOF))
# ES-PTAM: Event-based Stereo Parallel Tracking and Mapping
Official repository for [**ES-PTAM: Event-based Stereo Parallel Tracking and Mapping**](http://doi.org/10.1007/978-3-031-92460-6_5), by [Suman Ghosh](https://www.linkedin.com/in/suman-ghosh-a8762576/), [Valentina Cavinato](https://ch.linkedin.com/in/valentina-cavinato) and [Guillermo Gallego](http://www.guillermogallego.es), published at the **European Conference on Computer Vision (ECCV) Workshops 2024**, Milan, Italy, and **demoed at ECCV 2024**.
[Paper](https://arxiv.org/pdf/2408.15605) | [Video](https://youtu.be/z7J3lZOYwKs) | [Poster](/docs/esptam_eccvw_2024_poster_v3.pdf)
:sparkles: It was presented as an **Oral Spotlight** at the [NeVi](https://sites.google.com/view/nevi2024/home-page) Workshop.\
:sparkles: A [live demo](https://x.com/MarcoCristani/status/1841388758299443596/photo/1) was also presented at **ECCV 2024**.
If you use this work in your research, please cite it as follows:
```bibtex
@InProceedings{Ghosh24eccvw,
author = {Suman Ghosh and Valentina Cavinato and Guillermo Gallego},
title = {{ES-PTAM}: Event-based Stereo Parallel Tracking and Mapping},
booktitle = {European Conference on Computer Vision (ECCV) Workshops},
year = {2024},
pages = {70--87},
doi = {10.1007/978-3-031-92460-6\_5}
}
```
## Data Processing Pipeline

### Input
* Events from two or more cameras
* Camera calibration (intrinsic, extrinsic) parameters
### Output
* Camera (i.e., sensor rig) poses
* Depth map
* Confidence map
* Point cloud
* Intermediate ray density maps / Disparity Space Images (DSI)
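As a rough illustration of the inputs and outputs listed above, they could be modeled with minimal C++ types like the following. These names and layouts are purely illustrative assumptions for this sketch, not types from the ES-PTAM codebase:

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <vector>

// One asynchronous event from an event camera (hypothetical layout).
struct Event {
    double   t;        // timestamp in seconds
    uint16_t x, y;     // pixel coordinates
    bool     polarity; // true = brightness increase, false = decrease
};

// Per-camera calibration: intrinsics plus camera-to-rig extrinsics.
struct CameraCalib {
    std::array<double, 9>  K;     // 3x3 intrinsic matrix, row-major
    std::array<double, 16> T_rig; // 4x4 extrinsic transform, row-major
};

// Pipeline input: event streams and calibrations for two or more cameras.
struct StereoInput {
    std::vector<std::vector<Event>> event_streams; // one stream per camera
    std::vector<CameraCalib>        calibs;        // one calib per camera
};

// Mapping output: depth and confidence maps derived from the ray-density
// DSI, stored as flat row-major buffers.
struct MappingOutput {
    std::vector<float> depth_map;      // per-pixel depth in meters
    std::vector<float> confidence_map; // per-pixel DSI confidence
    int width = 0, height = 0;
};
```

The point-cloud output can then be obtained by back-projecting each depth-map pixel through the corresponding camera model.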
## Code
* [Installation](docs/installation.md)
* [Running examples on different datasets](docs/examples.md)
* [Running live with DAVIS cameras](docs/live_demo.md)
* [Parameter tuning](docs/parameters.md)
## Results
The original ES-PTAM trajectories and GT poses for various sequences are available [here](trajectory_eval).
They have been evaluated using [this tool](https://github.com/uzh-rpg/rpg_trajectory_evaluation/tree/master).
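For intuition, the core of such a trajectory evaluation is the absolute trajectory error (ATE). A minimal sketch of its RMSE form is below, assuming the estimated and ground-truth trajectories are already time-associated and aligned in a common frame (the linked tool also performs the alignment step, which is omitted here):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Position of the sensor rig at one time-associated sample.
struct Position { double x, y, z; };

// Root-mean-square absolute trajectory error between an estimated
// trajectory and ground truth, both expressed in the same frame.
double ate_rmse(const std::vector<Position>& est,
                const std::vector<Position>& gt) {
    assert(est.size() == gt.size() && !est.empty());
    double sum_sq = 0.0;
    for (std::size_t i = 0; i < est.size(); ++i) {
        const double dx = est[i].x - gt[i].x;
        const double dy = est[i].y - gt[i].y;
        const double dz = est[i].z - gt[i].z;
        sum_sq += dx * dx + dy * dy + dz * dz; // squared position error
    }
    return std::sqrt(sum_sq / static_cast<double>(est.size()));
}
```

For example, an estimate uniformly offset from ground truth by (0.3, 0.4, 0) meters yields an ATE RMSE of 0.5 m.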
## License
The license is available [here](Software%20License%20Agreement_TUB_ES_PTAM_final.pdf).
## Related works
* **[Event-based Stereo Depth Estimation: A Survey](https://arxiv.org/pdf/2409.17680)**
* **[MC-EMVS: Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion (AISY 2022)](https://github.com/tub-rip/dvs_mcemvs)**
* [EVO: Event based Visual Odometry (RAL 2017)](https://github.com/uzh-rpg/rpg_dvs_evo_open/)
* [ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras (TRO 2025)](https://github.com/NAIL-HNU/ESVO2)
## Additional Resources on Event-based Vision
* [Event Collapse in Contrast Maximization Frameworks](https://github.com/tub-rip/event_collapse)
* [Motion-prior Contrast Maximization (ECCV 2024)](https://github.com/tub-rip/MotionPriorCMax)
* [CMax-SLAM (TRO 2024)](https://github.com/tub-rip/cmax_slam)
* [Research page (TU Berlin, RIP lab)](https://sites.google.com/view/guillermogallego/research/event-based-vision)
* [Course at TU Berlin](https://sites.google.com/view/guillermogallego/teaching/event-based-robot-vision)
* [Event-based Vision: A Survey](http://rpg.ifi.uzh.ch/docs/EventVisionSurvey.pdf)
* [List of Resources](https://github.com/uzh-rpg/event-based_vision_resources)