https://github.com/tub-rip/EventStereoSurvey
Survey paper and tables about stereo reconstruction with event cameras
- Host: GitHub
- URL: https://github.com/tub-rip/EventStereoSurvey
- Owner: tub-rip
- License: other
- Created: 2025-05-29T14:12:39.000Z
- Default Branch: main
- Last Pushed: 2025-06-22T12:42:02.000Z
- Last Synced: 2025-06-22T13:37:08.470Z
- Topics: 3d-reconstruction, best-practices, datasets, depth-estimation, event-camera, slam, stereo, survey, visual-odometry
- Homepage:
- Size: 973 KB
- Stars: 5
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- event-vision-index - Survey on Event-based Stereo Depth Estimation (TPAMI 2025) (Depth Estimation / Stereo Depth Estimation)
README
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
# Event-based Stereo Depth Estimation: A Survey
Official repository for **Event-based Stereo Depth Estimation: A Survey**, by [Suman Ghosh](https://www.linkedin.com/in/suman-ghosh-a8762576/) and [Guillermo Gallego](http://www.guillermogallego.es), 2024.
[Paper](https://arxiv.org/pdf/2409.17680)
| [Table of Methods](https://docs.google.com/spreadsheets/d/1DfmVXdg3H9iaLpkXNm5ygB6ald9dK0ggO0rUDXEDTXE)
| [Table of Datasets](https://docs.google.com/spreadsheets/d/1DfmVXdg3H9iaLpkXNm5ygB6ald9dK0ggO0rUDXEDTXE/edit#gid=1539773438&range=A1)
If you use this work in your research, please cite it as follows:
```bibtex
@article{Ghosh24survey,
  author  = {Suman Ghosh and Guillermo Gallego},
  title   = {Event-based Stereo Depth Estimation: A Survey},
  journal = {(under review)},
  year    = {2024}
}
```
# Our works on Event-based 3D Reconstruction
## Stereo
* [DERD-Net: Learning Depth from Event-based Ray Densities (2025)](https://arxiv.org/pdf/2504.15863)
* [ES-PTAM: Event-based Stereo Parallel Tracking and Mapping (ECCVW 2024)](https://github.com/tub-rip/ES-PTAM)
* [Secrets of Event-Based Optical Flow, Depth and Ego-Motion Estimation by Contrast Maximization (TPAMI 2024). (Stereo in Sec. IV-E)](https://doi.org/10.1109/TPAMI.2024.3396116)
* [MC-EMVS: Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion (AISY 2022)](https://github.com/tub-rip/dvs_mcemvs)
* [ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras (TRO 2025)](https://github.com/NAIL-HNU/ESVO2)
* [ESVO: Event-based Stereo Visual Odometry (TRO 2021)](https://sites.google.com/view/esvo-project-page/home), including cleaned **Stereo Dataset**.
* [Semi-dense 3D Reconstruction with a Stereo Event Camera (ECCV 2018)](https://rpg.ifi.uzh.ch/ECCV18_stereo_davis.html), including **Stereo Dataset**.
## Monocular
* [Secrets of Event-Based Optical Flow, Depth and Ego-Motion Estimation by Contrast Maximization (TPAMI 2024). (Monocular in Sec. IV-D)](https://doi.org/10.1109/TPAMI.2024.3396116)
* [EDS: Event-aided Direct Sparse Odometry (CVPR 2022)](https://rpg.ifi.uzh.ch/eds.html), including **Dataset**.
* [ESL: Event-based Structured Light (3DV 2021)](https://rpg.ifi.uzh.ch/esl.html), including **Dataset**.
* [Focus Is All You Need: Loss Functions for Event-based Vision (CVPR 2019)](http://rpg.ifi.uzh.ch/docs/CVPR19_Gallego.pdf)
* [A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth and Optical Flow Estimation (CVPR 2018)](http://rpg.ifi.uzh.ch/docs/CVPR18_Gallego.pdf)
* [EVO: Event-based Visual Odometry (RAL 2017)](https://github.com/uzh-rpg/rpg_dvs_evo_open/)
* [EMVS: Event-based Multi-View Stereo (BMVC 2016, IJCV 2018)](https://github.com/uzh-rpg/rpg_emvs)
* [Low-Latency Visual Odometry using Event-based Feature Tracks (IROS 2016)](https://youtu.be/RDu5eldW8i8)
## Additional Resources on Event-based Vision
* [Research page (TU Berlin, RIP lab)](https://sites.google.com/view/guillermogallego/research/event-based-vision)
* [Course at TU Berlin](https://sites.google.com/view/guillermogallego/teaching/event-based-robot-vision)
* [Event-based Vision: A Survey](http://rpg.ifi.uzh.ch/docs/EventVisionSurvey.pdf)
* [List of Event-based Vision Resources](https://github.com/uzh-rpg/event-based_vision_resources)
## License
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg