# awesome-event-based-slam
Paper Survey for Event-based SLAM
https://github.com/kwanwaipang/awesome-event-based-slam
## Event-based Pose Estimation
- METS: Motion-Encoded Time-Surface for Event-Based High-Speed Pose Tracking
- Event-based mosaicing bundle adjustment - [code](https://github.com/tub-rip/emba)
- Progressive spatio-temporal alignment for efficient event-based motion estimation
- Spatiotemporal registration for event-based visual odometry
- Deep Visual Odometry for Stereo Event Cameras - [code](https://github.com/NAIL-HNU/SDEVO)
- Hybrid-EVIO: Event-Based Visual-Inertial Odometry with Hybrid Visual Front-End
- VIO-GO: Optimizing Event-Based SLAM Parameters for Robust Performance in High Dynamic Range Scenarios
- Event-based Stereo Visual-Inertial Odometry with Voxel Map
- GRE-SLAM: 6-DoF Pure Event-Based SLAM with Semi-Dense Depth Recovery Assisted Bundle Adjustment
- Radar and Event Camera Fusion for Agile Robot Ego-Motion Estimation
- SuperEvent: Cross-Modal Learning of Event-based Keypoint Detection
- SuperEIO: Self-Supervised Event Feature Learning for Event Inertial Odometry - [code](https://github.com/arclab-hku/SuperEIO), [website](https://arclab-hku.github.io/SuperEIO/)
- Event-Frame-Inertial Odometry Using Point and Line Features Based on Coarse-to-Fine Motion Compensation - [code](https://github.com/choibottle/C2F-EFIO)
- EVLoc: Event-based Visual Localization in LiDAR Maps via Event-Depth Registration
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras - [code](https://github.com/NAIL-HNU/ESVO2)
- Stereo Event-Based Visual-Inertial Odometry
- DEIO: Deep Event Inertial Odometry - [code](https://github.com/arclab-hku/DEIO), [website](https://kwanwaipang.github.io/DEIO/)
- Deep Visual Odometry with Events and Frames (RAMP-VO) - [code](https://github.com/uzh-rpg/rampvo)
- EROAM: Event-based Camera Rotational Odometry and Mapping in Real-time
- Deep event visual odometry - [code](https://github.com/tum-vision/DEVO)
- DH-PTAM: A deep hybrid stereo events-frames parallel tracking and mapping system - [code](https://github.com/AbanobSoliman/DH-PTAM) (SuperPoint + [stereo_ptam](https://github.com/uoip/stereo_ptam))
- IMU-aided event-based stereo visual odometry - [code](https://github.com/NAIL-HNU/ESVIO_AA)
- FAST-LIEO: Fast and Real-Time LiDAR-Inertial-Event-Visual Odometry
- Continuous Gaussian Process Pre-Optimization for Asynchronous Event-Inertial Odometry
- EVIT: Event-based visual-inertial tracking in semi-dense maps using windowed nonlinear optimization - [code](https://github.com/MobilePerceptionLab/Canny-EVT)
- Cross-modal semi-dense 6-DoF tracking of an event camera in challenging conditions - [code](https://github.com/MobilePerceptionLab/Canny-EVT)
- Asynchronous Event-Inertial Odometry using a Unified Gaussian Process Regression Framework
- AsynEIO: Asynchronous Monocular Event-Inertial Odometry Using Gaussian Process Regression
- Efficient Continuous-Time Ego-Motion Estimation for Asynchronous Event-based Data Associations
- ES-PTAM: Event-based stereo parallel tracking and mapping - [code](https://github.com/tub-rip/ES-PTAM)
- Event-based Photometric Bundle Adjustment - [code](https://github.com/tub-rip/epba)
- CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Maximization - [code](https://github.com/tub-rip/cmax_slam)
- Cubic B-Spline-Based Feature Tracking for Visual-Inertial Odometry with Event Camera
- EVI-SAM: Robust, Real-Time, Tightly-Coupled Event-Visual-Inertial State Estimation and 3D Dense Mapping - [website](https://kwanwaipang.github.io/EVI-SAM/)
- Monocular Event-Inertial Odometry with Adaptive Decay-based Time Surface and Polarity-aware Tracking
- T-ESVO: Improved Event-Based Stereo Visual Odometry via Adaptive Time-Surface and Truncated Signed Distance Function
- Fusing event-based camera and radar for SLAM using spiking neural networks with continual STDP learning
- Event- and Frame-based Visual-Inertial Odometry with Adaptive Filtering based on 8-DOF Warping Uncertainty
- An Event-based Stereo 3D Mapping and Tracking Pipeline for Autonomous Vehicles
- MC-VEO: A visual-event odometry with accurate 6-DoF motion compensation - [code](https://github.com/huangfeng95/mc-veo-buildconf)
- Event-based stereo visual odometry with native temporal resolution via continuous-time Gaussian process regression
- ESVIO: Event-based stereo visual-inertial odometry
- ESVIO: Event-based Stereo Visual Inertial Odometry - [code](https://github.com/arclab-hku/ESVIO), [website](https://kwanwaipang.github.io/ESVIO/)
- PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features - [website](https://kwanwaipang.github.io/PL-EVIO/)
- Event-IMU Fusion Strategies for Faster-Than-IMU Estimation Throughput
- Event-based line SLAM in real-time
- DEVO: Depth-Event Camera Visual Odometry in Challenging Conditions
- Exploring Event Camera-based Odometry for Planetary Robots
- A tightly-coupled event-inertial odometry using exponential decay and linear preintegrated measurements
- Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization - [website](https://kwanwaipang.github.io/Mono-EIO/)
- Contrast maximization-based feature tracking for visual odometry with an event camera
- Visual Odometry with an Event Camera Using Continuous Ray Warping and Volumetric Contrast Maximization
- Asynchronous optimisation for event-based visual odometry
- Real-time rotational motion estimation with contrast maximization over globally aligned events
- Feature-based Event Stereo Visual Odometry
- Event-based stereo visual odometry - [code](https://github.com/HKUST-Aerial-Robotics/ESVO)
- Globally-optimal contrast maximisation for event cameras
- Globally-optimal event camera motion estimation
- Globally optimal contrast maximisation for event-based motion estimation
- IDOL: A framework for IMU-DVS odometry using lines
- Neuromorphic visual odometry system for intelligent vehicle application with bio-inspired vision sensor
- Event-based, direct camera tracking from a photometric 3D map using nonlinear optimization - [code](https://github.com/uzh-rpg/direct_event_camera_tracker)
- Ultimate SLAM? Combining events, images, and IMU for robust visual SLAM in HDR and high-speed scenarios - [code](https://github.com/uzh-rpg/rpg_ultimate_slam_open)
- Continuous-time visual-inertial odometry for event cameras
- Event-based, 6-DOF camera tracking from photometric depth maps
- Real-time panoramic tracking for event cameras
- Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization
- Event-based visual inertial odometry - [code](https://github.com/daniilidis-group/event_feature_tracking)
- Accurate angular velocity estimation with an event camera
- EVO: A geometric approach to event-based 6-DOF parallel tracking and mapping in real time - [code](https://github.com/uzh-rpg/rpg_dvs_evo_open)
- Fast localization and tracking using event sensors
- Real-time 3D reconstruction and 6-DoF tracking with an event camera
- Low-latency visual odometry using event-based feature tracks
- Event-based camera pose tracking using a generative event model
- Event-based, 6-DOF pose tracking for high-speed maneuvers
- Event-based 3D SLAM with a depth-augmented dynamic vision sensor
- Low-latency event-based visual odometry
- Simultaneous localization and mapping for event-based vision systems
- Event-based particle filtering for robot self-localization
- Simultaneous mosaicing and tracking with an event camera
- Event-based Spinning Object SLAM
- Unsupervised event-based learning of optical flow, depth, and egomotion
- Enabling High-Frequency Cross-Modality Visual Positioning Service for Accurate Drone Landing - [code](https://github.com/ev-pose/ev-pose.github.io), [website](https://ev-pose.github.io/)
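Several entries above (METS, ESVO2, T-ESVO, the adaptive-decay EIO works) build their tracking front-ends on time surfaces. A minimal NumPy sketch of an exponentially decayed time surface; the resolution, decay constant `tau`, and toy events are illustrative, not taken from any of the listed papers:

```python
import numpy as np

def time_surface(events, t_now, shape=(180, 240), tau=0.05):
    """Exponentially decayed time surface. Resolution and decay constant
    are illustrative defaults, not values from any specific paper."""
    last_t = np.full(shape, -np.inf)       # latest event timestamp per pixel
    for t, x, y, p in events:              # event = (timestamp, x, y, polarity)
        last_t[int(y), int(x)] = t
    # recent events -> values near 1; stale or empty pixels -> 0
    return np.exp((last_t - t_now) / tau)

# toy events: two at the same pixel (the newer one wins), one elsewhere
events = [(0.00, 10, 5, 1), (0.04, 10, 5, -1), (0.05, 20, 7, 1)]
s = time_surface(events, t_now=0.05)
```

The decayed surface gives gradient-based trackers a smooth, edge-like image to align against, which is why it recurs across so many of the systems listed.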
## Event-based Depth Estimation
- High-Rate Monocular Depth Estimation via Cross Frame-Rate Collaboration of Frames and Events
- Semi-dense 3D reconstruction with a stereo event camera - [dataset](https://rpg.ifi.uzh.ch/ECCV18_stereo_davis.html)
- Unsupervised event-based learning of optical flow, depth, and egomotion
- Active Event-based Stereo Vision - [code](https://github.com/jianing-li/active_event_based_stereo)
- DERD-Net: Learning Depth from Event-based Ray Densities - [code](https://github.com/tub-rip/DERD-Net)
- Multi-Modal Fusion of Event and RGB for Monocular Depth Estimation Using a Unified Transformer-based Architecture
- Event-based Monocular Depth Estimation with Recurrent Transformers
- Depth From Asymmetric Frame-Event Stereo: A Divide-and-Conquer Approach
- Secrets of Event-based Optical Flow, Depth and Ego-motion Estimation by Contrast Maximization - [code](https://github.com/tub-rip/event_based_optical_flow)
- Low-latency monocular depth estimation using event timing on neuromorphic hardware
- Stereo Depth from Events Cameras: Concentrate and Focus on the Future
- Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion - [code](https://github.com/tub-rip/dvs_mcemvs)
- Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion
- EOMVS: Event-Based Omnidirectional Multi-View Stereo
- Combining Events and Frames using Recurrent Asynchronous Multimodal Networks for Monocular Depth Prediction - [code](https://github.com/uzh-rpg/rpg_ramnet)
- Deep Event Stereo Leveraged by Event-to-Image Translation
- Learning Monocular Dense Depth from Events - [code](https://github.com/uzh-rpg/rpg_e2depth)
- Learning an event sequence embedding for dense event-based deep stereo
- Learning event-based height from plane and parallax
- A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth and Optical Flow Estimation - [supplementary material](https://www.ifi.uzh.ch/dam/jcr:a22071c9-b284-43c6-8f71-6433627b2db2/CVPR18_Gallego.pdf)
- EMVS: Event-based multi-view stereo—3D reconstruction with an event camera in real-time - [code](https://github.com/uzh-rpg/rpg_emvs)
- Neuromorphic event-based generalized time-based stereovision
- Depth Estimation from Moving Stereo Event Cameras without Motion Cues
- EventUPS: Uncalibrated Photometric Stereo Using an Event Camera
- Spiking Depth: Depth Estimation from Sparse Events with Spiking Neural Networks
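Many of the learning-based depth methods above feed the network a polarity-signed voxel grid with bilinear weighting along time rather than raw events. A minimal NumPy sketch; the bin count, resolution, toy events, and normalization details are illustrative and vary per paper:

```python
import numpy as np

def event_voxel_grid(events, num_bins=5, shape=(180, 240)):
    """Polarity-signed voxel grid with bilinear weighting along the time
    axis (a sketch of a common learned-method input representation)."""
    ts = np.array([e[0] for e in events], dtype=np.float64)
    grid = np.zeros((num_bins,) + shape)
    # map timestamps to fractional bin coordinates in [0, num_bins - 1]
    t_norm = (ts - ts[0]) / max(ts[-1] - ts[0], 1e-9) * (num_bins - 1)
    for (t, x, y, p), tn in zip(events, t_norm):
        pol = 1.0 if p > 0 else -1.0
        b = int(np.floor(tn))
        w = tn - b                          # weight toward the next bin
        grid[b, int(y), int(x)] += pol * (1.0 - w)
        if b + 1 < num_bins:
            grid[b + 1, int(y), int(x)] += pol * w
    return grid

grid = event_voxel_grid([(0.0, 1, 1, 1), (0.1, 2, 2, 1), (0.2, 3, 3, -1)])
```

Splitting each event's polarity across its two neighboring time bins keeps the representation differentiable in time, which dense CNN/transformer backbones exploit.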
## Event-based Feature Detection and Tracking
- Event-camera enhanced RatSLAM: A Multi-View Projection-Based Loop Closure Detection Method
- Data-driven Feature Tracking for Event Cameras with and without Frames - [code](https://github.com/uzh-rpg/deep_ev_tracker)
- Data-driven feature tracking for event cameras - [code](https://github.com/uzh-rpg/deep_ev_tracker)
- HASTE: Multi-hypothesis asynchronous speeded-up tracking of events
- EKLT: Asynchronous photometric feature tracking using events and frames - [code](https://github.com/uzh-rpg/rpg_eklt), [feature tracking analysis](https://github.com/uzh-rpg/rpg_feature_tracking_analysis)
- Asynchronous corner detection and tracking for event cameras in real time
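For intuition about what the trackers above improve on, here is a toy baseline: accumulate events into count images and track a patch by brute-force translational SSD search. All sizes and the test pattern are made up for illustration; the listed methods (asynchronous, multi-hypothesis, learned) are far more refined:

```python
import numpy as np

def accumulate(events, shape=(60, 60)):
    """Accumulate events into a per-pixel count image."""
    img = np.zeros(shape)
    for _, x, y, _ in events:              # event = (timestamp, x, y, polarity)
        img[int(y), int(x)] += 1
    return img

def track_patch(prev_img, cur_img, center, radius=8, search=4):
    """Brute-force translational SSD search between two event frames."""
    cy, cx = center
    tpl = prev_img[cy - radius:cy + radius, cx - radius:cx + radius]
    best_err, best_shift = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = cur_img[cy + dy - radius:cy + dy + radius,
                           cx + dx - radius:cx + dx + radius]
            err = np.sum((cand - tpl) ** 2)
            if err < best_err:
                best_err, best_shift = err, (dy, dx)
    return (cy + best_shift[0], cx + best_shift[1])

# toy pattern, then the same pattern shifted down 2 pixels and right 1
prev = accumulate([(0, 30, 30, 1), (0, 31, 30, 1), (0, 31, 30, 1),
                   (0, 30, 31, 1), (0, 29, 32, 1)])
cur = accumulate([(0, 31, 32, 1), (0, 32, 32, 1), (0, 32, 32, 1),
                  (0, 31, 33, 1), (0, 30, 34, 1)])
tracked = track_patch(prev, cur, (30, 30))  # -> (32, 31)
```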
## Event-based 3DGS or NeRF
- EventNeRF: Neural radiance fields from a single colour event camera - [website](https://4dqv.mpi-inf.mpg.de/EventNeRF/)
- EvDNeRF: Reconstructing event data with dynamic neural radiance fields - [code](https://github.com/anish-bhattacharya/EvDNeRF)
- E-NeRF: Neural radiance fields from a moving event camera
- Ev-NeRF: Event based neural radiance field
- Multimodal Neural Surface Reconstruction: Recovering the Geometry and Appearance of 3D Scenes from Events and Grayscale Images
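The methods above supervise rendering through the event generation model: an ideal event fires whenever log intensity changes by a contrast threshold C. A generic sketch of such a residual between two rendered views; the threshold, the L2 penalty, and the toy inputs are illustrative, as each paper defines its own loss:

```python
import numpy as np

def event_render_residual(logI_t0, logI_t1, event_sum, C=0.25):
    """Compare a rendered log-brightness change against accumulated events
    under the ideal generation model (one event per C of log-intensity
    change). C and the L2 penalty are illustrative assumptions."""
    predicted = logI_t1 - logI_t0          # from two rendered views
    measured = C * event_sum               # polarity-summed event counts
    return np.mean((predicted - measured) ** 2)

counts = np.array([[2.0, -1.0], [0.0, 3.0]])   # made-up event counts
logI0 = np.zeros((2, 2))
r_exact = event_render_residual(logI0, 0.25 * counts, counts)    # -> 0.0
r_off = event_render_residual(logI0, 0.25 * counts + 0.1, counts)
```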
## Event Dataset for SLAM Benchmarking
- Event-aided Direct Sparse Odometry - [website](https://rpg.ifi.uzh.ch/eds.html)
- MTevent: A Multi-Task Event Camera Dataset for 6D Pose Estimation and Moving Object Detection
- CEAR: Comprehensive Event Camera Dataset for Rapid Perception of Agile Quadruped Robots - [website](https://daroslab.github.io/cear/)
- ECMD: An Event-Centric Multisensory Driving Dataset for SLAM - [website](https://arclab-hku.github.io/ecmd/)
- Stereo visual localization dataset featuring event cameras - [website](https://bitbucket.org/unizg-fer-lamor/event-dataset/) (not released)
- MA-VIED: A Multisensor Automotive Visual Inertial Event Dataset - [code](https://github.com/isarlab-department-engineering/MA-VIED)
- M3ED: Multi-Robot, Multi-Sensor, Multi-Environment Event Dataset - [code](https://github.com/daniilidis-group/m3ed), [website](https://m3ed.io/)
- ViViD++: Vision for Visibility Dataset - [website](https://visibilitydataset.github.io/)
- FusionPortable: A Multi-Sensor Campus-Scene Dataset for Evaluation of Localization and Mapping Accuracy on Diverse Platforms (not released)
- VECtor: A Versatile Event-Centric Benchmark for Multi-Sensor SLAM - [website](https://star-datasets.github.io/vector/), [rosbag format](https://github.com/arclab-hku/Event_based_VO-VIO-SLAM#Modified-vector-dataset)
- M2DGR: A multi-sensor and multi-scenario SLAM dataset for ground robots - [code](https://github.com/SJTU-ViSYS/M2DGR) (the event data is just noise)
- TUM-VIE: The TUM stereo visual-inertial event dataset - [website](https://go.vision.in.tum.de/tumvie)
- An event-based vision dataset for visual navigation tasks in agricultural environments (AGRI-EBV-AUTUMN) - [website](https://ieee-dataport.org/open-access/agri-ebv-autumn)
- The GRIFFIN perception dataset: Bridging the gap between flapping-wing flight and robotic perception - [website](http://grvc.us.es/eye-bird-dataset)
- DSEC: A stereo event camera dataset for driving scenarios - [website](https://dsec.ifi.uzh.ch/), [rosbag format](https://github.com/arclab-hku/Event_based_VO-VIO-SLAM#Modified-dsec-dataset)
- A large scale event-based detection dataset for automotive - [website](https://www.prophesee.ai/2020/01/24/prophesee-gen1-automotive-detection-dataset/) (not released)
- Event-based visual place recognition with ensembles of temporal windows (Brisbane-Event-VPR) - [code](https://github.com/Tobias-Fischer/ensemble-event-vpr), [website](https://zenodo.org/records/4302805)
- DDD20 end-to-end event camera driving dataset: Fusing frames and events with deep learning for improved steering prediction - [website](https://sites.google.com/view/davis-driving-dataset-2020/home)
- High speed and high dynamic range video with an event camera (E2VID) - [website](https://rpg.ifi.uzh.ch/E2VID.html)
- CED: Color event camera dataset - [website](https://rpg.ifi.uzh.ch/CED.html)
- Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset - [code](https://github.com/uzh-rpg/uzh_fpv_open), [website](https://fpv.ifi.uzh.ch/)
- The multivehicle stereo event camera dataset: An event camera dataset for 3D perception (MVSEC) - [website](https://daniilidis-group.github.io/mvsec/)
- DDD17: End-to-end DAVIS driving dataset - [website](https://docs.google.com/document/d/1HM0CSmjO8nOpUeTvmPjopcBcVCk7KXvLUuiZFS6TWSg/pub)
- The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM (davis240c) - [website](https://rpg.ifi.uzh.ch/davis_data.html)
- EvtSlowTV: A Large and Diverse Dataset for Event-Based Depth Estimation
- SPECTRA: Synchronized Stereo Event-Camera Driving Dataset for Diverse Perception Tasks
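Several of the datasets above (e.g. davis240c) distribute events as plain text, one `timestamp x y polarity` tuple per line. A minimal NumPy loader; the sample data here is made up:

```python
import io
import numpy as np

def load_events_txt(source):
    """Load plain-text events, one `timestamp x y polarity` tuple per line
    (the format used by e.g. the davis240c Event Camera Dataset)."""
    data = np.loadtxt(source, ndmin=2)
    ts = data[:, 0]
    xs = data[:, 1].astype(int)
    ys = data[:, 2].astype(int)
    ps = data[:, 3].astype(int)
    return ts, xs, ys, ps

# made-up sample in the same format
sample = io.StringIO("0.000100 33 39 1\n0.000150 120 70 0\n")
ts, xs, ys, ps = load_events_txt(sample)  # 2 events
```

For a real file, pass its path instead of the `StringIO` sample; `ndmin=2` keeps single-event files from collapsing to a 1-D array.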
## Other Resources
- Course: Event-based Robot Vision
- HKU-Dataset for Event-based VO/VIO/SLAM
- Paper list: Event-based Contrast Maximization
- Chapter 10: Event-based SLAM
- Temporal and Rotational Calibration for Event-Centric Multi-Sensor Systems - [code](https://github.com/NAIL-HNU/EvMultiCalib)
- Event-based simultaneous localization and mapping: A comprehensive survey
- How to Calibrate Your Event Camera - [code](https://github.com/uzh-rpg/e2calib)
- Video to Events: Recycling Video Datasets for Event Cameras - [code](https://github.com/uzh-rpg/rpg_vid2e), [test](https://kwanwaipang.github.io/File/Blogs/Poster/esim.html)
- Event-based vision: A survey
- ESIM: An open event camera simulator - [code](https://github.com/uzh-rpg/rpg_esim)
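ESIM's core idea (also behind vid2e) is that between two frames, each pixel emits an event every time its log intensity crosses a multiple of the contrast threshold, with timestamps interpolated linearly. An idealized sketch; the threshold and toy inputs are illustrative, and the real simulator adds noise models and adaptive frame sampling:

```python
import numpy as np

def frames_to_events(img0, img1, t0, t1, C=0.2, eps=1e-3):
    """Idealized ESIM-style event generation between two intensity frames:
    a pixel emits an event each time its log intensity crosses a multiple
    of the contrast threshold C, with linearly interpolated timestamps."""
    L0, L1 = np.log(img0 + eps), np.log(img1 + eps)
    events = []
    for y in range(img0.shape[0]):
        for x in range(img0.shape[1]):
            dL = L1[y, x] - L0[y, x]
            n = int(abs(dL) // C)          # number of threshold crossings
            pol = 1 if dL > 0 else -1
            for k in range(1, n + 1):
                t = t0 + (t1 - t0) * (k * C) / abs(dL)
                events.append((t, x, y, pol))
    events.sort()
    return events

# one pixel brightening by ~0.5 in log space -> two positive events
evts = frames_to_events(np.array([[1.0]]), np.array([[np.exp(0.5)]]), 0.0, 1.0)
```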