Event-based vision: A survey

G Gallego, T Delbrück, G Orchard… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead
of capturing images at a fixed rate, they asynchronously measure per-pixel brightness …
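The snippet above summarizes the standard event-camera output model: each pixel independently emits an event when its log-brightness change crosses a contrast threshold. A minimal sketch of that model (names and threshold value are illustrative, not taken from the survey):

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp in seconds (microsecond resolution in hardware)
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def maybe_emit(log_I_now, log_I_ref, x, y, t, C=0.2):
    """Emit an event if the log-brightness change at pixel (x, y) since the
    last event there reaches the contrast threshold C; otherwise emit nothing.
    """
    dL = log_I_now - log_I_ref
    if abs(dL) >= C:
        return Event(x, y, t, +1 if dL > 0 else -1)
    return None
```

This is why the sensor is asynchronous: events are produced per pixel, whenever the threshold is crossed, rather than at a global frame rate.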

Event-based simultaneous localization and mapping: A comprehensive survey

K Huang, S Zhang, J Zhang, D Tao - arXiv preprint arXiv:2304.09793, 2023 - arxiv.org
In recent decades, visual simultaneous localization and mapping (vSLAM) has gained
significant interest in both academia and industry. It estimates camera motion and …

Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age

C Cadena, L Carlone, H Carrillo, Y Latif… - IEEE Transactions …, 2016 - ieeexplore.ieee.org
Simultaneous localization and mapping (SLAM) consists in the concurrent construction of a
model of the environment (the map), and the estimation of the state of the robot moving …

Ultimate SLAM? Combining events, images, and IMU for robust visual SLAM in HDR and high-speed scenarios

AR Vidal, H Rebecq, T Horstschaefer… - IEEE Robotics and …, 2018 - ieeexplore.ieee.org
Event cameras are bioinspired vision sensors that output pixel-level brightness changes
instead of standard intensity frames. These cameras do not suffer from motion blur and have …

The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM

E Mueggler, H Rebecq, G Gallego… - … Journal of Robotics …, 2017 - journals.sagepub.com
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS),
incorporate a conventional global-shutter camera and an event-based sensor in the same …

Event-based stereo visual odometry

Y Zhou, G Gallego, S Shen - IEEE Transactions on Robotics, 2021 - ieeexplore.ieee.org
Event-based cameras are bioinspired vision sensors whose pixels work independently from
each other and respond asynchronously to brightness changes, with microsecond …

A unifying contrast maximization framework for event cameras, with applications to motion, depth, and optical flow estimation

G Gallego, H Rebecq… - Proceedings of the IEEE …, 2018 - openaccess.thecvf.com
We present a unifying framework to solve several computer vision problems with event
cameras: motion, depth and optical flow estimation. The main idea of our framework is to find …
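The idea sketched in this snippet can be illustrated for the optical-flow case: warp events along a candidate motion, accumulate them into an image of warped events, and score the candidate by the image's contrast (e.g. its variance); the motion that best aligns events along point trajectories maximizes this score. A hedged sketch under those assumptions (function and parameter names are hypothetical, and a constant per-window flow is assumed for simplicity):

```python
import numpy as np

def contrast_of_warp(events, flow, resolution=(180, 240), t_ref=0.0):
    """Score a candidate constant optic flow (vx, vy) for a window of events.
    `events` is an (N, 4) array with rows (x, y, t, polarity).
    Returns the variance of the image of warped events (IWE): higher
    variance means the events align better under this motion hypothesis."""
    x, y, t, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    vx, vy = flow
    # Transport each event back to the reference time along the candidate flow.
    xw = np.round(x - (t - t_ref) * vx).astype(int)
    yw = np.round(y - (t - t_ref) * vy).astype(int)
    H, W = resolution
    keep = (xw >= 0) & (xw < W) & (yw >= 0) & (yw < H)
    iwe = np.zeros((H, W))
    # Accumulate event polarities into the image of warped events.
    np.add.at(iwe, (yw[keep], xw[keep]), p[keep])
    return iwe.var()
```

For example, events generated by an edge moving at 20 px/s collapse onto a single pixel when warped with the correct flow, yielding a higher contrast than any wrong flow; an outer optimizer over `flow` would then recover the motion.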

Real-time 3D reconstruction and 6-DoF tracking with an event camera

H Kim, S Leutenegger, AJ Davison - … 11-14, 2016, Proceedings, Part VI 14, 2016 - Springer
We propose a method which can perform real-time 3D reconstruction from a single hand-
held event camera with no additional sensing, and works in unstructured scenes of which it …

EVO: A geometric approach to event-based 6-DOF parallel tracking and mapping in real time

H Rebecq, T Horstschäfer, G Gallego… - IEEE Robotics and …, 2016 - ieeexplore.ieee.org
We present EVO, an event-based visual odometry algorithm. Our algorithm successfully
leverages the outstanding properties of event cameras to track fast camera motions while …

A survey on odometry for autonomous navigation systems

SAS Mohamed, MH Haghbayan, T Westerlund… - IEEE …, 2019 - ieeexplore.ieee.org
The development of a navigation system is one of the major challenges in building a fully
autonomous platform. Full autonomy requires a dependable navigation capability not only in …