Visual-inertial navigation: A concise review

G Huang - 2019 International Conference on Robotics and …, 2019 - ieeexplore.ieee.org
As inertial and visual sensors are becoming ubiquitous, visual-inertial navigation systems
(VINS) have prevailed in a wide range of applications from mobile augmented reality to …

Surround-view fisheye camera perception for automated driving: Overview, survey & challenges

VR Kumar, C Eising, C Witt… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Surround-view fisheye cameras are commonly used for near-field sensing in automated
driving. Four fisheye cameras on four sides of the vehicle are sufficient to cover 360° around …

Agilicious: Open-source and open-hardware agile quadrotor for vision-based flight

P Foehn, E Kaufmann, A Romero, R Penicka, S Sun… - Science Robotics, 2022 - science.org
Autonomous, agile quadrotor flight raises fundamental challenges for robotics research in
terms of perception, planning, learning, and control. A versatile and standardized platform is …

Kimera: From SLAM to spatial perception with 3D dynamic scene graphs

A Rosinol, A Violette, M Abate… - … Journal of Robotics …, 2021 - journals.sagepub.com
Humans are able to form a complex mental model of the environment they move in. This
mental model captures geometric and semantic aspects of the scene, describes the …

The TUM VI benchmark for evaluating visual-inertial odometry

D Schubert, T Goll, N Demmel… - 2018 IEEE/RSJ …, 2018 - ieeexplore.ieee.org
Visual odometry and SLAM methods have a large variety of applications in domains such as
augmented reality or robotics. Complementing vision sensors with inertial measurements …

Autonomous drone racing: A survey

D Hanover, A Loquercio, L Bauersfeld… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Over the last decade, the use of autonomous drone systems for surveying, search and
rescue, or last-mile delivery has increased exponentially. With the rise of these applications …

The newer college dataset: Handheld lidar, inertial and vision with ground truth

M Ramezani, Y Wang, M Camurri… - 2020 IEEE/RSJ …, 2020 - ieeexplore.ieee.org
In this paper, we present a large dataset with a variety of mobile mapping sensors collected
using a handheld device carried at typical walking speeds for nearly 2.2 km around New …

Event-aided direct sparse odometry

J Hidalgo-Carrió, G Gallego… - Proceedings of the …, 2022 - openaccess.thecvf.com
We introduce EDS, a direct monocular visual odometry method using events and frames. Our
algorithm leverages the event generation model to track the camera motion in the blind time …

Denoising IMU gyroscopes with deep learning for open-loop attitude estimation

M Brossard, S Bonnabel… - IEEE Robotics and …, 2020 - ieeexplore.ieee.org
This article proposes a learning method for denoising gyroscopes of Inertial Measurement
Units (IMUs) using ground truth data, and estimating in real time the orientation (attitude) of a …

IC-GVINS: A robust, real-time, INS-centric GNSS-visual-inertial navigation system

X Niu, H Tang, T Zhang, J Fan… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org
Visual navigation systems are susceptible to complex environments, while inertial
navigation systems (INS) are not affected by external factors. Hence, we present IC-GVINS …