A review of multi-sensor fusion SLAM systems based on 3D LiDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
The ability of intelligent unmanned platforms to achieve autonomous navigation and
positioning in a large-scale environment is increasingly in demand, in which …

An overview on visual SLAM: From tradition to semantic

W Chen, G Shang, A Ji, C Zhou, X Wang, C Xu, Z Li… - Remote Sensing, 2022 - mdpi.com
Visual SLAM (VSLAM) has been developing rapidly due to its advantages of low-cost
sensors, easy fusion with other sensors, and richer environmental information. Traditional …

Robot parkour learning

Z Zhuang, Z Fu, J Wang, C Atkeson… - arXiv preprint arXiv …, 2023 - arxiv.org
Parkour is a grand challenge for legged locomotion that requires robots to overcome various
obstacles rapidly in complex environments. Existing methods can generate either diverse …

Present and future of SLAM in extreme environments: The DARPA SubT challenge

K Ebadi, L Bernreiter, H Biggie, G Catt… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
This article surveys recent progress and discusses future opportunities for simultaneous
localization and mapping (SLAM) in extreme underground environments. SLAM in …

LAMP 2.0: A robust multi-robot SLAM system for operation in challenging large-scale underground environments

Y Chang, K Ebadi, CE Denniston… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org
Search and rescue with a team of heterogeneous mobile robots in unknown and large-scale
underground environments requires high-precision localization and mapping. This crucial …

Super Odometry: IMU-centric LiDAR-visual-inertial estimator for challenging environments

S Zhao, H Zhang, P Wang, L Nogueira… - 2021 IEEE/RSJ …, 2021 - ieeexplore.ieee.org
We propose Super Odometry, a high-precision multi-modal sensor fusion framework,
providing a simple but effective way to fuse multiple sensors such as LiDAR, camera, and …

[PDF][PDF] CERBERUS: Autonomous legged and aerial robotic exploration in the tunnel and urban circuits of the DARPA Subterranean Challenge

M Tranzatto, F Mascarich, L Bernreiter… - arXiv preprint arXiv …, 2022 - academia.edu
Autonomous exploration of subterranean environments constitutes a major frontier for
robotic systems as underground settings present key challenges that can render robot …

Single-model and any-modality for video object tracking

Z Wu, J Zheng, X Ren, FA Vasluianu… - Proceedings of the …, 2024 - openaccess.thecvf.com
In the realm of video object tracking, auxiliary modalities such as depth, thermal, or event data
have emerged as valuable assets to complement RGB trackers. In practice, most existing …

FAST-LIVO: Fast and tightly-coupled sparse-direct LiDAR-inertial-visual odometry

C Zheng, Q Zhu, W Xu, X Liu, Q Guo… - 2022 IEEE/RSJ …, 2022 - ieeexplore.ieee.org
To achieve accurate and robust pose estimation in the Simultaneous Localization and Mapping
(SLAM) task, multi-sensor fusion has proven to be an effective solution and thus provides great …

VILENS: Visual, inertial, LiDAR, and leg odometry for all-terrain legged robots

D Wisth, M Camurri, M Fallon - IEEE Transactions on Robotics, 2022 - ieeexplore.ieee.org
We present visual inertial lidar legged navigation system (VILENS), an odometry system for
legged robots based on factor graphs. The key novelty is the tight fusion of four different …