A review of multi-sensor fusion SLAM systems based on 3D LIDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
Intelligent unmanned platforms are increasingly required to achieve autonomous navigation and
positioning in large-scale environments, in which …

A review of visual SLAM methods for autonomous driving vehicles

J Cheng, L Zhang, Q Chen, X Hu, J Cai - Engineering Applications of …, 2022 - Elsevier
Autonomous driving vehicles require a precise localization and mapping solution in
different driving environments. In this context, Simultaneous Localization and Mapping …

FAST-LIO2: Fast direct LiDAR-inertial odometry

W Xu, Y Cai, D He, J Lin, F Zhang - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
This article presents FAST-LIO2: a fast, robust, and versatile LiDAR-inertial odometry
framework. Building on a highly efficient tightly coupled iterated Kalman filter, FAST-LIO2 …
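The snippet's core ingredient, an iterated (extended) Kalman filter measurement update, can be illustrated with a small generic sketch. This is not FAST-LIO2's implementation; the measurement function h, its Jacobian H, and the noise covariance R are assumed, user-supplied inputs.

    import numpy as np

    def iekf_update(x, P, z, h, H, R, iters=5, tol=1e-6):
        # Iterated EKF update: relinearize the measurement model around the
        # running estimate x_i instead of only around the prior x.
        x_i = x.copy()
        for _ in range(iters):
            H_i = H(x_i)                                        # Jacobian at current iterate
            K = P @ H_i.T @ np.linalg.inv(H_i @ P @ H_i.T + R)  # Kalman gain
            x_new = x + K @ (z - h(x_i) - H_i @ (x - x_i))      # Gauss-Newton style iterate
            if np.linalg.norm(x_new - x_i) < tol:
                x_i = x_new
                break
            x_i = x_new
        H_i = H(x_i)
        K = P @ H_i.T @ np.linalg.inv(H_i @ P @ H_i.T + R)
        P_post = (np.eye(len(x)) - K @ H_i) @ P                 # posterior covariance
        return x_i, P_post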

LVI-SAM: Tightly-coupled LiDAR-visual-inertial odometry via smoothing and mapping

T Shan, B Englot, C Ratti, D Rus - 2021 IEEE international …, 2021 - ieeexplore.ieee.org
We propose a framework for tightly-coupled lidar-visual-inertial odometry via smoothing and
mapping, LVI-SAM, that achieves real-time state estimation and map-building with high …
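"Smoothing and mapping" refers to estimating the whole trajectory by optimizing a factor graph rather than filtering only the latest state. A toy linear version (an illustration only, not LVI-SAM's actual pipeline) can be written as a weighted least-squares problem over a 1-D pose chain with a prior, odometry factors, and one loop-closure-style factor:

    import numpy as np

    # Toy 1-D smoothing: jointly estimate poses x0..x2 from a prior on x0,
    # two odometry factors, and one loop-closure-style factor on (x0, x2).
    # Each factor contributes one row of A x = b, weighted by 1/sigma.
    factors = [
        ((1, 0, 0),  0.0, 0.01),   # prior:      x0      = 0
        ((-1, 1, 0), 1.0, 0.10),   # odometry:   x1 - x0 = 1
        ((0, -1, 1), 1.0, 0.10),   # odometry:   x2 - x1 = 1
        ((-1, 0, 1), 1.9, 0.05),   # loop-style: x2 - x0 = 1.9
    ]
    A = np.array([[a / s for a in row] for row, _, s in factors])
    b = np.array([m / s for _, m, s in factors])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(x)  # smoothed estimates of x0, x1, x2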

PersFormer: 3D lane detection via perspective transformer and the OpenLane benchmark

L Chen, C Sima, Y Li, Z Zheng, J Xu, X Geng… - … on Computer Vision, 2022 - Springer
Methods for 3D lane detection have been recently proposed to address the issue of
inaccurate lane layouts in many autonomous driving scenarios (uphill/downhill, bump, etc.) …

F-LOAM: Fast LiDAR odometry and mapping

H Wang, C Wang, CL Chen… - 2021 IEEE/RSJ …, 2021 - ieeexplore.ieee.org
Simultaneous Localization and Mapping (SLAM) has wide robotic applications such as
autonomous driving and unmanned aerial vehicles. Both computational efficiency and …

Towards high-performance solid-state-lidar-inertial odometry and mapping

K Li, M Li, UD Hanebeck - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
We present a novel tightly-coupled LiDAR-inertial odometry and mapping scheme for both
solid-state and mechanical LiDARs. As frontend, a feature-based lightweight LiDAR …

MULLS: Versatile LiDAR SLAM via multi-metric linear least square

Y Pan, P Xiao, Y He, Z Shao, Z Li - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
The rapid development of autonomous driving and mobile mapping calls for off-the-shelf
LiDAR SLAM solutions that are adaptive to LiDARs of different specifications on various …
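One of the building blocks behind "multi-metric linear least square" registration, the linearized point-to-plane step, can be sketched as follows. Correspondences (src, tgt) and target normals are assumed to be given; this is an illustration rather than MULLS's actual code.

    import numpy as np

    def point_to_plane_step(src, tgt, normals):
        # One linearized Gauss-Newton step of point-to-plane registration:
        # solve for a small rotation vector (rx, ry, rz) and translation
        # (tx, ty, tz) minimizing sum_i (n_i . (R p_i + t - q_i))^2.
        A = np.hstack([np.cross(src, normals), normals])   # N x 6 Jacobian rows
        b = -np.einsum('ij,ij->i', normals, src - tgt)     # N residuals
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x[:3], x[3:]                                # rotation vector, translation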

BEVerse: Unified perception and prediction in birds-eye-view for vision-centric autonomous driving

Y Zhang, Z Zhu, W Zheng, J Huang, G Huang… - arXiv preprint arXiv …, 2022 - arxiv.org
In this paper, we present BEVerse, a unified framework for 3D perception and prediction
based on multi-camera systems. Unlike existing studies focusing on the improvement of …

R²LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping

J Lin, C Zheng, W Xu, F Zhang - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
In this letter, we propose a robust, real-time tightly-coupled multi-sensor fusion framework,
which fuses measurements from LiDAR, inertial sensor, and visual camera to achieve robust …