Path planning techniques for mobile robots: Review and prospect

L Liu, X Wang, X Yang, H Liu, J Li, P Wang - Expert Systems with …, 2023 - Elsevier
Mobile robot path planning refers to the design of a safe, collision-free path from the starting
point to the end point with the shortest distance and least time consumption by a mobile robot …

A review of visual SLAM methods for autonomous driving vehicles

J Cheng, L Zhang, Q Chen, X Hu, J Cai - Engineering Applications of …, 2022 - Elsevier
Autonomous driving vehicles require both a precise localization and mapping solution in
different driving environments. In this context, Simultaneous Localization and Mapping …

A review of multi-sensor fusion slam systems based on 3D LIDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
The demand for intelligent unmanned platforms to achieve autonomous navigation and
positioning in large-scale environments has grown increasingly, in which …

LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping

T Shan, B Englot, C Ratti, D Rus - 2021 IEEE international …, 2021 - ieeexplore.ieee.org
We propose a framework for tightly-coupled lidar-visual-inertial odometry via smoothing and
mapping, LVI-SAM, that achieves real-time state estimation and map-building with high …

FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter

W Xu, F Zhang - IEEE Robotics and Automation Letters, 2021 - ieeexplore.ieee.org
This letter presents a computationally efficient and robust LiDAR-inertial odometry
framework. We fuse LiDAR feature points with IMU data using a tightly-coupled iterated …

R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

J Lin, F Zhang - 2022 International Conference on Robotics …, 2022 - ieeexplore.ieee.org
In this paper, we propose a novel LiDAR-Inertial-Visual sensor fusion framework termed
R3LIVE, which takes advantage of measurements from LiDAR, inertial, and visual sensors to …

OpenVINS: A research platform for visual-inertial estimation

P Geneva, K Eckenhoff, W Lee, Y Yang… - … on Robotics and …, 2020 - ieeexplore.ieee.org
In this paper, we present an open platform, termed OpenVINS, for visual-inertial estimation
research for both the academic community and practitioners from industry. The open …

Faster-LIO: Lightweight tightly coupled LiDAR-inertial odometry using parallel sparse incremental voxels

C Bai, T Xiao, Y Chen, H Wang… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org
This letter presents an incremental voxel-based lidar-inertial odometry (LIO) method for fast-
tracking spinning and solid-state lidar scans. To achieve the high tracking speed, we neither …

R2LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping

J Lin, C Zheng, W Xu, F Zhang - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
In this letter, we propose a robust, real-time tightly-coupled multi-sensor fusion framework,
which fuses measurements from LiDAR, inertial sensor, and visual camera to achieve robust …

MULLS: Versatile LiDAR SLAM via multi-metric linear least square

Y Pan, P Xiao, Y He, Z Shao, Z Li - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
The rapid development of autonomous driving and mobile mapping calls for off-the-shelf
LiDAR SLAM solutions that are adaptive to LiDARs of different specifications on various …