A review of multi-sensor fusion SLAM systems based on 3D LiDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
The ability of intelligent unmanned platforms to achieve autonomous navigation and
positioning in large-scale environments is in increasingly high demand, in which …

Camera, LiDAR and multi-modal SLAM systems for autonomous ground vehicles: a survey

M Chghaf, S Rodriguez, AE Ouardi - Journal of Intelligent & Robotic …, 2022 - Springer
Simultaneous Localization and Mapping (SLAM) has been widely studied in recent years
for autonomous vehicles. SLAM achieves its purpose by constructing a map of the …

R3LIVE: A robust, real-time, RGB-colored, LiDAR-inertial-visual tightly-coupled state estimation and mapping package

J Lin, F Zhang - 2022 International Conference on Robotics …, 2022 - ieeexplore.ieee.org
In this paper, we propose a novel LiDAR-inertial-visual sensor fusion framework termed
R3LIVE, which takes advantage of measurements from LiDAR, inertial, and visual sensors to …

R2LIVE: A robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping

J Lin, C Zheng, W Xu, F Zhang - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
In this letter, we propose a robust, real-time, tightly-coupled multi-sensor fusion framework,
which fuses measurements from a LiDAR, an inertial sensor, and a visual camera to achieve robust …

Super Odometry: IMU-centric LiDAR-visual-inertial estimator for challenging environments

S Zhao, H Zhang, P Wang, L Nogueira… - 2021 IEEE/RSJ …, 2021 - ieeexplore.ieee.org
We propose Super Odometry, a high-precision multi-modal sensor fusion framework,
providing a simple but effective way to fuse multiple sensors such as LiDAR, camera, and …

FAST-LIVO: Fast and tightly-coupled sparse-direct LiDAR-inertial-visual odometry

C Zheng, Q Zhu, W Xu, X Liu, Q Guo… - 2022 IEEE/RSJ …, 2022 - ieeexplore.ieee.org
To achieve accurate and robust pose estimation in the Simultaneous Localization and Mapping
(SLAM) task, multi-sensor fusion has proven to be an effective solution and thus provides great …

Unified multi-modal landmark tracking for tightly coupled LiDAR-visual-inertial odometry

D Wisth, M Camurri, S Das… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
We present an efficient multi-sensor odometry system for mobile platforms that jointly
optimizes visual, lidar, and inertial information within a single integrated factor graph. This …
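A minimal sketch of the "single integrated factor graph" idea, written against GTSAM's Python bindings rather than the authors' own implementation (the keys, measurements, and generic relative-pose factors below are illustrative assumptions; a real LVI system would place visual-landmark, lidar, and IMU-preintegration factors on the same graph):

```python
# Toy pose graph: one prior plus two relative-motion factors, jointly optimized.
import gtsam
import numpy as np

graph = gtsam.NonlinearFactorGraph()
noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.1))  # 6-DoF pose noise

graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), noise))  # anchor pose 0 at the origin
graph.add(gtsam.BetweenFactorPose3(0, 1, gtsam.Pose3(gtsam.Rot3(), np.array([1.0, 0.0, 0.0])), noise))
graph.add(gtsam.BetweenFactorPose3(1, 2, gtsam.Pose3(gtsam.Rot3(), np.array([1.0, 0.0, 0.0])), noise))

initial = gtsam.Values()
for i in range(3):
    initial.insert(i, gtsam.Pose3())  # deliberately poor initial guesses

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(2).translation())  # ~[2, 0, 0] after optimization
```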

Efficient and accurate tightly-coupled visual-LiDAR SLAM

CC Chou, CF Chou - IEEE Transactions on Intelligent …, 2021 - ieeexplore.ieee.org
We investigate a novel way to integrate visual SLAM and lidar SLAM. Instead of enhancing
visual odometry via lidar depths or using visual odometry as the initial motion guess for lidar …

PyPose: A library for robot learning with physics-based optimization

C Wang, D Gao, K Xu, J Geng, Y Hu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Deep learning has had remarkable success in robotic perception, but its data-centric nature
suffers when it comes to generalizing to ever-changing environments. By contrast, physics …
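A minimal sketch of the differentiable Lie-group operations a library like PyPose is built around (assuming its LieTensor API: pp.randn_SE3, pp.randn_se3, Exp, Log, Inv, and @ for composition; the residual is an illustrative toy, not an example from the paper):

```python
# Differentiable SE(3) operations: perturb poses on the manifold, form a
# relative-pose residual in the tangent space, and backpropagate through it.
import torch
import pypose as pp

pose = pp.randn_SE3(2, requires_grad=True)  # two random SE(3) poses
twist = pp.randn_se3(2)                     # random elements of the Lie algebra se(3)
perturbed = twist.Exp() @ pose              # map to the group and compose

residual = (pose.Inv() @ perturbed).Log()   # relative-pose error in the tangent space
loss = residual.norm()                      # scalar objective
loss.backward()                             # gradients flow through the Lie-group ops
print(pose.grad.shape)                      # gradients w.r.t. the 7-parameter poses
```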

Edge robotics: Edge-computing-accelerated multirobot simultaneous localization and mapping

P Huang, L Zeng, X Chen, K Luo… - IEEE Internet of Things …, 2022 - ieeexplore.ieee.org
With the widespread adoption of smart robots across diverse fields, the simultaneous localization
and mapping (SLAM) technique in robotics has attracted growing attention in the community …