A review of multi-sensor fusion SLAM systems based on 3D LIDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
The need for intelligent unmanned platforms to achieve autonomous navigation and
positioning in large-scale environments has become increasingly pressing, in which …

Deep learning sensor fusion for autonomous vehicle perception and localization: A review

J Fayyad, MA Jaradat, D Gruyer, H Najjaran - Sensors, 2020 - mdpi.com
Autonomous vehicles (AV) are expected to improve, reshape, and revolutionize the future of
ground transportation. It is anticipated that ordinary vehicles will one day be replaced with …

KISS-ICP: In defense of point-to-point ICP – simple, accurate, and robust registration if done the right way

I Vizzo, T Guadagnino, B Mersch… - IEEE Robotics and …, 2023 - ieeexplore.ieee.org
Robust and accurate pose estimation of a robotic platform, so-called sensor-based
odometry, is an essential part of many robotic applications. While many sensor odometry …
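
For context, the core loop of point-to-point ICP alternates nearest-neighbour data association with a closed-form rigid-body update. The sketch below is a minimal, generic illustration of that loop (using NumPy and SciPy's KD-tree), not the KISS-ICP implementation; the iteration count and correspondence-distance threshold are illustrative assumptions.

```python
# Minimal point-to-point ICP sketch (illustrative, not the KISS-ICP code).
# Alternates nearest-neighbour association with a closed-form SVD (Kabsch) update.
import numpy as np
from scipy.spatial import cKDTree

def icp_point_to_point(source, target, iterations=20, max_corr_dist=1.0):
    """source (N,3), target (M,3). Returns a 4x4 transform T with target ~ T(source)."""
    T = np.eye(4)
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        dists, idx = tree.query(src)                  # nearest neighbours in the target
        mask = dists < max_corr_dist                  # reject distant correspondences
        p, q = src[mask], target[idx[mask]]
        mu_p, mu_q = p.mean(axis=0), q.mean(axis=0)
        H = (p - mu_p).T @ (q - mu_q)                 # cross-covariance of centred pairs
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                            # optimal rotation (reflection-safe)
        t = mu_q - R @ mu_p                           # optimal translation
        src = src @ R.T + t                           # apply the incremental alignment
        dT = np.eye(4); dT[:3, :3] = R; dT[:3, 3] = t
        T = dT @ T                                    # accumulate the pose estimate
    return T
```

Systems built on this idea typically add motion compensation, adaptive correspondence thresholds, and local-map management around this basic alignment step.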

SemanticKITTI: A dataset for semantic scene understanding of LiDAR sequences

J Behley, M Garbade, A Milioto… - Proceedings of the …, 2019 - openaccess.thecvf.com
Semantic scene understanding is important for various applications. In particular, self-driving
cars need a fine-grained understanding of the surfaces and objects in their vicinity. Light …

SqueezeSegV3: Spatially-adaptive convolution for efficient point-cloud segmentation

C Xu, B Wu, Z Wang, W Zhan, P Vajda… - Computer Vision–ECCV …, 2020 - Springer
LiDAR point-cloud segmentation is an important problem for many applications. For large-
scale point cloud segmentation, the de facto method is to project a 3D point cloud to get a …
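
The projection referred to here is commonly a spherical (range-image) projection of the scan. The following is a generic sketch of such a projection; the image resolution and vertical field of view are assumptions for illustration, not SqueezeSegV3's exact configuration.

```python
# Illustrative spherical projection of a LiDAR scan into a 2D range image,
# the kind of representation projection-based segmentation networks consume.
import numpy as np

def spherical_projection(points, H=64, W=2048, fov_up=3.0, fov_down=-25.0):
    """points: (N,3) xyz. Returns an (H,W) range image (-1 where no point falls)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                                        # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-8), -1.0, 1.0))
    fov_up_r, fov_down_r = np.radians(fov_up), np.radians(fov_down)
    u = 0.5 * (1.0 - yaw / np.pi) * W                             # column from azimuth
    v = (1.0 - (pitch - fov_down_r) / (fov_up_r - fov_down_r)) * H  # row from elevation
    u = np.clip(np.floor(u), 0, W - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, H - 1).astype(np.int32)
    image = np.full((H, W), -1.0, dtype=np.float32)
    order = np.argsort(r)[::-1]                                   # far points first, near points win
    image[v[order], u[order]] = r[order]
    return image
```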

SuMa++: Efficient LiDAR-based semantic SLAM

X Chen, A Milioto, E Palazzolo… - 2019 IEEE/RSJ …, 2019 - ieeexplore.ieee.org
Reliable and accurate localization and mapping are key components of most autonomous
systems. Besides geometric information about the mapped environment, the semantics …
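
One simple way semantics can support LiDAR SLAM is to down-weight or discard points from classes that tend to move before they enter the map. The snippet below is only an illustration of that idea; the class IDs are placeholders, not SuMa++'s actual label set or filtering strategy.

```python
# Illustration: drop points whose predicted semantic class is typically dynamic
# (cars, cyclists, pedestrians, ...) before adding a scan to the map.
import numpy as np

DYNAMIC_CLASSES = {10, 11, 30}   # hypothetical IDs for car, bicycle, person

def filter_dynamic(points, labels):
    """points: (N,3); labels: (N,) per-point semantic class IDs."""
    keep = ~np.isin(labels, list(DYNAMIC_CLASSES))
    return points[keep]
```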

Tightly coupled 3D LiDAR inertial odometry and mapping

H Ye, Y Chen, M Liu - 2019 International Conference on …, 2019 - ieeexplore.ieee.org
Ego-motion estimation is a fundamental requirement for most mobile robotic applications. Through
sensor fusion, we can compensate for the deficiencies of stand-alone sensors and provide …
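
As a rough illustration of why the two sensors complement each other, an IMU can be integrated at high rate to predict motion between LiDAR scans, and scan registration then corrects the drift of that prediction. The sketch below shows only a simple Euler-integration prediction step with assumed bias-free measurements; it is not the tightly coupled optimization proposed in the paper.

```python
# Toy IMU prediction step: integrate one gyro/accelerometer sample to propagate
# rotation R, position p, and velocity v in the world frame.
import numpy as np

def imu_predict(R, p, v, gyro, accel, dt, gravity=np.array([0.0, 0.0, -9.81])):
    """gyro in rad/s and accel in m/s^2, both in the body frame; dt in seconds."""
    phi = gyro * dt                                   # angular increment
    angle = np.linalg.norm(phi)
    if angle < 1e-12:
        dR = np.eye(3)
    else:
        a = phi / angle
        K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
        dR = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)  # Rodrigues
    acc_world = R @ accel + gravity                   # specific force -> world acceleration
    p_new = p + v * dt + 0.5 * acc_world * dt ** 2
    v_new = v + acc_world * dt
    return R @ dR, p_new, v_new
```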

A comparative analysis of LiDAR SLAM-based indoor navigation for autonomous vehicles

Q Zou, Q Sun, L Chen, B Nie… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Simultaneous localization and mapping (SLAM) is a fundamental building block of indoor-
navigation systems for most autonomous vehicles and robots. SLAM aims at building …

Moving object segmentation in 3D LiDAR data: A learning-based approach exploiting sequential data

X Chen, S Li, B Mersch, L Wiesmann… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
The ability to detect and segment moving objects in a scene is essential for building
consistent maps, making future state predictions, avoiding collisions, and planning. In this …
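
A common way to exploit sequential LiDAR data for motion cues is to compare the current range image with previous scans re-rendered into the current sensor frame, so that large normalized range differences hint at moving surfaces. The residual below is a generic illustration of that idea, not the paper's exact formulation.

```python
# Per-pixel residual between the current range image and a previous scan warped
# into the current frame; large values are a cue for moving objects.
import numpy as np

def range_residual(current, previous_warped, eps=1e-6):
    """current, previous_warped: (H,W) range images in the current frame (-1 = no return)."""
    valid = (current > 0) & (previous_warped > 0)
    residual = np.zeros_like(current)
    residual[valid] = np.abs(current[valid] - previous_warped[valid]) / (current[valid] + eps)
    return residual  # e.g. stacked as an extra input channel of a segmentation network
```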

CT-ICP: Real-time elastic LiDAR odometry with loop closure

P Dellenbach, JE Deschaud, B Jacquet… - … on Robotics and …, 2022 - ieeexplore.ieee.org
Multi-beam LiDAR sensors are increasingly used in robotics, particularly with autonomous
cars for localization and perception tasks, both relying on the ability to build a precise map of …
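
The "elastic", continuous-time idea is that each point in a sweep carries its own timestamp, and its pose is interpolated between the estimated poses at the start and end of the scan before registration. The sketch below shows one possible interpolation scheme (SLERP for rotation, linear for translation); the function name and interface are illustrative assumptions, not the paper's implementation.

```python
# Per-point motion compensation ("deskewing") by interpolating between the poses
# at the beginning and end of a LiDAR sweep.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew(points, timestamps, T_begin, T_end):
    """points: (N,3); timestamps: (N,) normalized to [0,1]; T_begin/T_end: 4x4 sweep poses."""
    key_rots = Rotation.from_matrix(np.stack([T_begin[:3, :3], T_end[:3, :3]]))
    slerp = Slerp([0.0, 1.0], key_rots)
    R = slerp(timestamps).as_matrix()                              # (N,3,3) per-point rotations
    t = (1.0 - timestamps)[:, None] * T_begin[:3, 3] + timestamps[:, None] * T_end[:3, 3]
    return np.einsum('nij,nj->ni', R, points) + t                  # motion-compensated points
```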