K Yuan, Z Guo, ZJ Wang - IEEE Robotics and Automation …, 2020 - ieeexplore.ieee.org
Accurate LiDAR-camera online calibration is critical for modern autonomous vehicles and robot platforms. Dominant methods heavily rely on hand-crafted features, which are not …
Sensor setups of robotic platforms commonly include both camera and LiDAR as they provide complementary information. However, fusing these two modalities typically requires …
3D LiDARs and 2D cameras are increasingly being used alongside each other in sensor rigs for perception tasks. Before these sensors can be used to gather meaningful data …
X Liu, C Yuan, F Zhang - IEEE Transactions on Instrumentation …, 2022 - ieeexplore.ieee.org
Determining the extrinsic parameters between multiple light detection and ranging (LiDAR) sensors and cameras is essential for autonomous robots, especially for solid-state LiDARs, where …
As an essential procedure of data fusion, LiDAR-camera calibration is critical for autonomous vehicles and robot navigation. Most calibration methods require laborious …
J Borer, J Tschirner, F Ölsner… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Sensor fusion is vital for the safe and robust operation of autonomous vehicles. Accurate extrinsic sensor-to-sensor calibration is necessary to fuse multiple sensors' data …
L Zhou, Z Li, M Kaess - 2018 IEEE/RSJ International …, 2018 - ieeexplore.ieee.org
In this paper, we address the problem of extrinsic calibration of a camera and a 3D Light Detection and Ranging (LiDAR) sensor using a checkerboard. Unlike previous works which …
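What this line of work solves for can be sketched concretely: the extrinsic calibration is a rigid transform (R, t) mapping LiDAR-frame points into the camera frame, after which the camera intrinsics K project them to pixels. A minimal illustration follows; all numeric values (rotation, translation, intrinsics) are hypothetical and not taken from any cited paper.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project Nx3 LiDAR-frame points into the image plane; returns Nx2 pixels."""
    pts_cam = points_lidar @ R.T + t          # apply extrinsics: LiDAR -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]      # keep only points in front of the camera
    uvw = pts_cam @ K.T                       # pinhole projection with intrinsics K
    return uvw[:, :2] / uvw[:, 2:3]           # perspective divide -> pixel coordinates

# Illustrative setup: identity rotation, camera mounted 0.1 m above the LiDAR,
# and a generic pinhole intrinsic matrix for a 1280x720 image.
R = np.eye(3)
t = np.array([0.0, -0.1, 0.0])
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

pts = np.array([[0.0, 0.0, 5.0]])             # one LiDAR point 5 m ahead
print(project_lidar_to_image(pts, R, t, K))   # -> [[640. 346.]]
```

Checkerboard-based methods such as the one above estimate (R, t) by matching board features seen by both sensors; once known, this projection lets each LiDAR return be colored or fused with image data.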
This paper introduces a novel targetless method for joint intrinsic and extrinsic calibration of LiDAR-camera systems using plane-constrained bundle adjustment (BA). Our method …
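The plane constraint mentioned above can be illustrated with a point-to-plane residual of the kind typically minimized in such a bundle adjustment: calibration parameters are adjusted so that transformed LiDAR points lie on estimated scene planes. The plane and point values below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def point_to_plane_residuals(points_lidar, R, t, n, d):
    """Signed distances of extrinsically transformed points to the plane n·x + d = 0."""
    pts_cam = points_lidar @ R.T + t   # candidate extrinsics applied to LiDAR points
    return pts_cam @ n + d             # zero when a point lies exactly on the plane

# Hypothetical plane z = 2 in the camera frame: normal n = (0, 0, 1), offset d = -2.
n = np.array([0.0, 0.0, 1.0])
d = -2.0
pts = np.array([[0.0, 0.0, 2.0],       # exactly on the plane
                [1.0, 0.0, 2.1]])      # 0.1 m off the plane

res = point_to_plane_residuals(pts, np.eye(3), np.zeros(3), n, d)
print(res)  # residuals ~ [0.0, 0.1]
```

In a plane-constrained BA, residuals like these (over many points and planes) would be stacked into a nonlinear least-squares problem and minimized jointly over the intrinsic and extrinsic parameters.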
S Xu, S Zhou, Z Tian, J Ma, Q Nie, X Chu - arXiv preprint arXiv:2312.01085, 2023 - arxiv.org
Current traditional methods for LiDAR-camera extrinsics estimation depend on offline targets and human efforts, while learning-based approaches resort to iterative refinement for …