Sensor fusion is an essential topic in many perception systems, such as autonomous driving and robotics. Existing multi-modal 3D detection models usually involve customized designs …
A Mahmoud, JSK Hu… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Camera and LiDAR sensor modalities provide complementary appearance and geometric information useful for detecting 3D objects in autonomous vehicle applications. However …
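The complementarity noted in this snippet is typically exploited by projecting LiDAR points into the camera image using the extrinsic calibration and pinhole intrinsics. A minimal sketch of that projection step follows; the function name, the toy calibration matrices, and the sample points are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into the image plane of a pinhole camera.

    points_lidar: (N, 3) XYZ points in the LiDAR frame.
    T_cam_lidar:  (4, 4) rigid transform from the LiDAR to the camera frame.
    K:            (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates for points in front of the camera,
    plus the boolean mask selecting those points.
    """
    n = points_lidar.shape[0]
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])  # homogeneous coords
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]          # move to camera frame
    in_front = pts_cam[:, 2] > 0                        # keep points with z > 0
    uvw = (K @ pts_cam[in_front].T).T
    uv = uvw[:, :2] / uvw[:, 2:3]                       # perspective divide
    return uv, in_front

# Toy example (assumed calibration): identity extrinsics, simple intrinsics.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
T = np.eye(4)
pts = np.array([[0.0, 0.0, 10.0],    # straight ahead -> principal point
                [1.0, 0.0, 10.0]])   # 1 m to the right at 10 m depth
uv, mask = project_points(pts, T, K)
```

Once points land on pixels this way, each 3D point can be associated with the image appearance at that location, which is the usual entry point for camera–LiDAR fusion pipelines.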
X Zhang, Y Gong, J Lu, J Wu, Z Li… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Multi-modal fusion is a fundamental task in autonomous-driving perception and has attracted considerable attention in recent years. The current multi-modal fusion methods …
C Peng, G Wang, XW Lo, X Wu, C Xu… - Proceedings of the …, 2023 - openaccess.thecvf.com
Point clouds are naturally sparse, while image pixels are dense. This inconsistency limits feature fusion between the two modalities for point-wise scene flow estimation. Previous methods …
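One common way to bridge the sparsity mismatch this snippet describes is to fetch the dense image feature at each sparse point's projected pixel location. Below is a minimal nearest-neighbor sketch of that gathering step; the function name and the toy feature map are illustrative assumptions, not the cited paper's method (which would typically use learned features and bilinear interpolation).

```python
import numpy as np

def gather_point_features(feature_map, uv):
    """Sample a dense image feature map at sparse projected point locations.

    feature_map: (H, W, C) dense per-pixel features.
    uv:          (N, 2) pixel coordinates (u = column, v = row).
    Returns (N, C) per-point features via nearest-neighbor lookup,
    with coordinates clamped to the image bounds.
    """
    h, w, _ = feature_map.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return feature_map[v, u]

# Toy example: a 4x4 map whose "feature" at pixel (row, col) is (row, col).
fmap = np.stack(
    np.meshgrid(np.arange(4), np.arange(4), indexing="ij"), axis=-1
).astype(float)
uv = np.array([[1.2, 2.7],   # rounds to column 1, row 3
               [0.0, 0.0]])  # top-left pixel
feats = gather_point_features(fmap, uv)
```

After this lookup, each sparse 3D point carries a dense image feature alongside its geometric feature, so the two can be concatenated or fused point-wise.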
Robots rely heavily on sensors, especially RGB and depth cameras, to perceive and interact with the world. RGB cameras record 2D images with rich semantic information while missing …
Pixel image data of a scene is received in which the pixel image data includes a two- dimensional representation of an object in the scene. Point cloud data including three …
L Wang, Y Huang - Scientific Reports, 2023 - nature.com
RGB cameras and LiDAR are crucial sensors for autonomous vehicles that provide complementary information for accurate detection. Recent early-level fusion-based …
Vehicle speed estimation is one of the most critical problems in intelligent transportation system (ITS) research, and measuring distance and identifying direction have become an …
M Nawaz, JKT Tang, K Bibi, S Xiao… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Autonomous driving has become a prominent research topic with the rise of intelligent urban vision. Advancements in automated driving technology play a significant role in the …