Radars for autonomous driving: A review of deep learning methods and challenges

A Srivastav, S Mandal - IEEE Access, 2023 - ieeexplore.ieee.org
Radar is a key component of the suite of perception sensors used for safe and reliable
navigation of autonomous vehicles. Its unique capabilities include high-resolution velocity …

V2X cooperative perception for autonomous driving: Recent advances and challenges

T Huang, J Liu, X Zhou, DC Nguyen… - arXiv preprint arXiv …, 2023 - arxiv.org
Accurate perception is essential for advancing autonomous driving and addressing safety
challenges in modern transportation systems. Despite significant advancements in computer …

LXL: LiDAR excluded lean 3D object detection with 4D imaging radar and camera fusion

W Xiong, J Liu, T Huang, QL Han… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
As an emerging technology and a relatively affordable device, the 4D imaging radar has
already been confirmed effective in performing 3D object detection in autonomous driving …

WaterScenes: A multi-task 4D radar-camera fusion dataset and benchmark for autonomous driving on water surfaces

S Yao, R Guan, Z Wu, Y Ni, Z Zhang, Z Huang… - arXiv preprint arXiv …, 2023 - arxiv.org
Autonomous driving on water surfaces plays an essential role in executing hazardous and
time-consuming missions, such as maritime surveillance, survivor rescue, environmental …

SparseFusion3D: Sparse sensor fusion for 3D object detection by radar and camera in environmental perception

Z Yu, W Wan, M Ren, X Zheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In the context of autonomous driving environment perception, multi-modal fusion plays a
pivotal role in enhancing robustness, completeness, and accuracy, thereby extending the …

Radar perception in autonomous driving: Exploring different data representations

S Yao, R Guan, Z Peng, C Xu, Y Shi, Y Yue… - arXiv preprint arXiv …, 2023 - arxiv.org
With the rapid advancements of sensor technology and deep learning, autonomous driving
systems are providing safe and efficient access to intelligent vehicles as well as intelligent …

4D Radar-Camera Sensor Fusion for Robust Vehicle Pose Estimation in Foggy Environments

S Yang, M Choi, S Han, KH Choi, KS Kim - IEEE Access, 2023 - ieeexplore.ieee.org
The integration of cameras and millimeter-wave radar into sensor fusion algorithms is
essential to ensure robustness and cost effectiveness for vehicle pose estimation. Due to the …

Assessing the Robustness of LiDAR, Radar and Depth Cameras Against Ill-Reflecting Surfaces in Autonomous Vehicles: An Experimental Study

M Lötscher, N Baumann, E Ghignone… - 2023 IEEE 9th World …, 2023 - ieeexplore.ieee.org
Range-measuring sensors play a critical role in autonomous driving systems. While Light
Detection and Ranging (LiDAR) technology has been dominant, its vulnerability to adverse …

Lightweight semantic segmentation network with configurable context and small object attention

C Zhang, F Xu, C Wu, J Li - Frontiers in Computational Neuroscience, 2023 - frontiersin.org
The current semantic segmentation algorithms suffer from encoding feature distortion and
small object feature loss. Context information exchange can effectively address the feature …

Accurate Association of Radar-Vision Measurements Based on Weighted Euclidean Distance

H Bai, H Li, W Yi, Z Zou - 2023 12th International Conference on …, 2023 - ieeexplore.ieee.org
Associating radar measurements with vision measurements is essential for radar-vision
fusion. Most radar-vision association methods rely on Euclidean distance. However, these …