Radars for autonomous driving: A review of deep learning methods and challenges

A Srivastav, S Mandal - IEEE Access, 2023 - ieeexplore.ieee.org
Radar is a key component of the suite of perception sensors used for safe and reliable
navigation of autonomous vehicles. Its unique capabilities include high-resolution velocity …

V2X cooperative perception for autonomous driving: Recent advances and challenges

T Huang, J Liu, X Zhou, DC Nguyen… - arXiv preprint arXiv …, 2023 - arxiv.org
Accurate perception is essential for advancing autonomous driving and addressing safety
challenges in modern transportation systems. Despite significant advancements in computer …

LXL: LiDAR excluded lean 3D object detection with 4D imaging radar and camera fusion

W Xiong, J Liu, T Huang, QL Han… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
As an emerging technology and a relatively affordable device, the 4D imaging radar has
already been confirmed effective in performing 3D object detection in autonomous driving …

Mask-VRDet: A robust riverway panoptic perception model based on dual graph fusion of vision and 4D mmWave radar

R Guan, S Yao, L Liu, X Zhu, KL Man, Y Yue… - Robotics and …, 2024 - Elsevier
With the development of Unmanned Surface Vehicles (USVs), the perception of inland
waterways has become significant to autonomous navigation. RGB cameras can capture …

WaterScenes: A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmarks for Autonomous Driving on Water Surfaces

S Yao, R Guan, Z Wu, Y Ni, Z Huang… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Autonomous driving on water surfaces plays an essential role in executing hazardous and
time-consuming missions, such as maritime surveillance, survivor rescue, environmental …

SparseFusion3D: Sparse sensor fusion for 3D object detection by radar and camera in environmental perception

Z Yu, W Wan, M Ren, X Zheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In the context of autonomous driving environment perception, multi-modal fusion plays a
pivotal role in enhancing robustness, completeness, and accuracy, thereby extending the …

Fuzzy-NMS: Improving 3D object detection with fuzzy classification in NMS

L Wang, X Zhang, F Zhao, C Wu… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Non-maximum suppression (NMS) is an essential post-processing module in many 3D
object detection frameworks to remove overlapping candidate bounding boxes. However, an …

Radar perception in autonomous driving: Exploring different data representations

S Yao, R Guan, Z Peng, C Xu, Y Shi, Y Yue… - arXiv preprint arXiv …, 2023 - arxiv.org
With the rapid advancements of sensor technology and deep learning, autonomous driving
systems are providing safe and efficient access to intelligent vehicles as well as intelligent …

DPFT: Dual perspective fusion transformer for camera-radar-based object detection

F Fent, A Palffy, H Caesar - arXiv preprint arXiv:2404.03015, 2024 - arxiv.org
The perception of autonomous vehicles has to be efficient, robust, and cost-effective.
However, cameras are not robust against severe weather conditions, lidar sensors are …

WaterVG: Waterway visual grounding based on text-guided vision and mmWave radar

R Guan, L Jia, F Yang, S Yao, E Purwanto… - arXiv preprint arXiv …, 2024 - arxiv.org
The perception of waterways based on human intent holds significant importance for
autonomous navigation and operations of Unmanned Surface Vehicles (USVs) in water …