Radar-camera fusion for object detection and semantic segmentation in autonomous driving: A comprehensive review

S Yao, R Guan, X Huang, Z Li, X Sha… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Driven by deep learning techniques, perception technology in autonomous driving has
developed rapidly in recent years, enabling vehicles to accurately detect and interpret …

Radars for autonomous driving: A review of deep learning methods and challenges

A Srivastav, S Mandal - IEEE Access, 2023 - ieeexplore.ieee.org
Radar is a key component of the suite of perception sensors used for safe and reliable
navigation of autonomous vehicles. Its unique capabilities include high-resolution velocity …

LXL: LiDAR excluded lean 3D object detection with 4D imaging radar and camera fusion

W Xiong, J Liu, T Huang, QL Han… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
As an emerging technology and a relatively affordable device, the 4D imaging radar has
already proven effective for 3D object detection in autonomous driving …

Echoes beyond points: Unleashing the power of raw radar data in multi-modality fusion

Y Liu, F Wang, N Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Radar is ubiquitous in autonomous driving systems due to its low cost and robustness in
adverse weather. Nevertheless, radar detection performance is usually inferior because its …

RadarDistill: Boosting Radar-based Object Detection Performance via Knowledge Distillation from LiDAR Features

G Bang, K Choi, J Kim, D Kum… - Proceedings of the …, 2024 - openaccess.thecvf.com
The inherent noise and sparsity of radar data pose challenges in finding
effective representations for 3D object detection. In this paper, we propose RadarDistill, a …
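
As context for the distillation idea named in this entry, the sketch below shows a generic feature-level distillation loss in which a radar (student) BEV feature map is regressed toward a LiDAR (teacher) BEV feature map. The module names, shapes, and the sparsity mask are illustrative assumptions, not RadarDistill's actual architecture or losses.

    # Generic feature-level knowledge distillation sketch (illustrative only).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureDistillation(nn.Module):
        """Aligns a radar (student) BEV feature map with a LiDAR (teacher) BEV feature map."""

        def __init__(self, radar_channels: int, lidar_channels: int):
            super().__init__()
            # 1x1 conv projects student features into the teacher's channel space.
            self.proj = nn.Conv2d(radar_channels, lidar_channels, kernel_size=1)

        def forward(self, radar_feat: torch.Tensor, lidar_feat: torch.Tensor) -> torch.Tensor:
            # radar_feat: (B, C_r, H, W) from the radar (student) backbone.
            # lidar_feat: (B, C_l, H, W) from a frozen LiDAR (teacher) backbone.
            student = self.proj(radar_feat)
            # Supervise only cells where the teacher has signal, since BEV maps are sparse.
            mask = (lidar_feat.abs().sum(dim=1, keepdim=True) > 0).float()
            return F.mse_loss(student * mask, lidar_feat * mask)

    # Usage (hypothetical): add the distillation term to the detection loss during training.
    # distill = FeatureDistillation(radar_channels=64, lidar_channels=256)
    # loss = det_loss + 0.5 * distill(radar_bev, lidar_bev.detach())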

RCBEVDet: Radar-camera Fusion in Bird's Eye View for 3D Object Detection

Z Lin, Z Liu, Z Xia, X Wang, Y Wang… - Proceedings of the …, 2024 - openaccess.thecvf.com
Three-dimensional object detection is one of the key tasks in autonomous driving. To reduce
costs in practice, low-cost multi-view cameras for 3D object detection have been proposed to …
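
For orientation, the sketch below illustrates the simplest form of bird's-eye-view (BEV) radar-camera fusion: both modalities are rasterized onto a shared BEV grid, and their feature maps are concatenated and mixed by a small convolutional block. The shapes and module names are illustrative assumptions, not RCBEVDet's design.

    # Minimal BEV radar-camera fusion sketch (illustrative only).
    import torch
    import torch.nn as nn

    class BEVFusion(nn.Module):
        """Concatenates camera and radar BEV features and mixes them with a conv block."""

        def __init__(self, cam_channels: int, radar_channels: int, out_channels: int):
            super().__init__()
            self.fuse = nn.Sequential(
                nn.Conv2d(cam_channels + radar_channels, out_channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )

        def forward(self, cam_bev: torch.Tensor, radar_bev: torch.Tensor) -> torch.Tensor:
            # Both inputs are assumed to be rasterized onto the same BEV grid: (B, C, H, W).
            return self.fuse(torch.cat([cam_bev, radar_bev], dim=1))

    # Usage (hypothetical): the fused BEV map would then feed a standard 3D detection head.
    # fused = BEVFusion(cam_channels=80, radar_channels=64, out_channels=128)(cam_bev, radar_bev)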

Unleashing HyDRa: Hybrid fusion, depth consistency and radar for unified 3D perception

P Wolters, J Gilg, T Teepe, F Herzog, A Laouichi… - arXiv preprint arXiv …, 2024 - arxiv.org
Low-cost, vision-centric 3D perception systems for autonomous driving have made
significant progress in recent years, narrowing the gap to expensive LiDAR-based methods …

SparseFusion3D: Sparse sensor fusion for 3D object detection by radar and camera in environmental perception

Z Yu, W Wan, M Ren, X Zheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In the context of autonomous driving environment perception, multi-modal fusion plays a
pivotal role in enhancing robustness, completeness, and accuracy, thereby extending the …

V2X cooperative perception for autonomous driving: Recent advances and challenges

T Huang, J Liu, X Zhou, DC Nguyen… - arXiv preprint arXiv …, 2023 - arxiv.org
Accurate perception is essential for advancing autonomous driving and addressing safety
challenges in modern transportation systems. Despite significant advancements in computer …

CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation

L Zhao, J Song, KA Skinner - Proceedings of the IEEE/CVF …, 2024 - openaccess.thecvf.com
In the field of 3D object detection for autonomous driving, LiDAR-Camera (LC) fusion is the
top-performing sensor configuration. Still, LiDAR is relatively high-cost, which hinders …