REDFormer: Radar enlightens the darkness of camera perception with transformers

C Cui, Y Ma, J Lu, Z Wang - IEEE Transactions on Intelligent …, 2023 - ieeexplore.ieee.org
Enhancing the accuracy and reliability of perception systems in automated vehicles is
critical, especially under varying driving conditions. Unfortunately, the challenges of adverse …

Radar Enlightens the Dark: Enhancing Low-Visibility Perception for Automated Vehicles with Camera-Radar Fusion

C Cui, Y Ma, J Lu, Z Wang - 2023 IEEE 26th International …, 2023 - ieeexplore.ieee.org
Sensor fusion is a crucial augmentation technique for improving the accuracy and reliability
of perception systems for automated vehicles under diverse driving conditions. However …

Deep LiDAR-radar-visual fusion for object detection in urban environments

Y Xiao, Y Liu, K Luan, Y Cheng, X Chen, H Lu - Remote Sensing, 2023 - mdpi.com
Robust environmental sensing and accurate object detection are crucial in enabling
autonomous driving in urban environments. To achieve this goal, autonomous mobile …

Low-level sensor fusion network for 3D vehicle detection using radar range-azimuth heatmap and monocular image

J Kim, Y Kim, D Kum - … of the Asian Conference on Computer …, 2020 - openaccess.thecvf.com
Robust and accurate object detection on roads with various objects is essential for
automated driving. The radar has been employed in commercial advanced driver assistance …

MVFAN: Multi-view feature assisted network for 4D radar object detection

Q Yan, Y Wang - International Conference on Neural Information …, 2023 - Springer
4D radar is recognized for its resilience and cost-effectiveness under adverse
weather conditions, thus playing a pivotal role in autonomous driving. While cameras and …

Robustness-aware 3D object detection in autonomous driving: A review and outlook

Z Song, L Liu, F Jia, Y Luo, G Zhang, L Yang… - arXiv preprint arXiv …, 2024 - arxiv.org
In the realm of modern autonomous driving, the perception system is indispensable for
accurately assessing the state of the surrounding environment, thereby enabling informed …

CRN: Camera radar net for accurate, robust, efficient 3D perception

Y Kim, J Shin, S Kim, IJ Lee… - Proceedings of the …, 2023 - openaccess.thecvf.com
Autonomous driving requires an accurate and fast 3D perception system that includes 3D
object detection, tracking, and segmentation. Although recent low-cost camera-based …

GRIF Net: Gated region of interest fusion network for robust 3D object detection from radar point cloud and monocular image

Y Kim, JW Choi, D Kum - 2020 IEEE/RSJ International …, 2020 - ieeexplore.ieee.org
Robust and accurate scene representation is essential for advanced driver assistance
systems (ADAS) such as automated driving. The radar and camera are two widely used …

Fusion point pruning for optimized 2D object detection with radar-camera fusion

L Stäcker, P Heidenreich… - Proceedings of the …, 2022 - openaccess.thecvf.com
Object detection is one of the most important perception tasks for advanced driver assistance
systems and autonomous driving. Due to its complementary features and moderate cost …

BEVFusion4D: Learning LiDAR-Camera Fusion Under Bird's-Eye-View via Cross-Modality Guidance and Temporal Aggregation

H Cai, Z Zhang, Z Zhou, Z Li, W Ding, J Zhao - arXiv preprint arXiv …, 2023 - arxiv.org
Integrating LiDAR and Camera information into Bird's-Eye-View (BEV) has become an
essential topic for 3D object detection in autonomous driving. Existing methods mostly adopt …