DPFT: Dual perspective fusion transformer for camera-radar-based object detection

F Fent, A Palffy, H Caesar - arXiv preprint arXiv:2404.03015, 2024 - arxiv.org
The perception of autonomous vehicles has to be efficient, robust, and cost-effective.
However, cameras are not robust against severe weather conditions, lidar sensors are …

ROFusion: efficient object detection using hybrid point-wise radar-optical fusion

L Liu, S Zhi, Z Du, L Liu, X Zhang, K Huo… - … Conference on Artificial …, 2023 - Springer
Radars, due to their robustness to adverse weather conditions and ability to measure object
motions, have served in autonomous driving and intelligent agents for years. However …

CRKD: Enhanced Camera-Radar Object Detection with Cross-modality Knowledge Distillation

L Zhao, J Song, KA Skinner - Proceedings of the IEEE/CVF …, 2024 - openaccess.thecvf.com
In the field of 3D object detection for autonomous driving, LiDAR-camera (LC) fusion is the
top-performing sensor configuration. Still, LiDAR is relatively high in cost, which hinders …

RVDet: Feature-level fusion of radar and camera for object detection

J Zhang, M Zhang, Z Fang, Y Wang… - 2021 IEEE …, 2021 - ieeexplore.ieee.org
Obstacle perception based on radar sensors has drawn wide attention in autonomous
driving due to their robust performance and low cost. It is beneficial to utilize fusion, e.g., camera …

CR-DINO: A Novel Camera-Radar Fusion 2D Object Detection Model Based On Transformer

Y Jin, X Zhu, Y Yue, EG Lim, W Wang - IEEE Sensors Journal, 2024 - ieeexplore.ieee.org
Due to millimeter-wave (MMW) radar's ability to directly acquire spatial positions and velocity
information of objects, as well as its robust performance in adverse weather conditions, it …

TL-4DRCF: A two-level 4D radar-camera fusion method for object detection in adverse weather

H Zhang, K Wu, R Chen, Z Wu, Y Zhong… - IEEE Sensors …, 2024 - ieeexplore.ieee.org
In autonomous driving systems, cameras and light detection and ranging (LiDAR) are two
common sensors for object detection. However, both sensors can be severely affected by …

Echoes beyond points: Unleashing the power of raw radar data in multi-modality fusion

Y Liu, F Wang, N Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
Radar is ubiquitous in autonomous driving systems due to its low cost and good adaptability
to bad weather. Nevertheless, the radar detection performance is usually inferior because its …

T-FFTRadNet: Object detection with Swin vision transformers from raw ADC radar signals

J Giroux, M Bouchard… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Object detection utilizing Frequency Modulated Continuous Wave radar is
becoming increasingly popular in the field of autonomous systems. Radar does not possess …

Radar-camera sensor fusion for joint object detection and distance estimation in autonomous vehicles

R Nabati, H Qi - arXiv preprint arXiv:2009.08428, 2020 - arxiv.org
In this paper we present a novel radar-camera sensor fusion framework for accurate object
detection and distance estimation in autonomous driving scenarios. The proposed …

Modality-agnostic learning for radar-lidar fusion in vehicle detection

YJ Li, J Park, M O'Toole… - proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Fusion of multiple sensor modalities such as camera, lidar, and radar, which are commonly
found on autonomous vehicles, not only allows for accurate detection but also robustifies …