Radar-camera fusion for object detection and semantic segmentation in autonomous driving: A comprehensive review

S Yao, R Guan, X Huang, Z Li, X Sha… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Driven by deep learning techniques, perception technology in autonomous driving has
developed rapidly in recent years, enabling vehicles to accurately detect and interpret …

Radar perception in autonomous driving: Exploring different data representations

S Yao, R Guan, Z Peng, C Xu, Y Shi, Y Yue… - arXiv preprint arXiv …, 2023 - arxiv.org
With rapid advances in sensor technology and deep learning, autonomous driving
systems are providing safe and efficient access to intelligent vehicles as well as intelligent …

WaterScenes: A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmarks for Autonomous Driving on Water Surfaces

S Yao, R Guan, Z Wu, Y Ni, Z Huang… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Autonomous driving on water surfaces plays an essential role in executing hazardous and
time-consuming missions, such as maritime surveillance, survivor rescue, environmental …

Mask-VRDet: A robust riverway panoptic perception model based on dual graph fusion of vision and 4D mmWave radar

R Guan, S Yao, L Liu, X Zhu, KL Man, Y Yue… - Robotics and …, 2024 - Elsevier
With the development of Unmanned Surface Vehicles (USVs), the perception of inland
waterways has become significant to autonomous navigation. RGB cameras can capture …

RC-BEVFusion: A plug-in module for radar-camera bird's eye view feature fusion

L Stäcker, S Mishra, P Heidenreich, J Rambach… - … German Conference on …, 2023 - Springer
Radars and cameras are among the most frequently used sensors for advanced driver
assistance systems and automated driving research. However, there has been surprisingly …

Achelous++: Power-Oriented Water-Surface Panoptic Perception Framework on Edge Devices based on Vision-Radar Fusion and Pruning of Heterogeneous …

R Guan, H Zhao, S Yao, KL Man, X Zhu, L Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Robust urban water-surface perception serves as the foundation for intelligent monitoring of
aquatic environments and for the autonomous navigation and operation of unmanned vessels …

Multi-Task Cross-Modality Attention-Fusion for 2D Object Detection

H Sun, H Feng, G Stettinger… - 2023 IEEE 26th …, 2023 - ieeexplore.ieee.org
Accurate and robust object detection is critical for autonomous driving. Image-based
detectors face difficulties caused by low visibility in adverse weather conditions. Thus, radar …

Vehicle detection and tracking method based on multi-sensor trajectory information

L Zhao, Q Cao, B Cai, W Shao, M Zhang - Journal of the Brazilian Society …, 2023 - Springer
Accurately identifying and tracking different types of vehicles is the basis of safe driving for
intelligent vehicles. Because of the shortcomings of traditional rule-based association methods …

CR-DINO: A Novel Camera-Radar Fusion 2D Object Detection Model Based On Transformer

Y Jin, X Zhu, Y Yue, EG Lim, W Wang - IEEE Sensors Journal, 2024 - ieeexplore.ieee.org
Due to millimeter-wave (MMW) radar's ability to directly acquire the spatial positions and velocities
of objects, as well as its robust performance in adverse weather conditions, it …

Target Detection for USVs by Radar-vision Fusion with Swag-robust Distance-aware Probabilistic Multi-modal Data Association

Z Li, T Yuan, L Ma, Y Zhou, Y Peng - IEEE Sensors Journal, 2024 - ieeexplore.ieee.org
Unmanned surface vehicles (USVs) have been deployed for a wide range of tasks over the
past decades. Accurate perception of the surrounding environment on the water surface …