Vision meets mmWave Radar: 3D Object Perception Benchmark for Autonomous Driving

Y Wang, JH Cheng, JT Huang, SY Kuan, Q Fu… - arXiv preprint arXiv …, 2023 - arxiv.org
Sensor fusion is crucial for an accurate and robust perception system on autonomous
vehicles. Most existing datasets and perception solutions focus on fusing cameras and …

CRN: Camera Radar Net for accurate, robust, efficient 3D perception

Y Kim, J Shin, S Kim, IJ Lee… - Proceedings of the …, 2023 - openaccess.thecvf.com
Autonomous driving requires an accurate and fast 3D perception system that includes 3D
object detection, tracking, and segmentation. Although recent low-cost camera-based …

GRC-Net: Fusing GAT-based 4D radar and camera for 3D object detection

L Fan, C Zeng, Y Li, X Wang, D Cao - 2023 - sae.org
The fusion of multi-modal perception in autonomous driving plays a pivotal role in vehicle
behavior decision-making. However, much of the previous research has predominantly …

OPV2V: An open benchmark dataset and fusion pipeline for perception with vehicle-to-vehicle communication

R Xu, H Xiang, X Xia, X Han, J Li… - … Conference on Robotics …, 2022 - ieeexplore.ieee.org
Employing Vehicle-to-Vehicle communication to enhance perception performance in self-
driving technology has attracted considerable attention recently; however, the absence of a …

SparseFusion3D: Sparse sensor fusion for 3D object detection by radar and camera in environmental perception

Z Yu, W Wan, M Ren, X Zheng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In the context of autonomous driving environment perception, multi-modal fusion plays a
pivotal role in enhancing robustness, completeness, and accuracy, thereby extending the …

Efficient Multi-Sensor Fusion for 3D Perception

K Shao - 2023 - dspace.mit.edu
As a critical component to realizing widespread autonomous driving, 3D perception systems
have come to be heavily studied in the community. However, many solutions are solely …

Unleashing HyDRa: Hybrid fusion, depth consistency and radar for unified 3D perception

P Wolters, J Gilg, T Teepe, F Herzog, A Laouichi… - arXiv preprint arXiv …, 2024 - arxiv.org
Low-cost, vision-centric 3D perception systems for autonomous driving have made
significant progress in recent years, narrowing the gap to expensive LiDAR-based methods …

Dual Radar: A multi-modal dataset with dual 4D radar for autonomous driving

X Zhang, L Wang, J Chen, C Fang, L Yang… - arXiv preprint arXiv …, 2023 - arxiv.org
Radar has stronger adaptability in adverse scenarios for autonomous driving environmental
perception compared to widely adopted cameras and LiDARs. Compared with commonly …

OpenMPD: An open multimodal perception dataset for autonomous driving

X Zhang, Z Li, Y Gong, D Jin, J Li… - IEEE Transactions …, 2022 - ieeexplore.ieee.org
Multi-modal sensor fusion techniques have promoted the development of autonomous
driving, while perception in the complex environment remains a challenging problem. In …

Deep LiDAR-radar-visual fusion for object detection in urban environments

Y Xiao, Y Liu, K Luan, Y Cheng, X Chen, H Lu - Remote Sensing, 2023 - mdpi.com
Robust environmental sensing and accurate object detection are crucial in enabling
autonomous driving in urban environments. To achieve this goal, autonomous mobile …