WaterScenes: A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmarks for Autonomous Driving on Water Surfaces

S Yao, R Guan, Z Wu, Y Ni, Z Huang… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Autonomous driving on water surfaces plays an essential role in executing hazardous and
time-consuming missions, such as maritime surveillance, survivor rescue, environmental …

Dual radar: A multi-modal dataset with dual 4D radar for autonomous driving

X Zhang, L Wang, J Chen, C Fang, L Yang… - arXiv preprint arXiv …, 2023 - arxiv.org
Radar offers stronger adaptability than widely adopted cameras and LiDARs in adverse scenarios for autonomous driving environmental perception. Compared with commonly …

Camera-LiDAR cross-modality fusion water segmentation for unmanned surface vehicles

J Gao, J Zhang, C Liu, X Li, Y Peng - Journal of Marine Science and …, 2022 - mdpi.com
Water segmentation is essential for the autonomous driving system of unmanned surface
vehicles (USVs), which provides reliable navigation for making safety decisions. However …

Achelous: A fast unified water-surface panoptic perception framework based on fusion of monocular camera and 4D mmWave radar

R Guan, S Yao, X Zhu, KL Man, EG Lim… - 2023 IEEE 26th …, 2023 - ieeexplore.ieee.org
Current perception models for different tasks usually exist as separate modules on Unmanned
Surface Vehicles (USVs), which infer extremely slowly when run in parallel on edge devices, causing …

Radar perception in autonomous driving: Exploring different data representations

S Yao, R Guan, Z Peng, C Xu, Y Shi, Y Yue… - arXiv preprint arXiv …, 2023 - arxiv.org
With the rapid advancements of sensor technology and deep learning, autonomous driving
systems are providing safe and efficient access to intelligent vehicles as well as intelligent …

Vision meets mmwave radar: 3d object perception benchmark for autonomous driving

Y Wang, JH Cheng, JT Huang, SY Kuan… - 2024 IEEE Intelligent …, 2024 - ieeexplore.ieee.org
Sensor fusion is crucial for an accurate and robust perception system on autonomous
vehicles. Most existing datasets and perception solutions focus on fusing cameras and …

Achelous++: Power-Oriented Water-Surface Panoptic Perception Framework on Edge Devices based on Vision-Radar Fusion and Pruning of Heterogeneous …

R Guan, H Zhao, S Yao, KL Man, X Zhu, L Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Urban water-surface robust perception serves as the foundation for intelligent monitoring of
aquatic environments and the autonomous navigation and operation of unmanned vessels …

WODIS: Water obstacle detection network based on image segmentation for autonomous surface vehicles in maritime environments

X Chen, Y Liu, K Achuthan - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
A reliable obstacle detection system is crucial for autonomous surface vehicles (ASVs) to
realize fully autonomous navigation without the need for human intervention. However, the …

Robust small object detection on the water surface through fusion of camera and millimeter wave radar

Y Cheng, H Xu, Y Liu - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
In recent years, unmanned surface vehicles (USVs) have been experiencing growth in
various applications. With the expansion of USVs' application scenarios from the typical marine …

Mask-VRDet: A robust riverway panoptic perception model based on dual graph fusion of vision and 4D mmWave radar

R Guan, S Yao, L Liu, X Zhu, KL Man, Y Yue… - Robotics and …, 2024 - Elsevier
With the development of Unmanned Surface Vehicles (USVs), the perception of inland
waterways has become significant to autonomous navigation. RGB cameras can capture …