To achieve autonomous driving, developing 3D detection fusion methods, which aim to fuse camera and LiDAR information, has drawn great research interest in recent years. As a …
State-of-the-art self-supervised learning approaches for monocular depth estimation usually suffer from scale ambiguity. They do not generalize well when applied to distance …
Autonomous driving is rapidly advancing, and Level 2 functions are becoming a standard feature. One of the foremost hurdles is to obtain robust visual perception in …
Bird's-eye-view (BEV) grid is a common representation for the perception of road components, e.g., drivable area, in autonomous driving. Most existing approaches rely on …
V Mușat, I Fursa, P Newman… - Proceedings of the …, 2021 - openaccess.thecvf.com
Autonomous vehicles make use of sensors to perceive the world around them, with heavy reliance on vision-based sensors such as RGB cameras. Unfortunately, since these sensors …
VR Kumar, SA Hiremath, M Bach, S Milz… - … on robotics and …, 2020 - ieeexplore.ieee.org
Fisheye cameras are commonly used in applications like autonomous driving and surveillance to provide a large field of view (>180°). However, they come at the cost of …
Cameras are the primary sensor in automated driving systems. They provide high information density and are optimal for detecting road infrastructure cues laid out for human …
Wide-angle fisheye cameras are commonly used in automated driving for parking and low-speed navigation tasks. Four such cameras form a surround-view system that provides a …
Assisted and automated driving functions are increasingly deployed to improve safety and efficiency and to enhance the driver experience. However, there are still key …