Perception and sensing for autonomous vehicles under adverse weather conditions: A survey

Y Zhang, A Carballo, H Yang, K Takeda - ISPRS Journal of …, 2023 - Elsevier
Automated Driving Systems (ADS) open up a new domain for the automotive
industry and offer new possibilities for future transportation with higher efficiency and …

Surround-view fisheye camera perception for automated driving: Overview, survey & challenges

VR Kumar, C Eising, C Witt… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Surround-view fisheye cameras are commonly used for near-field sensing in automated
driving. Four fisheye cameras on four sides of the vehicle are sufficient to cover 360° around …

Deep reinforcement learning for autonomous driving: A survey

BR Kiran, I Sobh, V Talpaert, P Mannion… - IEEE Transactions …, 2021 - ieeexplore.ieee.org
With the development of deep representation learning, reinforcement learning
(RL) has become a powerful learning framework now capable of learning complex policies …

OmniDet: Surround view cameras based multi-task visual perception network for autonomous driving

VR Kumar, S Yogamani, H Rashed… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
Surround View fisheye cameras are commonly deployed in automated driving for 360° near-
field sensing around the vehicle. This work presents a multi-task visual perception network …

SynDistNet: Self-supervised monocular fisheye camera distance estimation synergized with semantic segmentation for autonomous driving

VR Kumar, M Klingner, S Yogamani… - Proceedings of the …, 2021 - openaccess.thecvf.com
State-of-the-art self-supervised learning approaches for monocular depth estimation usually
suffer from scale ambiguity. They do not generalize well when applied on distance …

Generalized object detection on fisheye cameras for autonomous driving: Dataset, representations and baseline

H Rashed, E Mohamed, G Sistu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Object detection is a comprehensively studied problem in autonomous driving. However, it
has been relatively less explored in the case of fisheye cameras. The standard bounding …

FuseMODNet: Real-time camera and LiDAR based moving object detection for robust low-light autonomous driving

H Rashed, M Ramzy, V Vaquero… - Proceedings of the …, 2019 - openaccess.thecvf.com
Moving object detection is a critical task for autonomous vehicles. As dynamic objects
represent higher collision risk than static ones, our own ego-trajectories have to be planned …

Weather and light level classification for autonomous driving: Dataset, baseline and active learning

MM Dhananjaya, VR Kumar… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Autonomous driving is rapidly advancing, and Level 2 functions are becoming a standard
feature. One of the foremost outstanding hurdles is to obtain robust visual perception in …

FisheyeDistanceNet: Self-supervised scale-aware distance estimation using monocular fisheye camera for autonomous driving

VR Kumar, SA Hiremath, M Bach, S Milz… - … on robotics and …, 2020 - ieeexplore.ieee.org
Fisheye cameras are commonly used in applications like autonomous driving and
surveillance to provide a large field of view (>180°). However, they come at the cost of …

Multi-weather city: Adverse weather stacking for autonomous driving

V Mușat, I Fursa, P Newman… - Proceedings of the …, 2021 - openaccess.thecvf.com
Autonomous vehicles make use of sensors to perceive the world around them, with heavy
reliance on vision-based sensors such as RGB cameras. Unfortunately, since these sensors …