SynWoodScape: Synthetic surround-view fisheye camera dataset for autonomous driving

AR Sekkat, Y Dupuis, VR Kumar… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org
Surround-view cameras are a primary sensor for automated driving, used for near-field
perception. They are among the most commonly used sensors in commercial vehicles, primarily …

X-Align: Cross-Modal Cross-View Alignment for Bird's-Eye-View Segmentation

S Borse, M Klingner, VR Kumar, H Cai… - Proceedings of the …, 2023 - openaccess.thecvf.com
The bird's-eye-view (BEV) grid is a common representation for the perception of road
components, e.g., drivable area, in autonomous driving. Most existing approaches rely on …

A Deep Analysis of Visual SLAM Methods for Highly Automated and Autonomous Vehicles in Complex Urban Environment

K Wang, G Zhao, J Lu - IEEE Transactions on Intelligent …, 2024 - ieeexplore.ieee.org
In the context of automated driving, navigating through challenging urban environments with
dynamic objects, large-scale scenes, and varying lighting/weather conditions, achieving …

Adversarial attacks on multi-task visual perception for autonomous driving

I Sobh, A Hamed, VR Kumar, S Yogamani - arXiv preprint arXiv …, 2021 - arxiv.org
In recent years, deep neural networks (DNNs) have achieved impressive success in various
applications, including autonomous driving perception tasks. On the other …

LiMoSeg: Real-time Bird's Eye View based LiDAR Motion Segmentation

S Mohapatra, M Hodaei, S Yogamani, S Milz… - arXiv preprint arXiv …, 2021 - arxiv.org
Moving object detection and segmentation is an essential task in the autonomous driving
pipeline. Detecting and isolating the static and moving components of a vehicle's surroundings …

An online learning system for wireless charging alignment using surround-view fisheye cameras

A Dahal, VR Kumar, S Yogamani… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Electric vehicles are increasingly common, with inductive chargepads considered a
convenient and efficient means of charging them. However, drivers are typically …

A System for Dense Monocular Mapping with a Fisheye Camera

L Gallagher, G Sistu, J Horgan… - Proceedings of the …, 2023 - openaccess.thecvf.com
We introduce a novel dense mapping system that uses a single monocular fisheye camera
as the sole input sensor and incrementally builds a dense surfel representation of the …

Woodscape Fisheye Object Detection for Autonomous Driving--CVPR 2022 OmniCV Workshop Challenge

S Ramachandran, G Sistu, VR Kumar… - arXiv preprint arXiv …, 2022 - arxiv.org
Object detection is a comprehensively studied problem in autonomous driving. However, it
has been relatively less explored in the case of fisheye cameras. The strong radial distortion …

UnShadowNet: Illumination critic guided contrastive learning for shadow removal

S Dasgupta, A Das, S Yogamani, S Das, C Eising… - IEEE …, 2023 - ieeexplore.ieee.org
Shadows are frequently encountered natural phenomena that significantly hinder the
performance of computer vision perception systems in practical settings, e.g., autonomous …

WoodScape Motion Segmentation for Autonomous Driving--CVPR 2023 OmniCV Workshop Challenge

S Ramachandran, N Cibik, G Sistu… - arXiv preprint arXiv …, 2023 - arxiv.org
Motion segmentation is a complex yet indispensable task in autonomous driving. The
challenges introduced by the ego-motion of the cameras, radial distortion in fisheye lenses …