3D ToF LiDAR in mobile robotics: A review

T Yang, Y Li, C Zhao, D Yao, G Chen, L Sun… - arXiv preprint arXiv …, 2022 - arxiv.org
In the past ten years, the use of 3D Time-of-Flight (ToF) LiDARs in mobile robotics has
grown rapidly. Based on our accumulation of relevant research, this article systematically …

Review of visual simultaneous localization and mapping based on deep learning

Y Zhang, Y Wu, K Tong, H Chen, Y Yuan - Remote Sensing, 2023 - mdpi.com
Due to the limitations of LiDAR, such as its high cost, short service life and massive volume,
visual sensors, with their light weight and low cost, are attracting more and more attention and …

Multi-sensor integrated navigation/positioning systems using data fusion: From analytics-based to learning-based approaches

Y Zhuang, X Sun, Y Li, J Huai, L Hua, X Yang, X Cao… - Information …, 2023 - Elsevier
Navigation/positioning systems have become critical to many applications, such as
autonomous driving, Internet of Things (IoT), Unmanned Aerial Vehicle (UAV), and smart …

DSEC: A stereo event camera dataset for driving scenarios

M Gehrig, W Aarents, D Gehrig… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
Once an academic venture, autonomous driving has received unparalleled corporate
funding in the last decade. Still, operating conditions of current autonomous cars are mostly …

Unsupervised scale-consistent depth and ego-motion learning from monocular video

J Bian, Z Li, N Wang, H Zhan, C Shen… - Advances in neural …, 2019 - proceedings.neurips.cc
Recent work has shown that CNN-based depth and ego-motion estimators can be learned
using unlabelled monocular videos. However, the performance is limited by unidentified …

Boreas: A multi-season autonomous driving dataset

K Burnett, DJ Yoon, Y Wu, AZ Li… - … Journal of Robotics …, 2023 - journals.sagepub.com
The Boreas dataset was collected by driving a repeated route over the course of 1 year,
resulting in stark seasonal variations and adverse weather conditions such as rain and …

MulRan: Multimodal range dataset for urban place recognition

G Kim, YS Park, Y Cho, J Jeong… - 2020 IEEE international …, 2020 - ieeexplore.ieee.org
This paper introduces a multimodal range dataset namely for radio detection and ranging
(radar) and light detection and ranging (LiDAR) specifically targeting the urban environment …

End-to-end pseudo-LiDAR for image-based 3D object detection

R Qian, D Garg, Y Wang, Y You… - Proceedings of the …, 2020 - openaccess.thecvf.com
Reliable and accurate 3D object detection is a necessity for safe autonomous driving.
Although LiDAR sensors can provide accurate 3D point cloud estimates of the environment …

NTU VIRAL: A visual-inertial-ranging-LiDAR dataset, from an aerial vehicle viewpoint

TM Nguyen, S Yuan, M Cao, Y Lyu… - … Journal of Robotics …, 2022 - journals.sagepub.com
In recent years, autonomous robots have become ubiquitous in research and daily life.
Among many factors, public datasets play an important role in the progress of this field, as …

The newer college dataset: Handheld lidar, inertial and vision with ground truth

M Ramezani, Y Wang, M Camurri… - 2020 IEEE/RSJ …, 2020 - ieeexplore.ieee.org
In this paper, we present a large dataset with a variety of mobile mapping sensors collected
using a handheld device carried at typical walking speeds for nearly 2.2 km around New …