Lock-in time-of-flight (ToF) cameras: A survey

S Foix, G Alenyà, C Torras - IEEE Sensors Journal, 2011 - ieeexplore.ieee.org
This paper reviews the state of the art in the field of lock-in time-of-flight (ToF) cameras, their
advantages, their limitations, the existing calibration methods, and the way they are being …

[BOOK] Time-of-flight cameras: principles, methods and applications

M Hansard, S Lee, O Choi, RP Horaud - 2012 - books.google.com
Time-of-flight (TOF) cameras provide a depth value at each pixel, from which the 3D
structure of the scene can be estimated. This new type of active sensor makes it possible to …

Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle

J Gai, L Xiang, L Tang - Computers and Electronics in Agriculture, 2021 - Elsevier
Computer vision provides local environmental information for robotic navigation in crop
fields. It is particularly useful for robots operating under canopies of tall plants such as corn …

Real-time plane segmentation using RGB-D cameras

D Holz, S Holzer, RB Rusu, S Behnke - … 2011: Robot Soccer World Cup XV …, 2012 - Springer
Real-time 3D perception of the surrounding environment is a crucial precondition for the
reliable and safe application of mobile service robots in domestic environments. Using a …

ToF cameras for active vision in robotics

G Alenyà, S Foix, C Torras - Sensors and Actuators A: Physical, 2014 - Elsevier
ToF cameras are now a mature technology that is widely being adopted to provide sensory
input to robotic applications. Depending on the nature of the objects to be perceived and the …

Terrain classification in complex three‐dimensional outdoor environments

À Santamaria‐Navarro, EH Teniente… - Journal of Field …, 2015 - Wiley Online Library
This paper presents two techniques to detect and classify navigable terrain in complex three‐
dimensional (3D) environments. The first method is a low-level on-line mechanism aimed at …

Image-guided ToF depth upsampling: a survey

I Eichhardt, D Chetverikov, Z Janko - Machine Vision and Applications, 2017 - Springer
Recently, there has been remarkable growth of interest in the development and applications
of time-of-flight (ToF) depth cameras. Despite the continuous improvement of their …

Into darkness: Visual navigation based on a lidar-intensity-image pipeline

TD Barfoot, C McManus, S Anderson, H Dong… - Robotics Research: The …, 2016 - Springer
Visual navigation of mobile robots has become a core capability that enables many
interesting applications from planetary exploration to self-driving cars. While systems built on …

Lighting‐invariant visual teach and repeat using appearance‐based lidar

C McManus, P Furgale, B Stenning… - Journal of Field …, 2013 - Wiley Online Library
Visual Teach and Repeat (VT&R) is an effective method to enable a vehicle to repeat any
previously driven route using just a visual sensor and without a global positioning system …

Towards lighting-invariant visual navigation: An appearance-based approach using scanning laser-rangefinders

C McManus, P Furgale, TD Barfoot - Robotics and Autonomous Systems, 2013 - Elsevier
In an effort to facilitate lighting-invariant exploration, this paper presents an appearance-
based approach using 3D scanning laser-rangefinders for two core visual navigation …