iToF2dToF: A robust and flexible representation for data-driven time-of-flight imaging

F Gutierrez-Barragan, H Chen, M Gupta… - IEEE Transactions on Computational Imaging, 2021 - ieeexplore.ieee.org
Indirect Time-of-Flight (iToF) cameras are a promising depth sensing technology. However, they are prone to errors caused by multi-path interference (MPI) and low signal-to-noise ratio (SNR). Traditional methods, after denoising, mitigate MPI by estimating a transient image that encodes depths. Recently, data-driven methods that jointly denoise and mitigate MPI have become state-of-the-art without using the intermediate transient representation. In this paper, we propose to revisit the transient representation. Using data-driven priors, we interpolate/extrapolate iToF frequencies and use them to estimate the transient image. Since direct ToF (dToF) sensors capture transient images, we name our method iToF2dToF. The transient representation is flexible: it can be integrated with different rule-based depth sensing algorithms that are robust to low SNR and can handle ambiguous scenarios that arise in practice (e.g., specular MPI, optical cross-talk). We demonstrate the benefits of iToF2dToF over previous methods in real depth sensing scenarios.
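To make the relationship between the two representations concrete, the sketch below (not the authors' code) shows how multi-frequency iToF phasor measurements relate to a dToF-style transient and how depth can be read from the transient peak. The function names (simulate_phasors, phasors_to_transient, depth_from_transient), the 20 MHz harmonic grid, and the use of a plain Fourier synthesis in place of the paper's learned frequency interpolation/extrapolation are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch, assuming phasors at many harmonic frequencies are already
# available (in the paper, these are predicted by a data-driven model from a
# few measured iToF frequencies; that step is omitted here).

C = 3e8  # speed of light (m/s)

def simulate_phasors(transient, freqs, dt):
    """Simulate iToF phasor measurements (complex correlations) of a
    transient sampled at interval dt, for the given modulation freqs (Hz)."""
    t = np.arange(len(transient)) * dt
    # Each phasor is the projection of the transient onto exp(-j*2*pi*f*t).
    return np.array([np.sum(transient * np.exp(-2j * np.pi * f * t)) for f in freqs])

def phasors_to_transient(phasors, freqs, dt, n_bins):
    """Reconstruct a transient from phasors at many frequencies via plain
    Fourier synthesis (a stand-in for the learned estimator in the paper)."""
    t = np.arange(n_bins) * dt
    rec = np.zeros(n_bins)
    for p, f in zip(phasors, freqs):
        rec += np.real(p * np.exp(2j * np.pi * f * t))
    return np.clip(rec, 0.0, None)  # transients are non-negative

def depth_from_transient(transient, dt):
    """dToF-style depth estimate: peak bin -> round-trip time -> distance."""
    tof = np.argmax(transient) * dt
    return C * tof / 2.0

# Toy scene: a direct return at 2 m plus a weaker multi-path return at 3 m.
dt, n_bins = 100e-12, 512                      # 100 ps time bins
transient = np.zeros(n_bins)
transient[int(2 * 2.0 / C / dt)] = 1.0         # direct peak
transient[int(2 * 3.0 / C / dt)] = 0.4         # MPI peak
freqs = np.arange(1, 64) * 20e6                # hypothetical dense harmonics of 20 MHz
rec = phasors_to_transient(simulate_phasors(transient, freqs, dt), freqs, dt, n_bins)
print(f"estimated depth: {depth_from_transient(rec, dt):.2f} m")  # ~2.0 m
```

Because the MPI return lands in a later transient bin than the direct return, peak-finding on the reconstructed transient recovers the correct 2 m depth, which is the kind of rule-based dToF processing the abstract says the transient representation enables.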