Depth estimation from a single omnidirectional image using domain adaptation

Y Wu, Y Heng, M Niranjan, H Kim - Proceedings of the 18th ACM SIGGRAPH European Conference on Visual Media …, 2021 - dl.acm.org
Omnidirectional cameras are becoming popular in various applications owing to their ability to capture the full surrounding scene in real time. However, depth estimation for an omnidirectional scene is more difficult than for normal perspective images because of its different system properties and distortions, and standard depth estimation methods such as stereo matching or RGB-D sensing are hard to apply. A deep-learning-based single-shot depth estimation approach can be a good solution, but it requires a large labelled dataset for training. The 3D60 dataset, the largest omnidirectional dataset with depth labels, is not suitable for general scene depth estimation because it covers very limited scenes. To overcome this limitation, we propose a depth estimation architecture for a single omnidirectional image using domain adaptation. The proposed architecture takes labelled source-domain and unlabelled target-domain data together as input and estimates depth information for the target domain using a Generative Adversarial Network (GAN)-based method. With a limited labelled dataset, the proposed architecture achieves more than 10% higher accuracy in depth estimation than traditional encoder-decoder models.
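The data flow described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual architecture: all layer sizes, weight names, and the single-linear-layer encoder/decoder are assumptions chosen only to show how a supervised depth loss on labelled source data combines with an adversarial domain loss on unlabelled target data.

```python
# Hypothetical sketch of GAN-based domain adaptation for depth estimation:
# a shared encoder feeds (a) a depth head supervised on the labelled source
# domain and (b) a domain discriminator that the encoder would be trained to
# fool, aligning source and target features. Shapes and names are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    """Shared feature encoder (one linear layer + ReLU for illustration)."""
    return np.maximum(x @ W, 0.0)

def depth_head(f, V):
    """Depth decoder head: maps features to per-pixel depth (flattened)."""
    return f @ V

def discriminator(f, D):
    """Domain discriminator: probability that features come from the source."""
    return 1.0 / (1.0 + np.exp(-(f @ D)))

# Toy "images": flattened frames of 64 values per sample.
src_x = rng.normal(size=(8, 64))              # labelled source batch
src_depth = rng.uniform(1, 10, size=(8, 64))  # ground-truth depth labels
tgt_x = rng.normal(size=(8, 64))              # unlabelled target batch

W = rng.normal(scale=0.1, size=(64, 32))  # encoder weights
V = rng.normal(scale=0.1, size=(32, 64))  # depth-head weights
D = rng.normal(scale=0.1, size=(32, 1))   # discriminator weights

# Supervised depth loss: computed on the source domain only (L1 here).
f_src = encoder(src_x, W)
sup_loss = np.abs(depth_head(f_src, V) - src_depth).mean()

# Adversarial loss: the discriminator separates domains; the encoder update
# (not shown) would minimise the opposite objective so that target features
# become indistinguishable from source features.
f_tgt = encoder(tgt_x, W)
p_src = discriminator(f_src, D)
p_tgt = discriminator(f_tgt, D)
adv_loss = -(np.log(p_src + 1e-8).mean() + np.log(1.0 - p_tgt + 1e-8).mean())
```

In a full training loop the encoder and depth head would minimise `sup_loss` while the encoder and discriminator play the usual minimax game over `adv_loss`; only the forward pass is shown here.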