A novel point cloud registration using 2D image features

CC Lin, YC Tai, JJ Lee, YS Chen - EURASIP Journal on Advances in Signal Processing, 2017 - Springer
Abstract
Since a 3D scanner captures only one scene of a 3D object at a time, registering multiple scenes is a key issue in 3D modeling. This paper presents a novel and efficient 3D registration method based on 2D local feature matching. The proposed method transforms the point clouds into 2D bearing angle images and then uses the 2D feature-based matching method SURF to find matching pixel pairs between the two images. The corresponding points of the 3D point clouds can be obtained from those pixel pairs. Since the corresponding pairs are sorted by the distance between their matching features, only the top half of the pairs is used to find the optimal rotation matrix by least squares approximation. In this paper, the optimal rotation matrix is derived by the orthogonal Procrustes method (an SVD-based approach). The 3D model of an object can therefore be reconstructed by aligning the point clouds with the optimal transformation matrix. Experimental results show that the accuracy of the proposed method is close to that of ICP, while the computation cost is reduced significantly; the method is six times faster than the generalized-ICP algorithm. Furthermore, whereas ICP requires high alignment similarity between the two scenes, the proposed method is robust to larger differences in viewing angle.
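As a rough illustration of the SVD-based orthogonal Procrustes step mentioned in the abstract, the following Python sketch estimates a rigid rotation and translation from a set of corresponding 3D point pairs (e.g., those recovered from the retained top-half SURF matches). The function name and the NumPy-based implementation are our own assumptions for illustration, not code from the paper.

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Estimate rotation R and translation t aligning point set P onto Q.

    P, Q: (N, 3) arrays of corresponding 3D points, e.g. the 3D points
    behind the best (top-half) SURF matches between two bearing angle images.
    Uses the SVD-based orthogonal Procrustes (Kabsch) solution in the
    least squares sense.
    """
    # Center both point sets on their centroids.
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - cp, Q - cq

    # Cross-covariance matrix and its SVD.
    H = Pc.T @ Qc
    U, _, Vt = np.linalg.svd(H)

    # Correct for a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

The determinant correction guards against the degenerate case where the SVD yields a reflection rather than a proper rotation; aligning the scanned point clouds then amounts to applying the returned R and t to one scene before merging it with the other.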