Object-guided day-night visual localization in urban scenes

A Benbihi, C Pradalier, O Chum - 2022 26th International Conference on Pattern Recognition (ICPR), 2022 - ieeexplore.ieee.org
We introduce Object-Guided localization (OGuL) based on a novel method of local-feature matching. Direct matching of local features is sensitive to significant changes in illumination. In contrast, object detection often survives severe changes in lighting conditions. The proposed method first detects semantic objects and establishes correspondences of those objects between images. Object correspondences provide a coarse local alignment of the images in the form of a planar homography. These homographies are subsequently used to guide the matching of local features. Experiments on standard urban localization datasets (Aachen, RobotCar-Season) show that OGuL significantly improves localization results with local features as simple as SIFT, and its performance competes with state-of-the-art CNN-based methods trained for day-to-night localization.
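The abstract describes a pipeline of object detection, object correspondences, per-object planar homographies, and homography-guided local-feature matching. The following is a minimal, hypothetical Python/OpenCV sketch of that guided-matching idea only, not the authors' OGuL implementation: the object detector, the way box pairs (`obj_pairs`) are established, the `radius` threshold, and all function and parameter names are assumptions made for illustration.

```python
import cv2
import numpy as np

def box_corners(box):
    """Corners of an (x, y, w, h) box as a float32 array for findHomography."""
    x, y, w, h = box
    return np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]]).reshape(-1, 1, 2)

def object_guided_matching(img_day, img_night, boxes_day, boxes_night,
                           obj_pairs, radius=30.0):
    """Homography-guided SIFT matching between a day and a night image (sketch).

    boxes_day/boxes_night : object boxes (x, y, w, h) from any detector (assumed given)
    obj_pairs             : list of (i, j) indices pairing day boxes with night boxes
    radius                : search radius in pixels around the predicted location
    """
    sift = cv2.SIFT_create()
    kp_d, des_d = sift.detectAndCompute(img_day, None)
    kp_n, des_n = sift.detectAndCompute(img_night, None)
    matches = []
    if des_d is None or des_n is None:
        return matches

    pts_d = np.float32([kp.pt for kp in kp_d]).reshape(-1, 1, 2)
    pts_n = np.float32([kp.pt for kp in kp_n])

    for i, j in obj_pairs:
        # Coarse local alignment: a planar homography estimated from the four
        # corners of the paired object boxes.
        H, _ = cv2.findHomography(box_corners(boxes_day[i]), box_corners(boxes_night[j]))
        if H is None:
            continue

        # Project day keypoints into the night image with H, then match SIFT
        # descriptors only within `radius` of the predicted position.
        proj = cv2.perspectiveTransform(pts_d, H).reshape(-1, 2)
        for a, p in enumerate(proj):
            near = np.where(np.linalg.norm(pts_n - p, axis=1) < radius)[0]
            if len(near) == 0:
                continue
            dists = np.linalg.norm(des_n[near] - des_d[a], axis=1)
            matches.append((a, int(near[np.argmin(dists)])))
    return matches
```

The sketch only illustrates how a per-object homography can restrict the descriptor search to a small region around the predicted keypoint location, which is the coarse-to-fine guidance idea the abstract names; the paper's actual object detection, correspondence, and matching details are not reproduced here.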