Image matching is a fundamental task in photogrammetry and computer vision. While effective solutions exist for narrow-baseline viewing conditions, using detectors based, e.g., on differences of Gaussians (DoG) and descriptors such as the scale-invariant feature transform (SIFT), it remains a challenging problem for wide-baseline configurations. This is particularly true when images from unmanned aerial vehicles (UAVs) have to be matched with images taken from the ground. In this paper, we propose a method for wide-baseline image matching that extends the current state-of-the-art approach, matching on demand with view synthesis (MODS), so that even more extreme wide-baseline problems can be solved. We achieve this (1) by using projective transformations during view synthesis to overcome the limitations induced by the approximate character of affine transformations and (2) by estimating the essential matrix during geometric verification to filter incorrect correspondences more robustly when the camera calibration is known. We evaluate our approach on several challenging image pairs, mainly consisting of UAV images together with images taken from the ground, and demonstrate improved performance compared to MODS.
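To illustrate the second contribution, the following is a minimal sketch of essential-matrix-based geometric verification for calibrated cameras, written with OpenCV; the function name, RANSAC parameters, and input conventions are illustrative assumptions, not taken from the paper.

```python
# Sketch of geometric verification via the essential matrix,
# assuming known intrinsics K and putative correspondences
# pts1, pts2 given as (N, 2) arrays of pixel coordinates.
import numpy as np
import cv2


def verify_with_essential_matrix(pts1, pts2, K, ransac_thresh_px=1.0):
    """Estimate E with RANSAC and keep only inlier correspondences."""
    E, inlier_mask = cv2.findEssentialMat(
        pts1, pts2, K,
        method=cv2.RANSAC, prob=0.999, threshold=ransac_thresh_px)
    inliers = inlier_mask.ravel().astype(bool)
    return E, pts1[inliers], pts2[inliers]


# Example usage with a hypothetical calibration matrix:
# K = np.array([[1000.0, 0.0, 640.0],
#               [0.0, 1000.0, 360.0],
#               [0.0, 0.0, 1.0]])
# E, in1, in2 = verify_with_essential_matrix(pts1, pts2, K)
```

Compared with fundamental-matrix estimation, this exploits the known calibration, which constrains the model and typically rejects outliers more reliably, in line with the motivation stated above.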