Matching images with different resolutions

Abstract
In this paper we address the problem of matching two images with different resolutions: a high-resolution image and a low-resolution one. On the premise that a change in resolution acts as a smoothing equivalent to a change in scale, a scale-space representation of the high-resolution image is produced. The classical one-to-one image matching paradigm thus becomes one-to-many, because the low-resolution image is compared with all the scale-space representations of the high-resolution one. Key to the success of such a process is the proper representation of the features to be matched in scale-space. We show how to extract interest points at variable scales and we devise a method for comparing two images at two different resolutions. The method comprises photometric- and rotation-invariant descriptors, a geometric model mapping the high-resolution image onto a low-resolution image region, and an image matching strategy based on the robust estimation of this geometric model. Extensive experiments show that our matching method can be used for scale changes up to a factor of 6.
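The sketch below is only an illustration of the one-to-many matching idea described above, not the authors' implementation: the high-resolution image is smoothed and subsampled into a scale-space pyramid, and the low-resolution image is matched against each level, with a RANSAC-estimated homography standing in for the paper's robust geometric model. SIFT features, the 0.75 ratio test, the sigma schedule, and the candidate scale range are assumptions chosen for the example, not taken from the paper.

```python
# Minimal sketch: match a low-resolution image against a scale-space
# pyramid of a high-resolution image and keep the best-supported scale.
import cv2
import numpy as np

def match_across_resolutions(high_res, low_res, scales=(1, 2, 3, 4, 5, 6)):
    """Return (best scale factor, inlier count) over the candidate scales."""
    sift = cv2.SIFT_create()                      # stand-in feature detector/descriptor
    kp_low, desc_low = sift.detectAndCompute(low_res, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best = (None, 0)                              # (scale, number of geometric inliers)

    for s in scales:
        # Treat a resolution reduction by factor s as Gaussian smoothing
        # followed by subsampling (sigma schedule is an assumption).
        blurred = cv2.GaussianBlur(high_res, (0, 0), 0.8 * s)
        level = cv2.resize(blurred, None, fx=1.0 / s, fy=1.0 / s,
                           interpolation=cv2.INTER_AREA)

        kp_hi, desc_hi = sift.detectAndCompute(level, None)
        if desc_hi is None or desc_low is None:
            continue

        # One-to-many paradigm: compare the low-res image with this pyramid level.
        raw = matcher.knnMatch(desc_low, desc_hi, k=2)
        good = [m[0] for m in raw
                if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
        if len(good) < 4:
            continue

        src = np.float32([kp_low[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_hi[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        # Robustly estimate a geometric model (here a homography) with RANSAC
        # and score the scale by its number of inliers.
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        inliers = int(mask.sum()) if mask is not None else 0
        if inliers > best[1]:
            best = (s, inliers)

    return best
```

In this toy version the scale retained is simply the pyramid level whose robustly fitted geometric model gathers the most inliers, which mirrors the idea of selecting, among the scale-space representations, the one that best explains the low-resolution image.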
