An algorithm for associating the features of two images

Abstract
In this paper we describe an algorithm that operates on the distances between features in two related images and delivers a set of correspondences between them. The algorithm maximizes the inner product of two matrices, one of which is the desired `pairing matrix' and the other a `proximity matrix' with elements $\exp(-r_{ij}^{2}/2\sigma^{2})$, where $r_{ij}$ is the distance between two features, one in each image, and $\sigma$ is an adjustable scale parameter. The output of the algorithm may be compared with the movements that people perceive when viewing two images in quick succession, and it is found that an increase in $\sigma$ affects the computed correspondences in much the same way as an increase in interstimulus interval alters the perceived displacements. Provided that $\sigma$ is not too small, the algorithm will recover the feature mappings that result from image translation, expansion or shear deformation -- transformations of common occurrence in image sequences -- even when the displacements of individual features depart slightly from the general trend.
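
As a rough illustration of the maximization described above, the following sketch builds the Gaussian proximity matrix and then selects the one-to-one pairing whose 0/1 pairing matrix has the largest inner product with it. Treating that selection as a linear assignment problem (solved here with SciPy) is a simplification introduced for illustration only, not necessarily the procedure used in the paper; the function names and the example data are likewise illustrative.

import numpy as np
from scipy.optimize import linear_sum_assignment

def proximity_matrix(features_a, features_b, sigma):
    # G[i, j] = exp(-r_ij^2 / (2 sigma^2)), with r_ij the distance between
    # feature i of the first image and feature j of the second.
    diff = features_a[:, None, :] - features_b[None, :, :]
    r2 = np.sum(diff ** 2, axis=-1)
    return np.exp(-r2 / (2.0 * sigma ** 2))

def associate(features_a, features_b, sigma):
    # Find the one-to-one pairing whose 0/1 pairing matrix maximizes the
    # inner product with the proximity matrix: a maximum-weight assignment.
    G = proximity_matrix(features_a, features_b, sigma)
    rows, cols = linear_sum_assignment(G, maximize=True)
    return list(zip(rows, cols)), G

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.uniform(0, 100, size=(10, 2))                              # features in image 1
    b = a + np.array([3.0, 1.0]) + rng.normal(0, 0.5, size=a.shape)    # translated, slightly perturbed copy
    pairs, _ = associate(a, b, sigma=10.0)
    print(pairs)  # with a moderate sigma the translation mapping is recovered

Note that sigma plays the role of the scale parameter in the abstract: a very small sigma makes the proximity matrix nearly diagonal in distance and the recovered correspondences brittle, while a larger sigma tolerates the small departures of individual features from the overall translation.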
