Automatic matching of dissimilar SAM‐images

Abstract
An important step in analysing multispectral image data, measured at different times or at slightly differing positions of the sample, is the matching of these data. The matching process consists of two steps: registration, which removes image differences induced by varying detector positions, and comparison, which follows image analysis and makes it possible to overlay different images and to detect and classify differences in image content [1–3]. We present a new technique for registering uni- and multispectral image data that is independent of the underlying grey-level distributions. This property is achieved by first transforming the image into a set of connected regions with homogeneous grey levels by means of a k-means classification algorithm. The idea is to compare the geometric extents of overlapping regions, rather than single pixel values, for a given geometric transformation; the transformation that yields the best similarity is then chosen to restore the data. The method was applied to several uni- and multispectral scanning Auger microscopy (SAM) images, and the results were judged according to accuracy and robustness.
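The registration idea described above can be illustrated with a minimal sketch: grey-level k-means turns each image into a label map, and candidate geometric transformations (here restricted, for brevity, to integer translations) are scored by the fraction of pixels whose region labels agree. All function names, the parameter choices (`k`, `max_shift`), and the translation-only search are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kmeans_grey(img, k=3, iters=20, seed=0):
    """Cluster pixels by grey level (1-D k-means); return a label map.

    Centres are sorted by grey level at the end so that labels are
    comparable between independently clustered images.
    """
    rng = np.random.default_rng(seed)
    centres = rng.choice(np.unique(img.ravel()), size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(img[..., None] - centres), axis=-1)
        for j in range(k):
            sel = labels == j
            if sel.any():
                centres[j] = img[sel].mean()
    centres.sort()
    return np.argmin(np.abs(img[..., None] - centres), axis=-1)

def register_shift(ref, mov, k=3, max_shift=2):
    """Exhaustive search over integer shifts of the moving image.

    Similarity is the overlap of region labels, not of raw grey values,
    so differing grey-level distributions do not bias the score.
    """
    la = kmeans_grey(ref, k)
    lb = kmeans_grey(mov, k)
    best, best_score = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = np.mean(la == np.roll(lb, (dy, dx), axis=(0, 1)))
            if score > best_score:
                best, best_score = (dy, dx), score
    return best, best_score
```

A full implementation in the spirit of the abstract would compare the geometric extents of connected regions under a richer class of transformations (rotation, scaling), but the label-overlap score above already captures the grey-level independence that motivates the approach.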
