Abstract
In order to analyze the imaging properties of an electron-optical system, it is necessary to know how the various sources of aberration combine to increase the size of the final image or spot. Either linear mixing or quadratic mixing (the latter based on the assumption that each aberration produces a Gaussian distribution) has commonly been used. In this paper a theory of aberration mixing is developed, starting from the assumption of a circularly symmetric current-density distribution, each point of which is subjected to a circularly symmetric aberration. If the effective radius of the spot (and of each contribution) is defined as the square root of the normalized second moment of the current-density distribution, then the following theorem is shown to hold: when a spot with a circularly symmetric current-density distribution suffers any circularly symmetric aberration, the effective radius of the resultant spot is the square root of the sum of the squares of the effective radii of the original spot and the aberration disk. If several sources of aberration are present, it follows that the effective radii all add quadratically. For a Gaussian distribution, it is shown that the effective radius is just the 1/e point on the current-density distribution. For a rectangular distribution (e.g., defocusing or, to first approximation, astigmatism) and for the distribution resulting from the spherical aberration of a lens, numerical constants are derived which relate the effective radius to the more usual and often more convenient outer diameter of the aberration disk.
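As a sketch of the quantities the abstract refers to (the symbols J, r, a, r_0, and d below are introduced here only for illustration and are not necessarily the paper's notation), the definition of the effective radius and the mixing theorem can be written as

\[
d^{2} \;=\; \langle r^{2}\rangle \;=\; \frac{\displaystyle\int_{0}^{\infty} r^{2}\,J(r)\,2\pi r\,dr}{\displaystyle\int_{0}^{\infty} J(r)\,2\pi r\,dr},
\qquad
d_{\text{final}}^{2} \;=\; d_{\text{spot}}^{2} \;+\; \sum_{i} d_{i}^{2},
\]

where J(r) is the circularly symmetric current density and the d_i are the effective radii of the individual aberration disks; the quadrature rule follows because second moments add under convolution of circularly symmetric distributions. For a Gaussian, J(r) \(\propto e^{-r^{2}/a^{2}}\), the integrals give \(\langle r^{2}\rangle = a^{2}\), so d = a, the radius at which the current density has fallen to 1/e of its central value. For a uniform (rectangular) disk of outer radius r_0, \(\langle r^{2}\rangle = r_0^{2}/2\), so \(d = r_0/\sqrt{2} \approx 0.71\,r_0\); for the spherical-aberration disk in the Gaussian image plane, assuming a uniformly filled aperture, a similar integral gives \(d = r_0/2\). These worked constants are offered only as an illustration under the stated assumptions and may differ from the paper's tabulated values if a different reference plane or diameter convention is used.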
