Abstract
Synthetic aperture radar (SAR) [1,2], like several other imaging systems, measures the complex Fourier transform (called the signal history or phase history) of the scene being imaged, but the measurement often suffers from one-dimensional (1-D) phase errors due to unknown platform motion, target motion, system phase instabilities, and propagation through atmospheric turbulence. If uncorrected, these phase errors can severely blur or smear the imagery. The phase errors (more precisely, the residual errors that remain after correction for the measured motion) can be corrected by digital phase-error correction (sometimes called autofocus) methods, the most widely used being "sub-aperture processing" and "prominent-point processing." The disadvantage of sub-aperture processing, which is analogous to the concept of a Hartmann sensor in optics, is that it works only for low-order (up to about fourth-order) polynomial-type phase errors. The disadvantage of prominent-point processing, which deconvolves the image based on an estimate of the system impulse response, is that it requires an identifiable, isolated, strong point-like reflector in the scene being imaged. Furthermore, both methods require extensive computation.
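As a minimal sketch of the signal model being assumed here (the notation below is introduced for illustration and is not taken from the abstract): let g(x,y) denote the ideal complex image, G(u,v) its Fourier transform (the ideal phase history), and phi_e(u) the unknown 1-D phase-error function. The measured phase history and the resulting degraded image can then be written as

\[
\tilde{G}(u,v) = G(u,v)\, e^{\,j\phi_e(u)}, \qquad
\tilde{g}(x,y) = \mathcal{F}^{-1}\{\tilde{G}\}(x,y) = g(x,y) \ast_x b(x),
\quad b(x) = \mathcal{F}_u^{-1}\{ e^{\,j\phi_e(u)} \}(x),
\]

so that multiplication by the phase error in the Fourier domain appears in the image as a 1-D convolution with the blur kernel b(x) along a single dimension. This is why a single strong, isolated point reflector (as exploited by prominent-point processing) suffices in principle to estimate the system impulse response and deconvolve the image.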