Abstract
The delayed emission detected at X-ray, optical, and radio wavelengths, or "afterglow," following a γ-ray burst can be described as the emission of a relativistic shell decelerating upon collision with the interstellar medium. We show that the observed image of the radiating surface has well-defined bright edges. We derive an explicit expression for the observed size as a function of time and obtain the surface brightness distribution. This might be directly observed if the burst occurs at a small enough redshift that its radio signal can be resolved. The size and shape are relevant for detailed analyses of scintillation or microlensing. We show that the effective Lorentz factor depends on the observed frequency and is higher for frequencies above the typical synchrotron frequency (optical and X-ray) than for low frequencies (radio). Consequently, the transition to nonrelativistic evolution will be observed first at low frequencies and only a factor of ~2 later at high frequencies.
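As a rough guide to the scaling behind the explicit size expression, the following sketch assumes the standard adiabatic deceleration of a relativistic shell in a uniform ambient medium and keeps only order-of-magnitude relations; the numerical prefactors and the detailed image structure derived in the paper are not reproduced here:

\[
E \sim \gamma^{2} n m_{p} c^{2} R^{3} = \mathrm{const}
\;\Rightarrow\; \gamma \propto R^{-3/2},
\qquad
t \sim \frac{R}{\gamma^{2} c}
\;\Rightarrow\; R \propto t^{1/4},\quad \gamma \propto t^{-3/8},
\]
\[
R_{\perp} \sim \frac{R}{\gamma} \propto t^{5/8},
\]

where \(R\) is the shell radius, \(\gamma\) its Lorentz factor, \(n\) the ambient particle density, \(t\) the observer time, and \(R_{\perp}\) the transverse (observed) size of the afterglow image, set by the relativistically beamed region within an angle \(\sim 1/\gamma\) of the line of sight.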