Abstract
The author investigates, by means of computer simulations, the performance of a very long, single-channel optical fiber system operating very close to the zero-dispersion wavelength of the fiber. Fiber losses are compensated by optical amplifiers. The optical signal is filtered after each amplifier, is passed through a final optical filter prior to square-law detection, and is finally filtered electrically. It is found that such a system does not work well if the fiber dispersion is strictly constant and the carrier wavelength of the modulated signal coincides with the zero-dispersion wavelength of the fiber. In this case, the optical signal spectrum spreads to many times its initial width, so that power is lost in the optical filters and the signal-to-noise ratio is degraded by the need to admit a wider band of noise to the receiver.
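The following minimal Python/NumPy sketch mirrors the link the abstract describes: amplified fiber spans at the zero-dispersion wavelength, an optical filter after each amplifier, a final optical filter, square-law detection, and an electrical filter. It is an illustration, not the author's code: all parameter values (span count, loss, Kerr coefficient, filter bandwidths, noise level, pulse format) are assumptions, and the split-step Fourier method stands in for whatever propagation model the paper actually used.

```python
import numpy as np

# Assumed, illustrative parameters -- not taken from the paper.
n_samples = 2 ** 12
t_window  = 4e-9                                  # time window [s]
dt        = t_window / n_samples
t         = (np.arange(n_samples) - n_samples // 2) * dt
freq      = np.fft.fftfreq(n_samples, dt)         # baseband frequency grid [Hz]

n_spans  = 50
span_len = 80e3                                   # span length [m]
alpha    = 0.2e-3 * np.log(10) / 10               # 0.2 dB/km loss -> 1/m (power)
gamma    = 1.3e-3                                 # Kerr coefficient [1/(W m)]
beta2    = 0.0                                    # carrier at the zero-dispersion point
n_steps  = 100                                    # split-step segments per span
dz       = span_len / n_steps

opt_bw  = 100e9                                   # optical filter width [Hz]
elec_bw = 10e9                                    # electrical filter width [Hz]
ase_psd = 1e-18                                   # ASE noise PSD per amplifier [W/Hz]

rng = np.random.default_rng(0)

def optical_filter(field, bw):
    """Gaussian bandpass filter centered on the carrier, applied in frequency."""
    return np.fft.ifft(np.fft.fft(field) * np.exp(-0.5 * (freq / bw) ** 2))

def rms_width(field):
    """RMS width of the power spectrum, a simple measure of spectral spreading."""
    p = np.abs(np.fft.fft(field)) ** 2
    m = np.sum(freq * p) / np.sum(p)
    return np.sqrt(np.sum((freq - m) ** 2 * p) / np.sum(p))

# On-off-keyed train of Gaussian pulses as the modulated single-channel signal.
field = np.zeros(n_samples, dtype=complex)
for k, bit in enumerate(rng.integers(0, 2, 16)):
    if bit:
        t0 = (k - 7.5) * (t_window / 16)
        field += np.sqrt(10e-3) * np.exp(-0.5 * ((t - t0) / 25e-12) ** 2)

w0 = rms_width(field)

# Split-step propagation per segment: Kerr phase, (zero) dispersion, loss; then
# per span: flat-gain amplification, additive ASE noise, and optical filtering.
disp = np.exp(0.5j * beta2 * (2 * np.pi * freq) ** 2 * dz)  # identity when beta2 = 0
for _ in range(n_spans):
    for _ in range(n_steps):
        field *= np.exp(1j * gamma * np.abs(field) ** 2 * dz)
        field  = np.fft.ifft(np.fft.fft(field) * disp)
        field *= np.exp(-alpha * dz / 2)
    field *= np.exp(alpha * span_len / 2)                   # amplifier restores loss
    noise  = rng.normal(size=n_samples) + 1j * rng.normal(size=n_samples)
    field += noise * np.sqrt(ase_psd / (2 * dt))
    field  = optical_filter(field, opt_bw)

# Receiver: final optical filter, square-law detection, electrical lowpass filter.
field   = optical_filter(field, opt_bw)
current = np.abs(field) ** 2                                # photocurrent ~ |E|^2
current = np.real(np.fft.ifft(np.fft.fft(current)
                              * np.exp(-0.5 * (freq / elec_bw) ** 2)))

print(f"RMS spectral width: {w0 / 1e9:.1f} GHz -> {rms_width(field) / 1e9:.1f} GHz")
```

With beta2 = 0 there is no dispersive dephasing, so the accumulated Kerr phase broadens the spectrum unchecked; the printed growth in RMS spectral width reproduces, qualitatively, the spreading and the attendant filter loss that the abstract reports.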
