A covariance matrix method to compute bit error rates in a highly nonlinear dispersion-managed soliton system

Abstract
We present the first complete description of the covariance matrix method and use it to compute the noise evolution in a 10-Gb/s single-channel dispersion-managed soliton system propagating over 24,000 km. The linearization assumption on which the covariance matrix method is based breaks down unless we explicitly separate the phase and timing jitter of each pulse from the noise. We describe a procedure for carrying out this separation.
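The core linearization step can be illustrated with a toy sketch: small noise perturbations around the noise-free solution evolve approximately through the Jacobian of the nonlinear propagation map, so the noise covariance transforms as K → J K Jᵀ. This is a minimal illustration under stated assumptions, not the paper's implementation; the map `f` and its parameters below are hypothetical stand-ins for one fiber span.

```python
import numpy as np

def jacobian(f, u0, eps=1e-6):
    """Numerical Jacobian of the map f at the noise-free state u0
    (central differences)."""
    n = u0.size
    J = np.empty((n, n))
    for k in range(n):
        d = np.zeros(n)
        d[k] = eps
        J[:, k] = (f(u0 + d) - f(u0 - d)) / (2 * eps)
    return J

def propagate_covariance(f, u0, K):
    """Linearized covariance propagation: K -> J K J^T, valid while
    the noise stays small relative to the signal."""
    J = jacobian(f, u0)
    return J @ K @ J.T

# Hypothetical toy nonlinear map standing in for one propagation step.
f = lambda u: u + 0.1 * np.sin(u)
u0 = np.zeros(4)          # noise-free solution (toy)
K = 1e-4 * np.eye(4)      # initial noise covariance (toy)
K_out = propagate_covariance(f, u0, K)
```

In the actual system, phase and timing jitter grow without bound and would violate the small-perturbation assumption, which is why they must be projected out of the noise before this linearized propagation is applied.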