A 10-b 120-Msample/s time-interleaved analog-to-digital converter with digital background calibration

Abstract
Digital calibration using adaptive signal processing corrects for offset mismatch, gain mismatch, and sample-time error between time-interleaved channels in a 10-b 120-Msample/s pipelined analog-to-digital converter (ADC). Offset mismatch between channels is overcome with a random chopper-based offset calibration. Gain mismatch and sample-time error are overcome with correlation-based algorithms, which drive the correlation between a signal and its chopped image or its chopped and delayed image to zero. Test results show that, with a 0.99-MHz sinusoidal input, the ADC achieves a peak signal-to-noise-and-distortion ratio (SNDR) of 56.8 dB, a peak integral nonlinearity of 0.88 least significant bit (LSB), and a peak differential nonlinearity of 0.44 LSB. For a 39.9-MHz sinusoidal input, the ADC achieves a peak SNDR of 50.2 dB. The active area is 5.2 mm², and the power dissipation is 234 mW from a 3.3-V supply.
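The chopper-based offset calibration described above can be illustrated with a minimal sketch. The Python snippet below is an idealized model, not the paper's implementation: the test signal x, the chopping sequence c, the step size mu, and the first-order LMS update are all assumptions chosen for illustration. The idea it demonstrates is from the abstract: multiplying the channel input by a pseudo-random ±1 sequence makes the channel's DC offset appear chopped (noise-like) after digital de-chopping, so correlating the corrected output with the chopping sequence recovers the residual offset, which an adaptive loop drives to zero in the background.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized model of one interleaved channel (hypothetical parameters):
# the analog input is chopped by c[n] before conversion, the converter
# adds an unknown DC offset, and calibration runs on the digital output.
N = 200_000
x = np.sin(2 * np.pi * 0.013 * np.arange(N))   # stand-in input signal
c = rng.choice([-1.0, 1.0], size=N)            # pseudo-random chopping sequence
offset = 0.05                                  # unknown channel offset
d = c * x + offset                             # raw digital channel output

# LMS loop: the de-chopped, offset-corrected sample is
#   y[n] = c[n] * (d[n] - o_hat) = x[n] + c[n] * (offset - o_hat),
# so E[c*y] = offset - o_hat, and the correlation-driven update below
# pushes the estimate toward the true offset. mu is an assumed step size.
mu = 1e-4
o_hat = 0.0
for n in range(N):
    y = c[n] * (d[n] - o_hat)   # de-chopped, corrected output sample
    o_hat += mu * c[n] * y      # drive residual correlation to zero

print(f"true offset = {offset:.4f}, estimated = {o_hat:.4f}")
```

Per the abstract, the gain-mismatch and sample-time loops follow the same correlation-nulling principle, using the chopped image and the chopped-and-delayed image of the signal, respectively, in place of the chopping sequence used here.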
