Abstract
An accurate method for analyzing the joint effect of clipping, quantization, and thermal noise in the analog-to-digital conversion of high-speed orthogonal frequency division multiplexing (OFDM) receivers is presented. The model provides information about the spectral properties of the distortion noise, which allows the analytical characterization of the signal-to-noise-and-distortion ratio at the input of the demodulator. Oversampling, filtering, and both uniform and optimal nonuniform quantization effects are taken into account. The optimal automatic gain control working point is evaluated for each parameter configuration. The differences from the results obtained with the classical pseudo-quantization noise model are highlighted, showing that, because of the spectral characteristics of the distortion, the gain introduced by oversampling is generally a few decibels lower than that predicted by the classical theory. Moreover, the impact of these effects on OFDM receiver performance degradation is evaluated, and the validity of simplified approximations is discussed.
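
As an illustration of the kind of evaluation the abstract describes, the following is a minimal Monte Carlo sketch, not the paper's analytical model. It assumes the OFDM baseband samples can be modeled as complex Gaussian, applies an AGC-dependent clipping level, uniform quantization, and thermal noise, and estimates the signal-to-noise-and-distortion ratio through a Bussgang-type decomposition. The function name sndr_clip_quantize and all parameter values are hypothetical.

import numpy as np

def sndr_clip_quantize(clip_level, n_bits, snr_thermal_db, n_samples=2**18, rng=None):
    """Monte Carlo estimate of the SNDR for a unit-power complex Gaussian
    (OFDM-like) signal that is corrupted by thermal noise, clipped at
    +/- clip_level, and uniformly quantized with n_bits per I/Q rail."""
    rng = np.random.default_rng() if rng is None else rng
    # OFDM baseband samples modeled as unit-power complex Gaussian.
    x = (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples)) / np.sqrt(2)
    # Thermal noise added before the converter.
    noise_power = 10 ** (-snr_thermal_db / 10)
    w = np.sqrt(noise_power / 2) * (rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples))
    r = x + w
    # Clipping models the AGC working point: a larger clip_level means less
    # clipping but a coarser effective quantization of the useful signal.
    re = np.clip(r.real, -clip_level, clip_level)
    im = np.clip(r.imag, -clip_level, clip_level)
    # Uniform mid-rise quantizer over [-clip_level, clip_level].
    step = 2.0 * clip_level / 2**n_bits
    def quantize(v):
        return np.clip(step * (np.floor(v / step) + 0.5),
                       -clip_level + step / 2, clip_level - step / 2)
    y = quantize(re) + 1j * quantize(im)
    # Bussgang decomposition: y = alpha * x + d, with d uncorrelated with x.
    alpha = np.vdot(x, y) / np.vdot(x, x)
    d = y - alpha * x
    sndr = np.abs(alpha) ** 2 * np.mean(np.abs(x) ** 2) / np.mean(np.abs(d) ** 2)
    return 10 * np.log10(sndr)

# Sweep the clipping level (AGC back-off) to locate an optimum working point.
for a in (1.0, 2.0, 3.0, 4.0):
    print(f"clip level {a:.1f}: SNDR = {sndr_clip_quantize(a, n_bits=6, snr_thermal_db=30):.2f} dB")

Sweeping clip_level reproduces the usual trade-off behind the optimal AGC working point: too small a clipping level is dominated by clipping distortion, too large a level by quantization noise; the paper's contribution is to characterize this balance, including its spectral shape, analytically rather than by simulation.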
