A multiple-interpolation method for fast and precise time digitizing

Abstract
A method for digitizing time intervals is described that utilizes a single interpolator applied repeatedly to perform a multiple interpolation at each input event. The analysis shows that the method features the shortest dead time in comparison with the single and dual interpolation methods. For a given channel width and clock period, an optimum number of interpolation steps may be found that minimizes the dead time of the digitizer. A detailed description of a model of the multiple interpolator is presented, together with the results of experimental tests. Microprocessor control was utilized to accomplish automatic calibration and testing of the model. For an integer stretch factor of 4, five interpolation steps, and a 10-MHz clock, an incremental resolution of 97.6 ps was obtained, as well as a mean conversion time of 3.9 μs, an integral nonlinearity error of 0.07 percent (rms), and a differential nonlinearity error of 0.97 percent (rms).
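A brief worked check of the quoted resolution may be helpful. The relation below is implied by, rather than stated in, the abstract: for an $n$-step interpolation with integer stretch factor $k$ and clock period $T_0$, each step refines the residue by a factor $k$, so the incremental resolution (channel width) $q$ is presumably

$$
q = \frac{T_0}{k^n} = \frac{100\,\mathrm{ns}}{4^5} = \frac{100\,\mathrm{ns}}{1024} \approx 97.66\,\mathrm{ps},
$$

which agrees with the 97.6-ps figure reported above (the symbols $q$, $T_0$, $k$, and $n$ are assumed notation, not taken from the source).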