Shot Noise of Single-Electron Tunneling in 1D Arrays

Abstract
We have used numerical modeling and a semi-analytical calculation method to find the low-frequency value S_{I}(0) of the spectral density of fluctuations of current through 1D arrays of small tunnel junctions, using the ``orthodox theory'' of single-electron tunneling. In all three array types studied, at low temperature (kT << eV), increasing current induces a crossover from the Schottky value S_{I}(0)=2eI to the ``reduced Schottky value'' S_{I}(0)=2eI/N (where I is the dc current and N is the array length) at some crossover current I_{c}. In uniform arrays over a ground plane, I_{c} is proportional to exp(-\lambda N), where 1/\lambda is the single-electron soliton length. In arrays without a ground plane, I_{c} decreases slowly with both N and \lambda. Finally, we have calculated the statistics of I_{c} for ensembles of arrays with random background charges. The standard deviation of I_{c} from the ensemble average \langle I_{c}\rangle is quite large, typically between 0.5 and 0.7 of \langle I_{c}\rangle, while the dependence of \langle I_{c}\rangle on N or \lambda is so weak that it is hidden within the random fluctuations of the crossover current.
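As a compact restatement (not part of the original abstract), the crossover can be written in terms of the Fano factor F = S_{I}(0)/2eI; this notation is introduced here for illustration only, while the limiting values and the scaling of I_{c} are exactly those quoted above:

\[
  F(I) \equiv \frac{S_{I}(0)}{2eI} \;\to\;
  \begin{cases}
    1,   & I \ll I_{c} \quad \text{(Schottky value)} \\
    1/N, & I \gg I_{c} \quad \text{(reduced Schottky value)}
  \end{cases}
  \qquad
  I_{c} \propto \exp(-\lambda N) \quad \text{(uniform array over a ground plane)}
\]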
