Abstract
The photoelectric emission from F-centers in RbI was measured at 300°K. It was compared with the optical absorption of pure RbI as determined by Fesefeldt's method at the same temperature. The results were analogous to those previously reported for KI, and indicated that exciton-enhanced photoelectric emission reached a peak at hν = 5.6 ev, where the optical absorption had its first maximum. When the films were cooled to 85°K, the exciton-induced emission showed a double maximum separated by a sharp minimum at hν = 5.72 ev. This minimum coincided with the shifted position of the first optical absorption peak at the lower temperature. The spectral distribution of the yield had somewhat the appearance of a self-reversed spectral line. An attractive explanation is that the photoelectric emission at first increases with the optical absorption, reaches a maximum for an absorption constant near 10⁶ cm⁻¹, and then decreases as the absorption constant becomes still larger. The decrease is attributed to a sparsity of F-centers near the surface, and to a growing destruction of excitons at the surface as the layer in which they are formed becomes thinner. Yields near the photoelectric threshold were compared qualitatively with Herring's theory of photoemission from impurities in a polar crystal. An upper limit of 1.9 ev was placed on the electron affinity of these RbI films.