Abstract
A theoretical calculation of the effect of isotopic defects on the infrared lattice absorption of ionic crystals is presented. The main concern has been to establish the relative strength of the absorption by local vibrations in such a system as compared with that of the nonlocal modes. It has been found that, when a local mode exists, it may display a vastly enhanced absorption cross section as compared with that of a charged defect in a nonpolar host. This effect arises from the fact that the uniform external field interacts directly with the zero-wave-vector components of the normal modes of the imperfect crystal. This interaction induces transitions into states which are not true eigenstates of the lattice Hamiltonian, and in which there is an effective coupling between band and local modes that results in the scattering and absorption of band-mode phonons by the local vibrations. It has been shown that if the local-mode absorption frequency is close to that of the pure crystal, a small concentration of impurity is sufficient to shift the main absorption to the local-mode frequency, as has been observed in Li(H,D) mixtures containing 5% D impurity. Furthermore, application of the theory to calculating the temperature dependence of the total vibrational absorption by the local modes of U centers (H⁻ or D⁻ ions substituted for halide ions in an alkali halide) has shown that, even for single defects which produce highly localized vibrations, it is incorrect to regard the local-mode and host-lattice absorptions as independent.