Nonlinear variations in the Faraday effect caused in atomic systems by a strong magnetic field

Abstract
A theory of nonlinear variations in the Faraday effect caused by a strong magnetic field is proposed. In atomic gases, the variation is determined by the nonlinear magneto-optical susceptibility tensor component χ^{em}_{xyzzz}, which is expressed in terms of integrals of the radial Green's functions G_E(r1, r2) of the valence electrons of the atom. Using analytical expressions for G_E(r1, r2) and the wave functions obtained in the model-potential approximation, the authors carried out the first numerical calculations of χ^{em}_{xyzzz} for inert-gas, alkali-metal, and hydrogen atoms. The calculations show that the variations in the Verdet constant proportional to the square of the magnetic field are sufficiently large, even in frequency regions far from resonance, to be detected with currently available pulsed strong-magnetic-field techniques.
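The field-squared correction to the Verdet constant described above can be illustrated with a minimal numerical sketch. This is not the paper's calculation: the functional form θ = V·B·L·(1 + k·B²) is the generic lowest-order nonlinear extension of the linear Faraday law θ = V·B·L, and the values of V, k, B, and L below are arbitrary placeholders, not computed or measured constants.

```python
def faraday_rotation(B, L, V, k):
    """Rotation angle (rad) for path length L (m) in field B (T):
    linear Verdet term times a hypothetical B^2 correction factor."""
    return V * B * L * (1.0 + k * B**2)

# Effective Verdet constant V_eff = theta / (B * L) grows quadratically
# with B when k > 0, mimicking the strong-field variation discussed above.
L = 0.01          # m, sample length (placeholder)
V = 3.0e-4        # rad/(T*m), linear Verdet constant (placeholder)
k = 1.0e-5        # 1/T^2, nonlinear coefficient (placeholder)
for B in (10.0, 50.0):  # tesla, typical pulsed strong fields
    v_eff = faraday_rotation(B, L, V, k) / (B * L)
    print(f"B = {B:5.1f} T: V_eff/V = {v_eff / V:.4f}")
```

The relative change V_eff/V − 1 = k·B² grows 25-fold between 10 T and 50 T in this toy model, which is why pulsed strong-field techniques make the quadratic variation accessible.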