Quantitative low-energy x-ray spectroscopy (50–100-Å region)

Abstract
The quantitative analysis of emission spectra in the 10–100-Å region has become of considerable importance for high-temperature plasma diagnostics (10⁶–10⁷ °K region) and for molecular orbital and solid-state-band analysis. Because measured intensities are typically low in these applications, achieving an optimum spectrographic measurement is essential. In order to present specific procedures and methods for optimizing and calibrating a low-energy spectrographic measurement, a molecular orbital analysis in the 70–90-Å region (S-LII,III emission spectra) has been carried out quantitatively in energy and intensity using a recently described single-crystal (lead stearate) spectrographic approach with about 1-eV resolution. Radiative yields, Y, for the radiation process being investigated are determined by the relation Y = Z/(X₀RSTQ), where Z is the area (intensity × angle) under the spectrographic line, X₀ the excitation function, R(λ) the coefficient of reflection of the analyzer, S(λ) the effective source thickness, T(λ) the window transmission, and Q(λ) the quantum counting efficiency of the detector. The determination of each of these parameters has been considered in detail.
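The yield relation above can be sketched as a small calculation. This is a minimal illustration of the formula Y = Z/(X₀RSTQ) only; all numeric values are hypothetical placeholders, not measured data from the paper, and in a real reduction each factor would be evaluated at the wavelength of the line under study.

```python
def radiative_yield(Z, X0, R, S, T, Q):
    """Radiative yield Y = Z / (X0 * R * S * T * Q).

    Z  -- area (intensity x angle) under the spectrographic line
    X0 -- excitation function
    R  -- coefficient of reflection of the analyzer, R(lambda)
    S  -- effective source thickness, S(lambda)
    T  -- window transmission, T(lambda)
    Q  -- quantum counting efficiency of the detector, Q(lambda)
    """
    return Z / (X0 * R * S * T * Q)

# Hypothetical example values in arbitrary consistent units:
Y = radiative_yield(Z=1.2e4, X0=5.0e6, R=2.0e-4, S=0.8, T=0.35, Q=0.9)
```

The point of the form is that every instrument- and sample-dependent factor divides out of the measured line area, so Y characterizes the radiative process itself rather than the spectrograph.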