Abstract
The influence of spectral-line blending, line-wing loss and self-absorption on the experimental determination of the integrated intensity of various Ar I lines emitted by a homogeneous LTE plasma has been investigated using a computer-based model of the spectrum. Correction factors have been determined for the 714.7, 560.67, 641.63 and 430.01 nm lines as a function of electron density and, as expected, have been found to be largest for lines originating from the most highly excited levels. These results appear to explain some recent experimental observations more simply than explanations in terms of high-density corrections to the Boltzmann distribution of number densities.
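As an illustration only (not the authors' model), the sketch below shows how such correction factors can be estimated from a synthetic spectrum, assuming Lorentzian (Stark-broadened) profiles, a single hypothetical blending neighbour, and a finite integration window; self-absorption is omitted for brevity, and all wavelengths, widths and relative intensities here are invented for the example.

```python
import numpy as np

def lorentzian(wl, center, fwhm):
    """Area-normalized Lorentzian profile, a stand-in for a Stark-broadened line."""
    hwhm = fwhm / 2.0
    return hwhm / (np.pi * ((wl - center) ** 2 + hwhm ** 2))

# Hypothetical parameters: the line of interest plus one blending neighbour.
wl = np.linspace(700.0, 730.0, 200001)        # wavelength grid, nm
line_center, line_fwhm = 714.7, 0.5           # nm; width grows with electron density
blend_center, blend_rel = 715.5, 0.3          # neighbour at 30% of the line's area

spectrum = (lorentzian(wl, line_center, line_fwhm)
            + blend_rel * lorentzian(wl, blend_center, line_fwhm))

# "Experimental" integration over a finite window around the line: part of the
# wings falls outside the window, while part of the neighbour's profile is
# wrongly included, so the apparent integral differs from the true one.
dw = wl[1] - wl[0]
window = np.abs(wl - line_center) <= 2.0      # +/- 2 nm integration window
apparent = np.sum(spectrum[window]) * dw

true_intensity = 1.0                          # the unblended line is area-normalized
correction = true_intensity / apparent
print(f"apparent integral = {apparent:.4f}, correction factor = {correction:.4f}")
```

Repeating such a calculation with the line width scaled to the electron density (via the Stark-broadening coefficient of each transition) yields density-dependent correction factors of the kind reported in the abstract, largest for the broad lines from highly excited levels.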