Abstract
If one has a set of observables $(z_1, \dots, z_m)$ related to certain parameters $(a_1, \dots, a_n)$ by an equation $\zeta(z_1, \dots; a_1, \dots) = 0$, one frequently faces the problem of determining a set of values of the $a_i$ that minimizes the sum of squared differences between observed and calculated values of a distinguished observable, say $z_m$. If the solution of the above equation for $z_m$, $z_m = \eta(z_1, \dots; a_1, \dots)$, gives rise to a function $\eta$ that is nonlinear in the $a_i$, then one may rely on a version of Gaussian regression [1, 2] for an iteration scheme that converges to a minimizing set of values. It is shown here that this same minimization technique may be used for the solution of systems of simultaneous (not necessarily linear) equations.
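The iteration scheme the abstract refers to can be illustrated by a Gauss-Newton-style sketch (an assumption about the general technique, not a reproduction of the paper's exact formulation): the residuals are linearized about the current parameter estimate and the normal equations are solved for a correction. The example system, the function names, and the starting point below are all hypothetical choices for illustration.

```python
# Illustrative sketch of Gauss-Newton iteration for minimizing a sum of
# squared residuals. Solving simultaneous nonlinear equations is the
# special case in which each equation contributes one residual to be
# driven to zero, as the abstract suggests.

def gauss_newton(residuals, jacobian, a, iters=50, tol=1e-12):
    """Minimize sum(r_i(a)^2) for a two-parameter problem.

    residuals(a) -> [r_1, r_2, ...]
    jacobian(a)  -> rows [dr_i/da1, dr_i/da2]
    The 2x2 normal equations J^T J da = -J^T r are solved by Cramer's rule.
    """
    a1, a2 = a
    for _ in range(iters):
        r = residuals((a1, a2))
        J = jacobian((a1, a2))
        # Accumulate the entries of J^T J and of -J^T r.
        g11 = sum(Ji[0] * Ji[0] for Ji in J)
        g12 = sum(Ji[0] * Ji[1] for Ji in J)
        g22 = sum(Ji[1] * Ji[1] for Ji in J)
        b1 = -sum(Ji[0] * ri for Ji, ri in zip(J, r))
        b2 = -sum(Ji[1] * ri for Ji, ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da1 = (b1 * g22 - g12 * b2) / det
        da2 = (g11 * b2 - g12 * b1) / det
        a1, a2 = a1 + da1, a2 + da2
        if da1 * da1 + da2 * da2 < tol * tol:
            break
    return a1, a2

# Solve the simultaneous system x^2 + y^2 = 4, x*y = 1 by minimizing
# (x^2 + y^2 - 4)^2 + (x*y - 1)^2.
res = lambda a: [a[0] ** 2 + a[1] ** 2 - 4.0, a[0] * a[1] - 1.0]
jac = lambda a: [[2.0 * a[0], 2.0 * a[1]], [a[1], a[0]]]
x, y = gauss_newton(res, jac, (2.0, 0.5))
```

Because the residuals vanish at the solution, the iteration converges rapidly near it; for overdetermined least-squares fitting the same scheme applies, with the residuals at the minimum generally nonzero.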