Abstract
The paper treats general convergence conditions for a class of algorithms for finding the minima of a function f(x) when f(x) is of unknown (or partly unknown) form, and when only noise-corrupted observations can be taken. Such problems occur frequently in adaptive processes and in many applications to statistics and estimation. The algorithms are of the stochastic approximation type. Several forms are dealt with: estimation in either discrete or continuous time, with and without side constraints. The algorithms can be considered as sequential Monte Carlo methods for systems optimization. The main purpose of the paper is to show how the theory of weak convergence of a sequence of probability measures can be used to reduce the convergence problem to "basic" elements, and to yield, quickly and efficiently, a convergence theorem under rather general conditions on the noise. Many extensions are possible, and only a few applications are considered here. The ideas are useful for establishing deeper results on rates of convergence than the usual ones [17]. Extensions to problems where the parameters are abstract-valued are also possible. We mention also that a study of the numerical properties of several algorithms for constrained stochastic approximation (SA) appears in [18].
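For orientation, a prototypical algorithm in this class is the Kiefer-Wolfowitz recursion; the form shown below is an illustrative sketch rather than a recursion taken from the paper itself, which treats a more general class. It descends along a finite-difference estimate of the gradient built from noise-corrupted observations of f:

\[
X_{n+1} = X_n - a_n \, \frac{Y_n^{+} - Y_n^{-}}{2 c_n},
\]

where Y_n^{\pm} are noisy observations of f(X_n \pm c_n), and the gain sequences typically satisfy a_n, c_n \to 0, \sum_n a_n = \infty, and \sum_n (a_n / c_n)^2 < \infty.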