Globally convergent block-coordinate techniques for unconstrained optimization
- 1 January 1999
- Research article
- Published by Taylor & Francis in Optimization Methods and Software
- Vol. 10 (4), 587–637
- https://doi.org/10.1080/10556789908805730
Abstract
In this paper we define new classes of globally convergent block-coordinate techniques for the unconstrained minimization of a continuously differentiable function. More specifically, we first describe conceptual models of decomposition algorithms based on the interconnection of elementary operations performed on the block components of the variable vector. Then we characterize the elementary operations defined through a suitable line search or the global minimization in a component subspace. Using these models, we establish new results on the convergence of the nonlinear Gauss–Seidel method and we prove that this method with a two-block decomposition is globally convergent towards stationary points, even in the absence of convexity or uniqueness assumptions. In the general case of a nonconvex objective function and an arbitrary decomposition we define new globally convergent line-search-based schemes that may also include partial global minimizations with respect to some components. Computational aspects are discussed and, in particular, an application to a learning problem in a Radial Basis Function neural network is illustrated.
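As a rough illustration of the two-block nonlinear Gauss–Seidel scheme discussed in the abstract, the sketch below alternates minimizations over the two blocks of variables until the iterates stop moving. The objective function, block sizes, tolerance, and use of a general-purpose solver are illustrative assumptions; the line-search-based variants and convergence safeguards analyzed in the paper are not reproduced here.

```python
# A minimal sketch of two-block nonlinear Gauss-Seidel (alternating
# minimization); the objective f and all numerical settings below are
# illustrative choices, not taken from the paper.
import numpy as np
from scipy.optimize import minimize

def f(x, y):
    # Illustrative smooth, nonconvex objective coupling the two blocks.
    return (x[0] - 1.0) ** 2 + (y[0] + 2.0) ** 2 + 0.5 * np.sin(x[0] * y[0])

def two_block_gauss_seidel(x0, y0, tol=1e-8, max_iter=100):
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        # Minimize over the first block with the second block held fixed.
        x_new = minimize(lambda u: f(u, y), x).x
        # Minimize over the second block with the updated first block fixed.
        y_new = minimize(lambda v: f(x_new, v), y).x
        # Stop when neither block moves appreciably.
        if np.linalg.norm(x_new - x) + np.linalg.norm(y_new - y) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    return x, y

x_star, y_star = two_block_gauss_seidel([0.0], [0.0])
print(x_star, y_star, f(x_star, y_star))
```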