A note on the convergence of subgradient optimization methods
- 1 January 1983
- journal article
- research article
- Published by Taylor & Francis in Mathematische Operationsforschung und Statistik. Series Optimization
- Vol. 14 (4), 537-541
- https://doi.org/10.1080/02331938308842889
Abstract
The problem of minimizing a convex function f over a closed convex set C on which it is subdifferentiable can be approached by methods of subgradient optimization. In this note, a simple condition is given under which a sequence generated by such a method is convergent if and only if f attains its infimum over C. An answer to the question of when this condition can be satisfied is also given.
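For a concrete picture of the kind of iteration the abstract refers to, the following is a minimal sketch of a generic projected subgradient method. The step-size rule, the example function f(x) = |x − 2|, and the feasible set C = [0, 1] are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, steps, n_iter=200):
    """Generic projected subgradient iteration:
    x_{k+1} = P_C(x_k - t_k * g_k), with g_k a subgradient of f at x_k."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        g = subgrad(x)                    # any subgradient of f at x
        x = project(x - steps(k) * g)     # take a step, project back onto C
    return x

# Illustrative example: minimize f(x) = |x - 2| over C = [0, 1].
# f attains its infimum over C at x = 1.
x_star = projected_subgradient(
    subgrad=lambda x: np.sign(x - 2.0),       # a subgradient of |x - 2|
    project=lambda x: np.clip(x, 0.0, 1.0),   # Euclidean projection onto [0, 1]
    x0=np.array([0.5]),
    steps=lambda k: 1.0 / (k + 1),            # diminishing, non-summable step sizes
)
print(x_star)  # approaches 1.0
```

The diminishing, non-summable step sizes used here are the classical choice for which such iterations are known to approach the constrained infimum; the paper's contribution concerns when the generated sequence itself converges.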