On second-order sufficient optimality conditions for C1,1-optimization problems
- 1 January 1988
- journal article
- research article
- Published by Taylor & Francis in Optimization
- Vol. 19 (2), 169-179
- https://doi.org/10.1080/02331938808843333
Abstract
The present paper is concerned with optimization problems in which the data are differentiable functions having a locally Lipschitzian gradient mapping (C1,1-functions). We give second-order sufficient conditions for a stationary solution to be isolated or to be a strict local minimizer. It is shown that the results and ideas known for the case of twice differentiable data can be extended in a natural way. Applications to the analysis of semi-infinite programs, of iterated minimization procedures and of the stability of C1,1-programs are sketched.
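To illustrate the type of condition the abstract refers to, here is a minimal sketch of the unconstrained C1,1 case, using the generalized Hessian of Hiriart-Urruty, Strodiot and Nguyen; this is an assumption about the general setting from the literature, not a restatement of the paper's own (constrained) theorems. Since the gradient of a C1,1 function $f:\mathbb{R}^n \to \mathbb{R}$ is locally Lipschitz, it is twice differentiable almost everywhere, and a generalized Hessian at $\bar{x}$ can be defined as
$$
\partial^2 f(\bar{x}) \;=\; \operatorname{conv}\Bigl\{\, \lim_{k\to\infty} \nabla^2 f(x_k) \;:\; x_k \to \bar{x},\ \nabla^2 f(x_k) \text{ exists} \,\Bigr\}.
$$
A second-order sufficient condition then reads
$$
\nabla f(\bar{x}) = 0 \quad\text{and}\quad d^{\top} H\, d > 0 \ \ \text{for all } H \in \partial^2 f(\bar{x}),\ d \in \mathbb{R}^n \setminus \{0\}
\;\;\Longrightarrow\;\; \bar{x} \text{ is a strict local minimizer of } f.
$$
For constrained C1,1-programs, conditions of this kind are stated with the Lagrangian in place of $f$ and with directions $d$ restricted to a critical cone.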