The Q-Superlinear Convergence of a Collinear Scaling Algorithm for Unconstrained Optimization
- 1 February 1980
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Numerical Analysis
- Vol. 17 (1), 84–114
- https://doi.org/10.1137/0717011
Abstract
The paper contains a description and an analysis of two modifications of the conjugate gradient method for unconstrained minimization which find a minimum of a conic function after a finite number of steps. Moreover, a further extension of the conjugate gradient method is given, based on a more general class of model functions.
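For context on the conic model functions mentioned in the abstract, the following is a minimal sketch of the conic (collinear-scaling) model in the form commonly associated with Davidon's conic approximations; the symbols f, g, A, and the gauge vector h are standard notation assumed here and are not quoted from this record.

```latex
% Conic model of the objective about the current iterate x_k,
% written in the step s = x - x_k (assumed standard notation):
%   f : function value at x_k
%   g : gradient at x_k
%   A : symmetric model Hessian
%   h : gauge vector; h = 0 recovers the quadratic model
\psi(s) \;=\; f \;+\; \frac{g^{\top} s}{1 + h^{\top} s}
        \;+\; \frac{1}{2}\,\frac{s^{\top} A s}{\left(1 + h^{\top} s\right)^{2}}
```

When h = 0 this reduces to the usual quadratic model employed by quasi-Newton methods, which is the setting in which the Q-superlinear convergence analysis of the title is typically compared.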