A convergence analysis of a method of steepest descent and a two–step algorithm for nonlinear ill–posed problems
- 1 January 1996
- research article
- Published by Taylor & Francis in Numerical Functional Analysis and Optimization
- Vol. 17 (1–2), 197–214
- https://doi.org/10.1080/01630569608816691
Abstract
In this paper a convergence analysis of a method of steepest descent and a two–step algorithm for solving nonlinear ill–posed problems is presented. Combined with an appropriate stopping criterion, both methods turn out to be stable.
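The stability claimed in the abstract rests on stopping the iteration early, before the noise in the data is amplified. The following sketch illustrates the general idea on a *linear* model problem rather than the paper's nonlinear setting (in the nonlinear case the gradient would involve the Fréchet derivative F'(x)*): steepest descent on the least-squares functional, stopped by a discrepancy-principle criterion. All names, the test operator, and parameter values (`tau`, `delta`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme): steepest descent for a
# linear ill-posed model problem A x = y, stopped early once the residual
# reaches the noise level: ||A x_k - y^delta|| <= tau * delta.

rng = np.random.default_rng(0)
n = 50

# Build an ill-conditioned operator via geometrically decaying singular values.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 2.0 ** -np.arange(n)
A = U @ np.diag(s) @ V.T

x_true = V[:, :5].sum(axis=1)        # "smooth" exact solution (low-order modes)
y_exact = A @ x_true
delta = 1e-3                         # assumed noise level
y_delta = y_exact + delta * rng.standard_normal(n) / np.sqrt(n)

tau = 2.0                            # discrepancy-principle constant, tau > 1
x = np.zeros(n)
for k in range(100_000):
    r = A @ x - y_delta              # residual
    if np.linalg.norm(r) <= tau * delta:
        break                        # stop: residual is at the noise level
    g = A.T @ r                      # gradient of 0.5 * ||A x - y_delta||^2
    omega = (g @ g) / np.linalg.norm(A @ g) ** 2   # exact line-search step
    x -= omega * g

print(k, np.linalg.norm(x - x_true))
```

Without the stopping rule the iteration would eventually fit the noise and the reconstruction error would grow again (semiconvergence); stopping at the discrepancy level is what makes the method a regularization.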