Article ID: 472742
Journal: Computers & Mathematics with Applications
Published Year: 2007
Pages: 13
File Type: PDF
Abstract

In this paper, we present a multi-step memory gradient method with Goldstein line search for unconstrained optimization problems and prove its global convergence under mild conditions. We also prove that the new method converges at a linear rate when the objective function is uniformly convex. Numerical results show that the new algorithm is suitable for solving large-scale optimization problems and is more stable in practical computation than other methods of this type.
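For orientation, the sketch below illustrates the general shape of a multi-step memory gradient iteration with a Goldstein line search: the search direction combines the steepest descent direction with the previous few directions, and the step size is accepted only when it lies in the Goldstein interval. This is a generic, illustrative reading of the method class, not the authors' specific update rule; the parameters m, eta, and c, and the weighting of the stored directions, are assumptions made here for the example.

```python
import numpy as np

def goldstein_step(f, x, g, d, c=0.25, alpha=1.0, max_iter=50):
    """Return a step size satisfying the Goldstein conditions along d.

    Assumes d is a descent direction, i.e. g.dot(d) < 0.
    """
    fx = f(x)
    slope = g.dot(d)                                 # directional derivative
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        f_new = f(x + alpha * d)
        if f_new > fx + c * alpha * slope:           # step too large: decrease is insufficient
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif f_new < fx + (1 - c) * alpha * slope:   # step too small: lower Goldstein bound violated
            lo = alpha
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            break
    return alpha

def memory_gradient(f, grad, x0, m=2, eta=0.1, tol=1e-6, max_iter=1000):
    """Generic multi-step memory gradient iteration (illustrative sketch only).

    The direction is d_k = -g_k + sum of small multiples of the previous m
    directions, with weights bounded so d_k remains a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    history = []                                     # previous search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g
        gnorm2 = g.dot(g)
        for d_prev in history:
            # Bound each weight so the total memory contribution to g.dot(d)
            # is at most eta * ||g||^2, keeping g.dot(d) <= -(1 - eta) * ||g||^2.
            denom = abs(g.dot(d_prev))
            beta = 0.0 if denom == 0.0 else eta * gnorm2 / (m * denom)
            d = d + beta * d_prev
        alpha = goldstein_step(f, x, g, d)
        x = x + alpha * d
        history = ([d] + history)[:m]                # keep the last m directions
    return x
```

A small usage example on a convex quadratic: with `f = lambda x: 0.5 * x.dot(x)` and `grad = lambda x: x`, calling `memory_gradient(f, grad, np.ones(5))` drives the gradient norm below the tolerance in a handful of iterations, which is the regime (uniformly convex objective) where the abstract claims a linear convergence rate.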

Related Topics: Physical Sciences and Engineering; Computer Science; Computer Science (General)