Article ID: 481952
Journal: European Journal of Operational Research
Published Year: 2010
Pages: 11
File Type: PDF
Abstract

An accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique within the framework of the conjugate gradient method. The preconditioner, which is itself a scaled memoryless BFGS matrix, is reset when the Beale–Powell restart criterion holds. The parameter scaling the gradient is selected as a spectral gradient. Concerning the steplength computation, in conjugate gradient algorithms the step lengths may differ from 1 by two orders of magnitude and tend to vary unpredictably; we therefore suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in the function values is significantly improved. Under mild conditions the algorithm is globally convergent for strongly convex functions. Computational results on a set of 750 unconstrained optimization test problems show that this new accelerated scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods: SCALCG [3], [4], [5] and [6], CONMIN by Shanno and Phua (1976, 1978) [42] and [43], Hestenes and Stiefel (1952) [25], Polak–Ribière–Polyak (1969) [32] and [33], Dai and Yuan (2001) [17], Dai and Liao (2001) (t = 1) [14], the conjugate gradient method with sufficient descent condition [7], hybrid Dai and Yuan (2001) [17], hybrid Dai and Yuan zero (2001) [17], and CG_DESCENT by Hager and Zhang (2005, 2006) [22] and [23], as well as the quasi-Newton L-BFGS method [26] and the truncated Newton method of Nash (1985) [27].
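As an illustration of the ingredients described in the abstract, the following is a minimal Python sketch, not the paper's actual algorithm: it builds the search direction from a scaled memoryless BFGS matrix with a spectral scaling parameter and rescales the step length with a quadratic-model acceleration based on one extra gradient evaluation. The Armijo backtracking line search, the simple curvature safeguard used in place of the Beale–Powell restart criterion, and all function names are assumptions made for the example.

import numpy as np

def smlbfgs_direction(g, s, y):
    """Search direction d = -H g, where H is a scaled memoryless BFGS matrix
    built from the last step s and gradient difference y, with the spectral
    scaling theta = (s's)/(s'y)."""
    sy = s @ y
    if sy <= 1e-12:                       # curvature safeguard: restart with steepest descent
        return -g
    theta = (s @ s) / sy                  # spectral (Barzilai-Borwein-type) scaling
    sg, yg = s @ g, y @ g
    return (-theta * g
            + theta * (yg * s + sg * y) / sy
            - (1.0 + theta * (y @ y) / sy) * (sg / sy) * s)

def backtracking(f, x, g, d, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking line search (an assumption; the paper uses a Wolfe search)."""
    fx, gd = f(x), g @ d
    while f(x + alpha * d) > fx + c * alpha * gd:
        alpha *= rho
    return alpha

def accelerated_step(grad, x, d, alpha, g):
    """Acceleration of the step: rescale alpha*d by eta = -a/b, where a and b
    estimate the first and second derivatives of phi(eta) = f(x + eta*alpha*d)
    from one extra gradient evaluation at z = x + alpha*d."""
    z = x + alpha * d
    gz = grad(z)
    a = alpha * (g @ d)                   # phi'(0)
    b = alpha * ((gz - g) @ d)            # finite-difference estimate of phi''(0)
    if b > 1e-12:
        return x + (-a / b) * alpha * d   # minimizer of the quadratic model of phi
    return z                              # fall back to the unaccelerated step

if __name__ == "__main__":
    # Toy usage on a convex quadratic (illustrative only).
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x = np.array([1.0, 1.0, 1.0])
    s = y = None
    for k in range(50):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:
            break
        d = -g if s is None else smlbfgs_direction(g, s, y)
        alpha = backtracking(f, x, g, d)
        x_new = accelerated_step(grad, x, d, alpha, g)
        s, y = x_new - x, grad(x_new) - g
        x = x_new
    print("iterations:", k, "f(x):", f(x))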

Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science (General)
Authors