Article code: 4640957
Journal code: 1632053
Publication year: 2009
English article: 13 pages, PDF
Full-text version: Free download
English title of the ISI article
Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization
Related subjects
Engineering and Basic Sciences; Mathematics; Applied Mathematics
English abstract

In this paper we propose a fundamentally different conjugate gradient method, in which the well-known parameter βk is computed by approximating the Hessian/vector product through finite differences. For the search direction computation, the method uses a forward difference approximation to the Hessian/vector product in combination with a careful choice of the finite difference interval. For the step length computation we suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm remains linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms, including CONMIN by Shanno and Phua [D.F. Shanno, K.H. Phua, Algorithm 500, minimization of unconstrained multivariate functions, ACM Trans. Math. Softw. 2 (1976) 87–94], SCALCG by Andrei [N. Andrei, Scaled conjugate gradient algorithms for unconstrained optimization, Comput. Optim. Appl. 38 (2007) 401–416; N. Andrei, Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optim. Methods Softw. 22 (2007) 561–571; N. Andrei, A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Appl. Math. Lett. 20 (2007) 645–650], and the new conjugacy condition and related new conjugate gradient method by Li, Tang and Wei [G. Li, C. Tang, Z. Wei, New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. Comput. Appl. Math. 202 (2007) 523–539], as well as the truncated Newton method (TN) by Nash [S.G. Nash, Preconditioning of truncated-Newton methods, SIAM J. Sci. Stat. Comput. 6 (1985) 599–616], using a set of 750 unconstrained optimization test problems, show that the suggested algorithm outperforms these conjugate gradient algorithms as well as TN.
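The core ingredient the abstract describes, approximating the Hessian/vector product ∇²f(x)d by a forward difference of gradients so that no Hessian is ever formed, can be illustrated with a short sketch. The snippet below is a minimal illustration of the general technique only, not the paper's implementation; the function name `hessian_vector_product`, the interval heuristic h = 2√(ε_mach)(1 + ‖x‖)/‖d‖, and the quadratic test problem are assumptions made for demonstration.

```python
import numpy as np

def hessian_vector_product(grad, x, d, eps_mach=np.finfo(float).eps):
    """Forward-difference approximation of the Hessian/vector product:
    H(x) @ d ~= (grad(x + h*d) - grad(x)) / h,
    costing one extra gradient evaluation instead of forming H."""
    norm_d = np.linalg.norm(d)
    if norm_d == 0.0:
        return np.zeros_like(d)
    # A common heuristic for the finite difference interval: balance the
    # truncation error (which prefers small h) against rounding error
    # (which prefers large h). The paper's exact rule may differ.
    h = 2.0 * np.sqrt(eps_mach) * (1.0 + np.linalg.norm(x)) / norm_d
    return (grad(x + h * d) - grad(x)) / h

# Example on a convex quadratic f(x) = 0.5 x^T A x, whose Hessian is A,
# so the approximation should be close to A @ d.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x
x = np.array([1.0, -2.0])
d = np.array([0.5, 1.0])
print(hessian_vector_product(grad, x, d))  # ~= A @ d = [3.0, 3.5]
```

For a quadratic the forward difference is exact up to rounding, which makes it a convenient sanity check; for general nonlinear functions the interval choice governs the accuracy of the product used in computing βk.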

Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Computational and Applied Mathematics - Volume 230, Issue 2, 15 August 2009, Pages 570–582
Authors