| Article code | Journal code | Publication year | English article | Full-text version |
|---|---|---|---|---|
| 4637086 | 1340734 | 2006 | 33-page PDF | Free download |
![First-page preview: New quasi-Newton methods for unconstrained optimization problems](/preview/png/4637086.png)
Many methods for solving minimization problems are variants of Newton's method, which requires the Hessian matrix of second derivatives. Quasi-Newton methods are intended for situations where the Hessian is expensive or difficult to calculate: they use only first derivatives to build an approximate Hessian over a number of iterations, updating the approximation at each iteration by a matrix of low rank. In unconstrained minimization, the original quasi-Newton equation is B_{k+1}s_k = y_k, where y_k is the difference of the gradients at the last two iterates. In this paper, we first propose a new quasi-Newton equation B_{k+1}s_k = y_k^*, where y_k^* = y_k + A_k s_k for some matrix A_k. We then give two choices of A_k that carry second-order information from the Hessian of the objective function. The three corresponding BFGS-type algorithms are proved to possess the global convergence property, and superlinear convergence is proved for one of them. Extensive numerical experiments show that the proposed algorithms are very encouraging.
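The BFGS-type scheme described above can be sketched in a few lines. The abstract does not specify the two choices of A_k, so the sketch below takes A_k as a user-supplied callback (`A_fn`, a hypothetical name) and defaults to A_k = 0, which recovers the standard BFGS update; everything else (Armijo backtracking, the curvature safeguard) is a common textbook choice, not the paper's exact algorithm.

```python
import numpy as np

def bfgs_modified(f, grad, x0, A_fn=None, tol=1e-8, max_iter=200):
    """Minimal BFGS sketch with the modified quasi-Newton equation
    B_{k+1} s_k = y_k^*,  y_k^* = y_k + A_k s_k.
    A_fn(x, x_new) should return the matrix A_k (an assumption for
    illustration); A_fn=None means A_k = 0, i.e. standard BFGS."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                     # Hessian approximation B_k
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)         # quasi-Newton direction
        t, c = 1.0, 1e-4                   # Armijo backtracking line search
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s = x_new - x
        y = g_new - g
        if A_fn is not None:
            y = y + A_fn(x, x_new) @ s     # y_k^* = y_k + A_k s_k
        sy = s @ y
        if sy > 1e-12:                     # curvature condition keeps B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
        x, g = x_new, g_new
    return x

# Usage: minimize a convex quadratic whose minimizer is (1, 2).
f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] - 2)])
x_star = bfgs_modified(f, grad, np.zeros(2))
```

The rank-two BFGS update is skipped whenever s_k^T y_k^* is not safely positive; this is one standard way to preserve positive definiteness of B_k when a modified y_k^* is used.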
Journal: Applied Mathematics and Computation - Volume 175, Issue 2, 15 April 2006, Pages 1156–1188