Article ID: 10997877
Journal: Journal of Computational and Applied Mathematics
Published Year: 2019
Pages: 18
File Type: PDF
Abstract
Conjugate gradient methods (CGMs) are very effective iterative methods for solving large-scale unconstrained optimization problems. The aim of this work is to improve the Fletcher-Reeves (FR) and Dai-Yuan (DY) CGMs. First, starting from the conjugate parameters of the FR and DY methods and incorporating the second inequality of the strong Wolfe line search, two new conjugate parameters are constructed. Second, using these two new conjugate parameters, another FR-type conjugate parameter is presented. Third, with the strong Wolfe line search used to determine the steplength, three improved CGMs are proposed for large-scale unconstrained optimization. Under standard assumptions, all three improved methods are proved to possess the sufficient descent property and global convergence. Finally, three groups of numerical experiments and their corresponding performance profiles are reported, showing that the proposed methods are very promising.
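To make the classical conjugate parameters the abstract builds on concrete, here is a minimal sketch of a Fletcher-Reeves CGM on a quadratic objective, where an exact step length is available. This is only an illustration of the baseline FR scheme, not the paper's improved variants (which modify the conjugate parameter and use a strong Wolfe line search); the function name and setup are hypothetical.

```python
import numpy as np

def fletcher_reeves_cg(A, b, x0, tol=1e-8, max_iter=200):
    """Classical Fletcher-Reeves CGM on f(x) = 0.5*x'Ax - b'x (A symmetric
    positive definite). Illustrative only: for a general objective the step
    length would come from a (strong Wolfe) line search instead of the
    exact quadratic formula used here.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                 # gradient of the quadratic
    d = -g                        # initial direction: steepest descent
    for _ in range(max_iter):
        gg = g @ g
        if np.sqrt(gg) < tol:
            break
        alpha = -(g @ d) / (d @ A @ d)   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        # FR conjugate parameter: beta = ||g_{k+1}||^2 / ||g_k||^2.
        # (The DY parameter would instead be ||g_{k+1}||^2 / (d_k' y_k)
        # with y_k = g_{k+1} - g_k.)
        beta_fr = (g_new @ g_new) / gg
        d = -g_new + beta_fr * d
        g = g_new
    return x
```

On a quadratic, this recovers the solution of Ax = b; the abstract's improved methods differ in how the conjugate parameter is bounded using the second strong Wolfe inequality.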
Related Topics
Physical Sciences and Engineering Mathematics Applied Mathematics