| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 4630025 | Applied Mathematics and Computation | 2012 | 10 Pages | |
Abstract
Simple modifications of the limited-memory BFGS method (L-BFGS) for large-scale unconstrained optimization are considered. They consist of corrections, derived from the idea of conjugate directions, applied to the stored difference vectors, utilizing information from the preceding iteration. For quadratic objective functions, the resulting improvement of convergence is optimal in a certain sense, and with unit stepsizes all stored difference vectors are mutually conjugate. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments indicate that the new method often improves on the L-BFGS method significantly.
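To make the setting concrete, the sketch below shows the standard L-BFGS two-loop recursion together with a hypothetical conjugacy-based correction of the newest difference vectors against the previous pair. This is only an illustration of the general idea described in the abstract, not the paper's actual correction formulas; the function names (`correct_pair`, `lbfgs_direction`) and the specific Gram-Schmidt-style correction are assumptions introduced here.

```python
# Illustrative sketch only: standard L-BFGS two-loop recursion plus a
# hypothetical conjugate-direction correction of the newest difference
# vectors. For a quadratic f with Hessian A we have y = A s, so the
# correction makes the new s conjugate to the previous one; the paper's
# actual correction (using information from the preceding iteration) is
# not reproduced here.
import numpy as np

def correct_pair(s, y, s_prev, y_prev):
    """Hypothetical conjugacy correction of the difference vectors (s, y)."""
    denom = y_prev @ s_prev
    if abs(denom) > 1e-12:
        beta = (y_prev @ s) / denom
        s = s - beta * s_prev   # s becomes A-conjugate to s_prev on quadratics
        y = y - beta * y_prev   # keeps y = A s consistent on quadratics
    return s, y

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion computing the direction -H_k * grad."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian approximation: gamma_k * I scaling
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q
```

In this reading, each new pair (s_k, y_k) would be passed through `correct_pair` before being appended to the limited-memory lists, so that on quadratics the stored difference vectors stay conjugate, which is the property the abstract attributes to the proposed modification.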
Related Topics
Physical Sciences and Engineering
Mathematics
Applied Mathematics
Authors
J. Vlček, L. Lukšan
