Article ID | Journal ID | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
5776146 | 1631963 | 2018 | 19-page PDF | Free download |
English Title of the ISI Article
Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
Translated Title
Several efficient gradient methods with approximate optimal stepsizes for large-scale unconstrained optimization
Related Topics
Engineering and Basic Sciences
Mathematics
Applied Mathematics
English Abstract
In this paper we introduce a new concept of approximate optimal stepsize for gradient methods, use it to interpret the favorable numerical behavior of the Barzilai-Borwein (BB) method, and present several efficient gradient methods with approximate optimal stepsizes for large-scale unconstrained optimization. Based on revising some modified BFGS update formulae, we construct new quadratic approximation models from which several approximate optimal stepsizes are derived. Remarkably, these approximate optimal stepsizes lie in intervals that contain the two well-known BB stepsizes. We then truncate these approximate optimal stepsizes by the two BB stepsizes and use the resulting values as the new stepsizes for gradient methods. Moreover, for the nonconvex case, we design a new approximation model to generate an approximate optimal stepsize for gradient methods. We establish the convergence of the proposed methods under weaker conditions. Numerical results show that the proposed methods are very promising.
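The abstract describes truncating a candidate "approximate optimal" stepsize into the interval spanned by the two Barzilai-Borwein (BB) stepsizes before using it in a gradient method. The sketch below is a minimal Python illustration of that truncation idea on a convex quadratic, not the authors' algorithm: the choice of candidate stepsize (the exact steepest-descent step for the quadratic), the test problem, and the iteration limits are all assumptions made for this toy example.

```python
import numpy as np

def truncated_stepsize_gradient_method(A, b, x0, max_iter=2000, tol=1e-8):
    """Toy gradient method for f(x) = 0.5 x'Ax - b'x (so grad f = Ax - b).
    The candidate stepsize is truncated into the interval spanned by the two
    BB stepsizes, illustrating the truncation step mentioned in the abstract.
    This is a sketch under stated assumptions, not the paper's method."""
    x = x0.astype(float)
    g = A @ x - b
    s = y = None
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # candidate stepsize: exact minimizer of the quadratic along -g
        # (stands in for the paper's 'approximate optimal stepsize')
        alpha = (g @ g) / (g @ (A @ g))
        if s is not None and s @ y > 0:
            bb1 = (s @ s) / (s @ y)   # long BB stepsize
            bb2 = (s @ y) / (y @ y)   # short BB stepsize
            # truncation: keep the candidate inside the BB interval
            alpha = min(max(alpha, min(bb1, bb2)), max(bb1, bb2))
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        x, g = x_new, g_new
    return x

# usage: ill-conditioned convex quadratic (assumed test problem)
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_hat = truncated_stepsize_gradient_method(A, b, np.zeros(3))
print(np.linalg.norm(A @ x_hat - b))  # residual gradient norm
```

For convex quadratics the two BB stepsizes bracket Rayleigh-quotient-type estimates of the inverse Hessian's spectrum, which is why clipping a model-based stepsize into their interval is a natural safeguard.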
Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Computational and Applied Mathematics - Volume 328, 15 January 2018, Pages 400-413
Authors
Zexian Liu, Hongwei Liu