Article code: 4608592
Journal code: 1338365
Publication year: 2015
English article: 14-page PDF
Full-text version: free download
English title of the ISI article
On lower complexity bounds for large-scale smooth convex optimization
Related topics
Engineering and Basic Sciences, Mathematics, Mathematical Analysis
English abstract

We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing smooth convex functions (those with Hölder continuous gradient, with a given exponent and constant) over high-dimensional ‖⋅‖_p-balls, 1 ≤ p ≤ ∞. Our bounds turn out to be tight (up to factors logarithmic in the design dimension) and can be viewed as a substantial extension of the existing lower complexity bounds for large-scale convex minimization, which cover the nonsmooth case and the “Euclidean” smooth case (minimization of convex functions with Lipschitz continuous gradients over Euclidean balls). As a byproduct of our results, we demonstrate that the classical Conditional Gradient algorithm is near-optimal, in the sense of Information-Based Complexity Theory, when minimizing smooth convex functions over high-dimensional ‖⋅‖_∞-balls and their matrix analogues, the spectral norm balls in the spaces of square matrices.
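To make the setting of the last sentence concrete, below is a minimal sketch of the Conditional Gradient (Frank-Wolfe) method over an ‖⋅‖_∞-ball; "smooth" here refers to a gradient satisfying a Hölder condition ‖∇f(x) − ∇f(y)‖ ≤ L‖x − y‖^ν, ν ∈ (0, 1], and the sketch uses the Lipschitz case ν = 1. The quadratic test objective, the radius R, and the standard step size 2/(t+2) are illustrative assumptions, not taken from the paper; the point of the sketch is that the linear minimization oracle over the box [-R, R]^n reduces to a sign computation, which is what makes the method attractive in high dimension.

```python
# Minimal sketch of the Conditional Gradient (Frank-Wolfe) method over an
# ||.||_inf-ball. The objective and parameters below are illustrative only.
import numpy as np

def conditional_gradient_linf(grad_f, x0, R, n_iters=200):
    """Minimize a smooth convex f over {x : ||x||_inf <= R}.

    grad_f  -- first-order oracle returning the gradient of f at x
    x0      -- feasible starting point (||x0||_inf <= R)
    R       -- radius of the ||.||_inf-ball
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad_f(x)
        # Linear minimization oracle over the box [-R, R]^n:
        # argmin_{||s||_inf <= R} <g, s> is just a sign pattern.
        s = -R * np.sign(g)
        gamma = 2.0 / (t + 2.0)            # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Illustrative usage: f(x) = 0.5 * ||A x - b||_2^2, a convex function with
# Lipschitz continuous gradient, in a high-dimensional design (n = 1000).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 1000))
b = rng.standard_normal(50)
grad_f = lambda x: A.T @ (A @ x - b)
x_approx = conditional_gradient_linf(grad_f, x0=np.zeros(1000), R=1.0)
```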

Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Complexity - Volume 31, Issue 1, February 2015, Pages 1–14
Authors