Article code: 405959
Journal code: 678051
Publication year: 2016
English article: 11-page PDF, full text, free download
English title of the ISI article
Online kernel learning with nearly constant support vectors
Persian translation of the title
یادگیری هسته ای آنلاین با بردارهای پشتیبانی تقریبا ثابت
Related topics
Engineering and Basic Sciences, Computer Engineering, Artificial Intelligence
English abstract

The Nyström method has been widely used to improve the computational efficiency of batch kernel learning. Its key idea is to randomly sample M support vectors from the collection of T training instances and to learn a kernel classifier in the space spanned by the randomly sampled support vectors. In this work, we study online regularized kernel learning using the Nyström method, with a focus on the sample complexity, i.e. the number of randomly sampled support vectors needed to yield the optimal convergence rate O(1/T), where T is the number of training instances received in online learning. We show that, when the loss function is smooth and strongly convex, only O(log^2 T) randomly sampled support vectors are needed to guarantee an O(log T / T) convergence rate, which is nearly optimal up to the log T factor. We further validate our theory by an extensive empirical study.
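The abstract describes the Nyström idea of sampling M support vectors and learning a kernel classifier restricted to their span. Below is a minimal, hypothetical Python sketch of that idea, not the authors' exact algorithm: it assumes an RBF kernel, a regularized logistic loss as the smooth and strongly convex objective, and an online gradient step of size roughly 1/(lam * t); all function names and parameter choices are illustrative.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B.
    d2 = (np.sum(A * A, axis=1)[:, None]
          + np.sum(B * B, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom_online_kernel_learning(X, y, M=50, lam=0.1, gamma=1.0, seed=0):
    # Sketch only: randomly sample M support vectors from the T instances and
    # run online gradient descent on a regularized logistic loss (smooth and
    # strongly convex), with the classifier kept in the span of the samples.
    rng = np.random.default_rng(seed)
    T = X.shape[0]
    idx = rng.choice(T, size=min(M, T), replace=False)
    S = X[idx]                              # sampled support vectors
    K_SS = rbf_kernel(S, S, gamma)          # Gram matrix of the support vectors
    alpha = np.zeros(len(idx))              # f(x) = sum_i alpha_i k(s_i, x)
    for t in range(T):
        k_t = rbf_kernel(X[t:t + 1], S, gamma).ravel()
        margin = y[t] * (alpha @ k_t)
        # Gradient of log(1 + exp(-y f(x))) + (lam/2) ||f||^2 w.r.t. alpha.
        grad = -y[t] * k_t / (1.0 + np.exp(margin)) + lam * (K_SS @ alpha)
        alpha -= grad / (lam * (t + 1))     # step size ~ 1/(lam * t)
    return S, alpha

def predict(S, alpha, X_new, gamma=1.0):
    # Sign of the learned decision function on new points.
    return np.sign(rbf_kernel(X_new, S, gamma) @ alpha)

Usage would look like S, alpha = nystrom_online_kernel_learning(X, y, M=50) for labels y in {-1, +1}, followed by predict(S, alpha, X_test).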

Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 179, 29 February 2016, Pages 26–36