
In this paper we consider fully online learning algorithms for classification generated from Tikhonov regularization schemes associated with general convex loss functions and reproducing kernel Hilbert spaces. For such a fully online algorithm, the regularization parameter changes at each learning step. This is the essential difference from partially online algorithms, which use a fixed regularization parameter. We first present a novel approach to the drift error incurred by the change of the regularization parameter. We then estimate the error of the learning process for strong approximation in the reproducing kernel Hilbert space. Finally, learning rates are derived from decays of the regularization error. The convexity of the loss function plays an important role in our analysis. Concrete learning rates are given for the hinge loss and the support vector machine q-norm loss.
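To make the setting concrete, the following is a minimal Python sketch of one plausible form of a fully online, Tikhonov-regularized kernel update with the hinge loss: at step t the current hypothesis is shrunk by a step-dependent regularization parameter lam_t (rather than a fixed one) and corrected along a loss subgradient. The Gaussian kernel, the schedules eta_t and lam_t, and the function names (gaussian_kernel, fully_online_hinge) are illustrative assumptions, not the paper's exact algorithm or parameter choices.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel K(x, z); an assumed choice of reproducing kernel.
    return np.exp(-np.sum((np.asarray(x) - np.asarray(z)) ** 2) / (2 * sigma ** 2))

def fully_online_hinge(stream,
                       eta=lambda t: 1.0 / np.sqrt(t),     # assumed step-size schedule
                       lam=lambda t: 1.0 / t ** 0.25,       # regularization changes each step
                       sigma=1.0):
    """Illustrative fully online update with the hinge loss:
        f_{t+1} = (1 - eta_t * lam_t) * f_t - eta_t * g_t * K_{x_t},
    where g_t is a subgradient of the hinge loss at (x_t, y_t) and lam_t
    varies with t (the "fully online" feature).  f_t is stored through its
    kernel expansion: a list of points and coefficients."""
    points, coeffs = [], []

    def f(x):
        # Evaluate the current hypothesis f_t(x) = sum_i c_i K(x_i, x).
        return sum(c * gaussian_kernel(xi, x, sigma) for xi, c in zip(points, coeffs))

    for t, (x_t, y_t) in enumerate(stream, start=1):
        eta_t, lam_t = eta(t), lam(t)
        margin = y_t * f(x_t)
        # Hinge-loss subgradient with respect to f(x_t): -y_t if the margin
        # is violated, 0 otherwise.
        g_t = -y_t if margin < 1 else 0.0
        # Tikhonov shrinkage of the existing expansion, then append the new term.
        coeffs[:] = [(1 - eta_t * lam_t) * c for c in coeffs]
        points.append(x_t)
        coeffs.append(-eta_t * g_t)

    return points, coeffs, f

# Example usage on a synthetic linearly separable stream (hypothetical data).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    _, _, f = fully_online_hinge(zip(X, y))
    accuracy = np.mean([np.sign(f(x)) == yi for x, yi in zip(X, y)])
    print(f"training accuracy of the online hypothesis: {accuracy:.2f}")
```

The per-step change of lam_t is what produces the drift error analyzed in the paper; with a fixed lam the same sketch would correspond to the partially online case.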
Journal: Applied and Computational Harmonic Analysis - Volume 23, Issue 2, September 2007, Pages 198-214