Article ID: 4605655
Journal: Applied and Computational Harmonic Analysis
Published Year: 2007
Pages: 17
File Type: PDF
Abstract

In this paper we consider fully online learning algorithms for classification generated from Tikhonov regularization schemes associated with general convex loss functions and reproducing kernel Hilbert spaces. In such a fully online algorithm, the regularization parameter changes at each learning step; this is the essential difference from a partially online algorithm, which uses a fixed regularization parameter. We first present a novel approach to bounding the drift error incurred by the change of the regularization parameter. Then we estimate the error of the learning process for strong approximation in the reproducing kernel Hilbert space norm. Finally, learning rates are derived from the decay of the regularization error. The convexity of the loss function plays an important role in our analysis. Concrete learning rates are given for the hinge loss and the support vector machine q-norm loss.
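To make the distinction concrete, the following is a minimal sketch of one fully online learning step of the kind the abstract describes: a kernel-expansion classifier updated by a stochastic subgradient of the Tikhonov-regularized hinge loss, with a regularization parameter lam_t that changes at every step. The Gaussian kernel, the specific schedules for lam_t and the step size eta_t, and all names are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel; an assumed choice of reproducing kernel."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class FullyOnlineHingeLearner:
    """Sketch of fully online Tikhonov-regularized learning in an RKHS.

    The hypothesis is kept as a kernel expansion f_t = sum_i c_i K(x_i, .).
    """

    def __init__(self, sigma=1.0):
        self.sigma = sigma
        self.points = []  # stored sample points x_i
        self.coefs = []   # expansion coefficients c_i

    def predict(self, x):
        # f_t(x) = sum_i c_i K(x_i, x)
        return sum(c * gaussian_kernel(xi, x, self.sigma)
                   for xi, c in zip(self.points, self.coefs))

    def step(self, x, y, t):
        # Fully online: the regularization parameter lam_t depends on t.
        # (A partially online scheme would keep lam fixed across steps.)
        # These polynomial decay schedules are illustrative assumptions.
        lam_t = 1.0 / (t + 1) ** 0.5
        eta_t = 1.0 / (t + 1) ** 0.75
        margin = y * self.predict(x)
        # Gradient of the penalty (lam_t / 2) * ||f||_K^2 shrinks all coefficients.
        self.coefs = [(1 - eta_t * lam_t) * c for c in self.coefs]
        # Subgradient of the hinge loss max(0, 1 - y f(x)):
        # add a new kernel term only when the margin constraint is violated.
        if margin < 1:
            self.points.append(np.asarray(x, dtype=float))
            self.coefs.append(eta_t * y)
```

Feeding the learner a stream of labeled samples and calling `step` once per sample, with `t` incremented each time, mimics the one-pass online setting; the changing `lam_t` is exactly the source of the drift error analyzed in the paper.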
