Article ID | Journal | Published Year | Pages
---|---|---|---
4944433 | Information Sciences | 2017 | 14
Abstract
Online pairwise learning algorithms with general convex loss functions without regularization in a Reproducing Kernel Hilbert Space (RKHS) are investigated. Under mild conditions on loss functions and the RKHS, upper bounds for the expected excess generalization error are derived in terms of the approximation error when the stepsize sequence decays polynomially. In particular, for Lipschitz loss functions such as the hinge loss, the logistic loss and the absolute-value loss, the bounds can be of order O(T^{-1/3} log T) after T iterations, while for the least squares loss, the bounds can be of order O(T^{-1/4} log T). In comparison with previous works for these algorithms, a broader family of convex loss functions is studied here, and refined upper bounds are obtained.
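To make the setting concrete, the following is a minimal sketch of an unregularized online pairwise learning algorithm in an RKHS with the hinge loss and a polynomially decaying stepsize eta_t = eta0 * t^(-theta). The Gaussian kernel, the stepsize constants, and the toy ranking task are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(x, u, sigma=1.0):
    # RBF kernel K(x, u); the kernel choice is an illustrative assumption
    return np.exp(-np.sum((x - u) ** 2) / (2.0 * sigma ** 2))

class OnlinePairwiseHinge:
    """Sketch of unregularized online pairwise learning in an RKHS with the
    pairwise hinge loss and stepsizes eta_t = eta0 * t^(-theta).
    The iterate f_t is kept in kernel expansion form f(u) = sum_i a_i K(x_i, u)."""

    def __init__(self, eta0=0.5, theta=0.75, sigma=1.0):
        self.eta0, self.theta, self.sigma = eta0, theta, sigma
        self.points = []   # support points x_i
        self.coefs = []    # coefficients a_i

    def predict(self, u):
        return sum(a * gaussian_kernel(x, u, self.sigma)
                   for a, x in zip(self.coefs, self.points))

    def step(self, t, x, y, x_prev, y_prev):
        # pairwise hinge loss: max(0, 1 - s * (f(x) - f(x'))), s = sign(y - y')
        s = np.sign(y - y_prev)
        if s == 0:
            return
        margin = s * (self.predict(x) - self.predict(x_prev))
        if margin < 1:  # subgradient is nonzero only when the hinge is active
            eta = self.eta0 * t ** (-self.theta)  # polynomially decaying stepsize
            # gradient step in the RKHS: f <- f + eta * s * (K(x,.) - K(x',.))
            self.points.extend([x, x_prev])
            self.coefs.extend([eta * s, -eta * s])

# toy run: learn to rank 2-D points by their first coordinate
rng = np.random.default_rng(0)
model = OnlinePairwiseHinge()
prev = None
for t in range(1, 201):
    x = rng.normal(size=2)
    y = x[0]
    if prev is not None:
        model.step(t, x, y, *prev)
    prev = (x, y)
```

After the pass over the stream, the learned function should tend to score points with a larger first coordinate higher, reflecting the pairwise ranking objective.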
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Junhong Lin, Yunwen Lei, Bo Zhang, Ding-Xuan Zhou