Article ID Journal Published Year Pages File Type
4944433 Information Sciences 2017 14 Pages PDF
Abstract
Online pairwise learning algorithms with general convex loss functions, without regularization, in a Reproducing Kernel Hilbert Space (RKHS) are investigated. Under mild conditions on the loss functions and the RKHS, upper bounds for the expected excess generalization error are derived in terms of the approximation error when the stepsize sequence decays polynomially. In particular, for Lipschitz loss functions such as the hinge loss, the logistic loss, and the absolute-value loss, the bounds can be of order O(T^{-1/3} log T) after T iterations, while for the least squares loss the bounds can be of order O(T^{-1/4} log T). In comparison with previous work on these algorithms, a broader family of convex loss functions is studied here, and refined upper bounds are obtained.
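The unregularized online pairwise scheme described above can be sketched as follows. This is a minimal illustration, not the paper's exact protocol: the Gaussian kernel, the pairing rule (one uniformly random past example per step), the hinge loss, and the stepsize exponent theta = 1/3 are all assumptions chosen for concreteness; the hypothesis is kept as a kernel expansion f(x) = sum_i alpha_i K(x_i, x) and updated by a plain subgradient step with eta_t = eta0 * t^(-theta) and no regularization term.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel inducing the RKHS (an illustrative choice)
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

class OnlinePairwiseHinge:
    """Sketch of unregularized online pairwise learning in an RKHS with
    the hinge loss. The iterate is stored as a kernel expansion
    f(x) = sum_i alpha_i K(x_i, x)."""

    def __init__(self, theta=1.0 / 3.0, eta0=1.0, sigma=1.0, seed=0):
        self.theta = theta      # polynomial stepsize decay exponent (assumed 1/3)
        self.eta0 = eta0
        self.sigma = sigma
        self.rng = np.random.default_rng(seed)
        self.xs, self.ys, self.alphas = [], [], []
        self.t = 0

    def predict(self, x):
        return sum(a * gaussian_kernel(xi, x, self.sigma)
                   for xi, a in zip(self.xs, self.alphas))

    def step(self, x_new, y_new):
        """One gradient step on the pairwise hinge loss
        max(0, 1 - r * (f(x_new) - f(x_old))), where r encodes which of
        the two examples should rank higher; no regularization is applied."""
        self.t += 1
        alpha_new = 0.0
        if self.xs:
            eta = self.eta0 * self.t ** (-self.theta)  # eta_t = eta0 * t^{-theta}
            j = int(self.rng.integers(len(self.xs)))   # pair with a random past example
            r = np.sign(y_new - self.ys[j])
            if r != 0:
                margin = r * (self.predict(x_new) - self.predict(self.xs[j]))
                if margin < 1.0:  # hinge subgradient is nonzero
                    alpha_new = eta * r
                    self.alphas[j] -= eta * r
        self.xs.append(x_new)
        self.ys.append(y_new)
        self.alphas.append(alpha_new)

# Toy usage: learn to rank 1-D points by their value.
learner = OnlinePairwiseHinge()
stream = np.random.default_rng(1)
for _ in range(200):
    x = np.array([stream.uniform(0.0, 3.0)])
    learner.step(x, float(x[0]))   # larger x should rank higher
# After training, learner.predict(np.array([3.0])) should exceed
# learner.predict(np.array([0.0])).
```

The coefficient-expansion representation is the standard way to run kernel-space gradient descent without forming f explicitly; each step appends at most one new expansion point, so after T iterations the hypothesis has at most T terms.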
Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence