Article ID: 533990
Journal: Pattern Recognition Letters
Published Year: 2013
Pages: 11 pages
File Type: PDF
Abstract

• We introduce a new online algorithm for orthogonal regression (OR).
• The method is constructed via a stochastic gradient descent approach.
• An incremental strategy (ISA) is introduced, which is used to find sparse solutions.
• The ISA can also be used to find the “minimal tube” containing the data.
• As far as we are aware, this is the first method for OR that uses the “kernel trick.”

In this paper, we introduce a new online algorithm for orthogonal regression. The method is constructed via a stochastic gradient descent approach combined with the idea of a tube loss function, similar to the one used in support vector (SV) regression. The algorithm can be used in primal or in dual variables; the latter formulation allows the introduction of kernels and soft margins. In addition, an incremental strategy algorithm is introduced, which can be used to find sparse solutions as well as an approximation to the “minimal tube” containing the data. The algorithm is very simple to implement and avoids quadratic optimization.
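The abstract's combination of ideas, stochastic gradient descent on an epsilon-tube loss for a hyperplane fitted in the orthogonal (total least squares) sense, can be illustrated with a short sketch. This is not the authors' algorithm; it is a minimal primal-variable illustration under common assumptions: the hyperplane is w·x + b = 0 with ||w|| = 1, points farther than eps from it incur a hinge-like loss max(0, |w·x + b| − eps), and each SGD step is followed by projecting w back onto the unit sphere. The function name and all parameters are hypothetical.

```python
import numpy as np

def orthogonal_regression_sgd(X, eps=0.05, lr=0.1, epochs=200, seed=0):
    """Illustrative SGD for orthogonal regression with an epsilon-tube loss.

    Fits a hyperplane w.x + b = 0 (with ||w|| = 1) so that points lie
    within orthogonal distance `eps` of it where possible.  A point
    outside the tube contributes loss max(0, |w.x + b| - eps); points
    inside the tube trigger no update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(size=d)
    w /= np.linalg.norm(w)          # start on the unit sphere
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            step = lr / np.sqrt(t)  # decaying step size
            r = w @ X[i] + b        # signed orthogonal distance
            if abs(r) > eps:        # outside the tube: subgradient step
                g = np.sign(r)
                w -= step * g * X[i]
                b -= step * g
                w /= np.linalg.norm(w)  # project back to ||w|| = 1
    return w, b
```

Because updates occur only for points outside the tube, the iteration stops changing once a tube of half-width eps contains the data, which echoes the abstract's "minimal tube" idea (there, approximated by shrinking eps incrementally). The dual/kernel formulation described in the paper is not shown here.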

Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition