Article ID Journal Published Year Pages File Type
536039 Pattern Recognition Letters 2011 8 Pages PDF
Abstract

Machine Learning based on the Regularized Least Squares (RLS) model requires one to solve a system of linear equations. Direct-solution methods exhibit predictable complexity and storage, but often prove impractical for large-scale problems; iterative methods attain approximate solutions at lower complexity, but depend heavily on learning parameters. The paper shows that applying the properties of Toeplitz matrices to RLS yields two benefits: first, both the computational cost and the memory space required to train an RLS-based machine are reduced dramatically; second, timing and storage requirements can be characterized analytically. The paper proves this result formally for the one-dimensional case, and gives an analytical criterion for an effective approximation in multidimensional domains. The validity of the approach is demonstrated on several real-world problems involving huge data sets of high-dimensional data.
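The core idea can be sketched in code. On a uniform one-dimensional grid, a stationary kernel yields a symmetric Toeplitz Gram matrix, so the RLS system (K + λI)c = y can be solved by Levinson-style recursion in O(n²) time and O(n) extra memory, instead of the O(n³) time and O(n²) storage of a dense direct solver. The sketch below is a generic Levinson recursion for a symmetric positive-definite Toeplitz system, not the paper's exact algorithm; all numeric values are illustrative.

```python
def levinson_solve(r, y):
    """Solve T x = y where T is the symmetric positive-definite Toeplitz
    matrix whose first column is r (so T[i][j] = r[|i - j|]).
    Runs in O(n^2) time with O(n) extra memory."""
    f = [1.0 / r[0]]            # forward vector: T_1 f = e_1
    x = [y[0] / r[0]]           # partial solution: T_1 x = y[:1]
    for k in range(1, len(y)):
        # forward prediction error of f against row k+1 of T
        eps = sum(r[k - i] * f[i] for i in range(k))
        denom = 1.0 - eps * eps
        b = list(reversed(f))   # backward vector = reversed forward (symmetric T)
        # f_{k+1} = ((f, 0) - eps * (0, b)) / (1 - eps^2)
        f = [(fv - eps * bv) / denom
             for fv, bv in zip(f + [0.0], [0.0] + b)]
        # extend the solution using the new backward vector
        eps_x = sum(r[k - i] * x[i] for i in range(k))
        b_new = list(reversed(f))
        x = [xv + (y[k] - eps_x) * bv
             for xv, bv in zip(x + [0.0], b_new)]
    return x


# Toy RLS-style system: r is the first column of K + lam*I for a
# stationary kernel sampled on a uniform grid (values are illustrative).
lam = 0.1
r = [1.0 + lam, 0.5, 0.25, 0.125]
y = [1.0, 2.0, 3.0, 4.0]
c = levinson_solve(r, y)
# Verify T c == y via an explicit Toeplitz matrix-vector product
n = len(r)
residual = max(abs(sum(r[abs(i - j)] * c[j] for j in range(n)) - y[i])
               for i in range(n))
print(residual < 1e-9)
```

The recursion never materializes the full n-by-n matrix: only the first column and three length-n vectors are kept, which is what makes the approach attractive for memory-constrained (e.g. embedded) training.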

Research highlights
► Fast approximated Toeplitz-matrix Regularized Least Squares learning.
► Embedded-device-friendly learning algorithm for Regularized Least Squares.
► Levinson-Trench-Zohar algorithm for approximated regularized least squares learning.

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition