Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
536039 | 870439 | 2011 | 8-page PDF | Free download |
Machine Learning based on the Regularized Least Squares (RLS) model requires solving a system of linear equations. Direct-solution methods exhibit predictable complexity and storage requirements, but often prove impractical for large-scale problems; iterative methods attain approximate solutions at lower complexity, but depend heavily on learning parameters. The paper shows that applying the properties of Toeplitz matrices to RLS yields two benefits: first, both the computational cost and the memory space required to train an RLS-based machine are reduced dramatically; second, timing and storage requirements can be characterized analytically. The paper proves this result formally for the one-dimensional case, and gives an analytical criterion for an effective approximation in multidimensional domains. The validity of the approach is demonstrated on several real-world problems involving huge data sets with high-dimensional data.
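The key observation can be sketched in code. The snippet below is a minimal illustration, not the paper's algorithm: it assumes one-dimensional inputs on a regular grid with a translation-invariant (here, Gaussian) kernel, so that the kernel matrix K is symmetric Toeplitz and the RLS system (K + λI)c = y can be solved by a Levinson-type routine in O(n²) time and O(n) memory, versus O(n³) time and O(n²) memory for a dense solve. The function name `rls_train_toeplitz` and the toy data are hypothetical.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def rls_train_toeplitz(first_col, y, lam):
    """Solve (K + lam*I) c = y for a symmetric Toeplitz kernel matrix K,
    represented only by its first column (O(n) storage).
    solve_toeplitz uses a Levinson-type recursion: O(n^2) time."""
    col = first_col.copy()
    col[0] += lam                  # regularization shifts the diagonal
    return solve_toeplitz(col, y)  # Levinson-type solve

# Toy data: evenly spaced 1-D inputs make a Gaussian kernel matrix Toeplitz.
n = 200
x = np.linspace(0.0, 1.0, n)
first_col = np.exp(-((x - x[0]) ** 2) / 0.02)  # k(x_i, x_0), i = 0..n-1
y = np.sin(2.0 * np.pi * x)

c = rls_train_toeplitz(first_col, y, lam=1e-2)

# Reference: dense O(n^3) solve of the same regularized system.
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.02)
c_dense = np.linalg.solve(K + 1e-2 * np.eye(n), y)
```

The two solution vectors agree to numerical precision; the Toeplitz path never materializes the full n×n kernel matrix, which is what makes the approach attractive for large-scale and embedded settings.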
Research highlights
► Fast approximated Toeplitz matrix Regularized Least Squares learning.
► Embedded device friendly learning algorithm for Regularized Least Squares.
► Levinson-Trench-Zohar algorithm for approximated Regularized Least Squares learning.
Journal: Pattern Recognition Letters - Volume 32, Issue 3, 1 February 2011, Pages 468–475