Article code: 408600
Journal code: 679036
Publication year: 2007
English article: 10-page PDF
Full-text version: Free download
English title of the ISI article
Learning principal directions: Integrated-squared-error minimization
Related topics
Engineering and Basic Sciences; Computer Engineering; Artificial Intelligence
English abstract

A common derivation of principal component analysis (PCA) is based on minimization of the squared error between the centered data and a linear model, corresponding to the reconstruction error. In fact, minimizing the squared error leads to principal subspace analysis, where scaled and rotated principal axes of a set of observed data are estimated. In this paper, we introduce and investigate an alternative error measure, the integrated squared error (ISE), whose minimization determines the exact principal axes (without rotational ambiguity) of a set of observed data. We show that the exact principal directions emerge from the minimization of ISE. We present a simple EM algorithm, ‘EM-ePCA’, which is similar to EM-PCA [S.T. Roweis, EM algorithms for PCA and SPCA, in: Advances in Neural Information Processing Systems, vol. 10, MIT Press, Cambridge, 1998, pp. 626–632], but finds exact principal directions without rotational ambiguity. In addition, we revisit the generalized Hebbian algorithm (GHA) and show that it emerges from ISE minimization in a single-layer linear feedforward neural network.
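The abstract does not reproduce the EM-ePCA update rules, so no attempt is made to sketch them here. For context, the baseline it modifies, Roweis' EM-PCA, alternates least-squares updates on the reconstruction error ||Y − WX||², and recovers only the principal subspace (loadings defined up to an arbitrary rotation), which is exactly the ambiguity the ISE criterion is said to remove. A minimal NumPy sketch of that baseline iteration, with illustrative variable names, is:

```python
import numpy as np

def em_pca(Y, k, n_iter=200, seed=0):
    """Baseline EM-PCA (Roweis, 1998): alternating least squares on the
    reconstruction error ||Y - W X||^2. Converges to the principal
    subspace of the centered data Y (d x n), but W is determined only
    up to rotation/scaling -- the ambiguity EM-ePCA is designed to remove."""
    d, n = Y.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, k))  # arbitrary initial loading matrix
    for _ in range(n_iter):
        # E-step: latent coordinates given the current loadings
        X = np.linalg.solve(W.T @ W, W.T @ Y)
        # M-step: loadings given the current latent coordinates
        W = Y @ X.T @ np.linalg.inv(X @ X.T)
    return W

# Usage sketch: centered synthetic data, leading 2-dimensional subspace
Y = np.random.default_rng(1).standard_normal((5, 1000))
Y -= Y.mean(axis=1, keepdims=True)
W = em_pca(Y, k=2)
```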

Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 70, Issues 7–9, March 2007, Pages 1372–1381
Authors
, , ,