Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
404905 | 677462 | 2006 | 10-page PDF | Free download |

Decorrelation and its higher-order generalization, independent component analysis (ICA), are fundamental tasks in unsupervised learning that have been studied mainly in the domain of Hebbian learning. In this paper we present a variation of natural gradient ICA, differential ICA, in which learning relies on the concurrent change of the output variables. We interpret this differential learning as maximum likelihood estimation of parameters, with the latent variables modeled by a random walk. Within this framework we derive the differential ICA algorithm and also present a differential decorrelation algorithm, treated as a special instance of differential ICA. The algorithm derivation and a local stability analysis are given, together with numerical experimental results.
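To illustrate the idea of learning from the concurrent change of variables, the sketch below applies the standard natural-gradient decorrelation rule to temporal differences of the outputs rather than to the outputs themselves. This is a minimal toy construction under assumed details (the step size, the random-walk mixture data, and the exact update form are choices made here for illustration), not the paper's own algorithm:

```python
import numpy as np

# Toy sketch of "differential" decorrelation: decorrelate the temporal
# differences dy(t) = y(t) - y(t-1) of the outputs, matching the random-walk
# latent-variable view (the differences of a random walk are i.i.d. steps).
# The update (I - dy dy^T) W is the usual natural-gradient decorrelation rule,
# here applied to the differenced signals; details are illustrative.

rng = np.random.default_rng(0)

n, T = 2, 5000
steps = rng.standard_normal((T, n))        # i.i.d. innovations
s = np.cumsum(steps, axis=0)               # latent random-walk sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])     # hypothetical mixing matrix
x = s @ A.T                                # observed mixtures

W = np.eye(n)                              # demixing matrix
eta = 0.01                                 # learning rate (chosen for the toy)
for t in range(1, T):
    dx = x[t] - x[t - 1]                   # concurrent change of the inputs
    dy = W @ dx                            # concurrent change of the outputs
    W += eta * (np.eye(n) - np.outer(dy, dy)) @ W

# After learning, the output differences should be roughly decorrelated.
dy_all = np.diff(x, axis=0) @ W.T
C = np.cov(dy_all[T // 2:].T)              # covariance over the second half
off_diag = abs(C[0, 1]) / np.sqrt(C[0, 0] * C[1, 1])
print(round(off_diag, 2))
```

Differencing removes the strong temporal correlation of the random-walk sources, so the rule operates on (approximately) i.i.d. step vectors, which is where plain decorrelation is well behaved.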
Journal: Neural Networks - Volume 19, Issue 10, December 2006, Pages 1558–1567