Article ID: 1153734
Journal: Statistical Methodology
Published Year: 2010
Pages: 21
File Type: PDF
Abstract

Given a vector-valued stationary time series, we define information quantities such as entropy, divergence, and mutual information in terms of its second- and higher-order cumulant spectra. These quantities arise naturally from the information-geometric viewpoint. We present their expressions for linear processes and for random vectors. For linear processes, we clarify their relation to the identification of transfer function matrices; for random vectors, we relate them to the corresponding quantities defined via probability density functions. As an application, we treat the identification of nonlinear systems within this framework. We also present the differential-geometric background, based on invariance, for our definitions.
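For Gaussian linear processes, spectral definitions of information quantities like those in the abstract specialize to classical formulas: the entropy rate of a stationary Gaussian process is given by an integral of the log spectral density (the Kolmogorov-Szegő formula). The sketch below is an illustration of that general idea, not code from the paper; the AR(1) model, parameter names, and the specific spectral-density convention are assumptions for the example. For AR(1), the numerically integrated entropy rate should match the closed form 1/2·log(2πe·σ²).

```python
import numpy as np

# Hedged illustration (not from the paper): entropy rate of a stationary
# Gaussian process from its spectral density, via the Kolmogorov-Szego formula
#   h = 1/2*log(2*pi*e) + 1/(4*pi) * \int_{-pi}^{pi} log(2*pi*f(lam)) dlam
# with the convention f(lam) = (1/2pi) * sum_k gamma(k) e^{-i k lam}.
# For AR(1): x_t = a*x_{t-1} + eps_t, eps_t ~ N(0, sigma2),
#   f(lam) = sigma2 / (2*pi * |1 - a*exp(-i*lam)|^2),
# and the entropy rate reduces to the innovation entropy 1/2*log(2*pi*e*sigma2).

def ar1_spectral_density(lam, a, sigma2):
    # spectral density of a stationary AR(1) process, |a| < 1
    return sigma2 / (2 * np.pi * np.abs(1 - a * np.exp(-1j * lam)) ** 2)

def gaussian_entropy_rate(f_vals, lam):
    # trapezoidal approximation of the Kolmogorov-Szego integral
    y = np.log(2 * np.pi * f_vals)
    integral = np.sum(0.5 * (y[:-1] + y[1:]) * np.diff(lam))
    return 0.5 * np.log(2 * np.pi * np.e) + integral / (4 * np.pi)

a, sigma2 = 0.5, 1.0
lam = np.linspace(-np.pi, np.pi, 20001)
h_numeric = gaussian_entropy_rate(ar1_spectral_density(lam, a, sigma2), lam)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma2)  # closed form for AR(1)
print(h_numeric, h_closed)  # the two values should agree closely
```

The closed form follows because the integral of log|1 - a·e^{-iλ}|² over [-π, π] vanishes for |a| < 1, so only the innovation variance contributes.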

Related Topics
Physical Sciences and Engineering › Mathematics › Statistics and Probability