Article ID: 395254
Journal: Information Sciences
Published Year: 2007
Pages: 8 Pages
File Type: PDF
Abstract

The measure-theoretic definition of Kullback–Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy, and hence the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand–Yaglom–Perez (GYP) theorem [M.S. Pinsker, Information and Information Stability of Random Variables and Processes, Holden-Day, San Francisco, CA, 1960 (English ed., 1964, translated and edited by Amiel Feinstein), Theorem 2.4.2], which states that measure-theoretic relative-entropy equals the supremum of relative-entropies over all measurable partitions. This paper states and proves the GYP-theorem for Rényi relative-entropy of order greater than one. As a consequence, the result extends readily to Tsallis relative-entropy.
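For orientation, here is a minimal sketch of the quantities involved, using standard measure-theoretic notation; the symbols (X, P, Q, E_i) are illustrative and not drawn from the paper itself. For probability measures P ≪ Q on a measurable space X, the KL-entropy and its GYP partition characterization read

\[
D(P\|Q) = \int_X \log\frac{dP}{dQ}\,dP
= \sup_{\{E_1,\dots,E_n\}} \sum_{i=1}^{n} P(E_i)\,\log\frac{P(E_i)}{Q(E_i)},
\]

the supremum running over all finite measurable partitions of X. The Rényi relative-entropy of order \alpha > 1 treated in the paper admits the analogous characterization, this equality being the content of the theorem proved here:

\[
D_\alpha(P\|Q) = \frac{1}{\alpha-1}\,\log \int_X \Bigl(\frac{dP}{dQ}\Bigr)^{\alpha} dQ
= \sup_{\{E_1,\dots,E_n\}} \frac{1}{\alpha-1}\,\log \sum_{i=1}^{n} P(E_i)^{\alpha}\,Q(E_i)^{1-\alpha},
\]

and the extension to Tsallis relative-entropy follows because, in the standard parametrization, the two are related by the monotone map

\[
D_q^{T}(P\|Q) = \frac{e^{(q-1)\,D_q(P\|Q)} - 1}{q-1}.
\]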

Related Topics: Physical Sciences and Engineering › Computer Science › Artificial Intelligence