| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1708387 | Applied Mathematics Letters | 2012 | 4 Pages | |
Abstract
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory (see Ash (1965) [8] and Cover and Thomas (2006) [9]). Our purpose in this work is to present a sharper upper bound for the classical Shannon entropy, refining recent results from the literature. To this end we consider the work of Simic (2009) [4], where new entropy bounds based on a new refinement of Jensen's inequality are presented. Our work improves Simic's basic result through a stronger refinement of Jensen's inequality, which is then applied to information theory.
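The paper's refined bound itself is not reproduced in this abstract. As an illustration only, the following minimal Python sketch computes the Shannon entropy of a discrete distribution and checks it against the classical Jensen-derived bound H(p) ≤ log n; the function name and example distribution are ours, not from the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), natural log,
    for a discrete distribution p whose entries sum to 1."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Example distribution (illustrative, not from the paper).
p = [0.5, 0.25, 0.125, 0.125]
h = shannon_entropy(p)

# Jensen's inequality applied to the concave logarithm yields the
# classical upper bound H(p) <= log n, with equality only for the
# uniform distribution; refinements of Jensen's inequality tighten it.
bound = math.log(len(p))
print(f"H(p) = {h:.4f} <= log n = {bound:.4f}")
assert h <= bound
```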
Related Topics
Physical Sciences and Engineering
Engineering
Computational Mechanics
Authors
N. Ţăpuş, P.G. Popescu