Article ID: 975842
Journal: Physica A: Statistical Mechanics and its Applications
Published Year: 2006
Pages: 6
File Type: PDF
Abstract
It is proved that the only additive and isotropic information measure that can depend on the probability distribution and also on its first derivative is a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information measures. Power-law equilibrium distributions are found as a result of the interaction of the two terms. The case of second-order derivative dependence is also investigated, and a corresponding additive information measure is given.
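To make the central claim concrete, the two functionals named in the abstract have the standard forms below for a one-dimensional distribution p(x); the overall sign convention, the constant k, and the coupling lambda in the combined measure are assumptions for illustration, not taken from the paper.

\[
S_{\mathrm{BGS}}[p] = -k \int p(x)\,\ln p(x)\,\mathrm{d}x,
\qquad
I_{\mathrm{F}}[p] = \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,\mathrm{d}x,
\]

so a linear combination of the kind described in the abstract would read

\[
I_{\lambda}[p] = S_{\mathrm{BGS}}[p] - \lambda\, I_{\mathrm{F}}[p],
\]

where the relative weight lambda of the Fisher term is a free parameter in this sketch.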