Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
9868250 | Physics Letters A | 2005 | 8 Pages | 
Abstract
Tsallis and Rényi entropy measures are two possible different generalizations of the Boltzmann-Gibbs entropy (or Shannon's information), but neither is a generalization of the other. It is, however, the Sharma-Mittal measure, which was already defined in 1975 [J. Math. Sci. 10 (1975) 28] and which received attention only recently through applications in statistical mechanics [Physica A 285 (2000) 351; Eur. Phys. J. B 30 (2002) 543], that provides one possible unification. We show how this generalization, which unifies Rényi and Tsallis entropy in a coherent picture, arises naturally once the q-formalism of generalized logarithm and exponential functions is used; how, alongside Sharma-Mittal's measure, another possible extension emerges which, however, does not obey a pseudo-additive law and lacks other properties relevant for a generalized thermostatistics; and how the relation between all these information measures is best understood when described in terms of a particular logarithmic Kolmogorov-Nagumo average.
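For orientation, the following is a minimal sketch of the measures the abstract refers to, written in their commonly used parameterizations; the paper's own conventions, parameter names, and normalizations may differ.

```latex
% q-deformed logarithm underlying the q-formalism (reduces to \ln x as q -> 1)
\ln_q x = \frac{x^{1-q} - 1}{1-q}

% Sharma-Mittal entropy with two parameters q and r
S_{q,r} = \frac{1}{1-r}\left[\Bigl(\sum_i p_i^{\,q}\Bigr)^{\frac{1-r}{1-q}} - 1\right]

% r -> 1 recovers the Renyi entropy
S_q^{R} = \frac{1}{1-q}\,\ln\!\left(\sum_i p_i^{\,q}\right)

% r -> q recovers the Tsallis entropy
S_q^{T} = \frac{1}{1-q}\left(\sum_i p_i^{\,q} - 1\right)

% Both reduce to the Boltzmann-Gibbs/Shannon entropy as q -> 1
S = -\sum_i p_i \ln p_i
```

In this sense the two-parameter Sharma-Mittal family contains Rényi and Tsallis entropies as distinct one-parameter limits, which is the unification discussed in the abstract.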
Related Topics
Physical Sciences and Engineering
Physics and Astronomy
Physics and Astronomy (General)
Authors
Marco Masi