| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 1896917 | Chaos, Solitons & Fractals | 2006 | 6 | |
Abstract
Chaos is often explained in terms of random behaviour, and having positive Kolmogorov-Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE makes no reference to any notion connected with randomness. A common way of justifying this use of the KSE is to draw parallels between the KSE and Shannon's information-theoretic entropy. However, as it stands, this is no more than a heuristic point, because no rigorous connection between the KSE and Shannon's entropy has yet been established. This paper fills this gap by proving that, under certain plausible assumptions, the KSE of a Hamiltonian dynamical system is equivalent to a generalized version of Shannon's information-theoretic entropy.
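For orientation, the two quantities the abstract connects can be stated in their standard textbook forms; the paper's generalized version of Shannon's entropy is not given in the abstract, so the following is only a sketch of the usual definitions. Here $p = (p_1, \dots, p_n)$ is a discrete probability distribution, $T$ is a measure-preserving map with invariant measure $\mu$, the supremum runs over finite measurable partitions $P$, and $H_\mu(Q) = -\sum_{A \in Q} \mu(A)\log\mu(A)$.

```latex
% Shannon entropy of a discrete distribution p:
\[
  H(p) \;=\; -\sum_{i=1}^{n} p_i \log p_i
\]

% Kolmogorov-Sinai entropy of (T, \mu), via refinements of partitions:
\[
  h_{\mu}(T) \;=\; \sup_{P}\;\lim_{n\to\infty}\frac{1}{n}\,
  H_{\mu}\!\Bigl(\,\bigvee_{k=0}^{n-1} T^{-k}P\Bigr)
\]
```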
Related Topics
Physical Sciences and Engineering
Physics and Astronomy
Statistical and Nonlinear Physics
Authors
Roman Frigg