Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
8205146 | Physics Letters A | 2014 | 4 Pages |
Abstract
The hallmark of deterministic chaos is that it creates information, the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system's intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information (the ephemeral information) is forgotten, and a portion (the bound information) is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
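The abstract's entropy-rate estimate from symbolic dynamics can be illustrated with a minimal sketch (not the authors' method): symbolize a chaotic map's trajectory, count length-L blocks, and take the block-entropy difference H(L) - H(L-1), which converges to the Kolmogorov-Sinai metric entropy. The logistic map at r = 4 with the binary partition at x = 1/2 is a standard example whose entropy rate is ln 2 ≈ 1 bit per symbol; the function names here are illustrative.

```python
from collections import Counter
from math import log2

def logistic_symbols(n, r=4.0, x0=0.3, burn=1000):
    """Binary symbolic dynamics of the logistic map x -> r*x*(1-x),
    using the partition at x = 1/2 (a generating partition for r = 4)."""
    x = x0
    for _ in range(burn):          # discard transient
        x = r * x * (1.0 - x)
    syms = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        syms.append('1' if x >= 0.5 else '0')
    return ''.join(syms)

def block_entropy(s, L):
    """Shannon entropy (in bits) of the length-L block distribution of s."""
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

if __name__ == "__main__":
    s = logistic_symbols(200_000)
    # h(L) = H(L) - H(L-1) estimates the metric entropy as L grows;
    # for the r = 4 logistic map it should approach 1 bit per symbol.
    for L in range(2, 7):
        h = block_entropy(s, L) - block_entropy(s, L - 1)
        print(L, round(h, 3))
```

This recovers only the total information-creation rate; separating it into the ephemeral and bound components requires the conditional-information decomposition developed in the paper itself.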
Related Topics
Physical Sciences and Engineering › Physics and Astronomy › Physics and Astronomy (General)
Authors
Ryan G. James, Korana Burke, James P. Crutchfield