Article ID | Journal | Published Year | Pages
---|---|---|---
6874725 | Journal of Computer and System Sciences | 2018 | 8
Abstract
It is a classic result in algorithmic information theory that every infinite binary sequence is computable from an infinite binary sequence which is random in the sense of Martin-Löf. If the computation of the first n bits of a sequence requires n+g(n) bits of the random oracle, then g is the redundancy of the computation. We devise a new coding method that achieves optimal logarithmic redundancy. For any computable non-decreasing function g such that ∑_i 2^{-g(i)} is bounded we show that there is a coding process that codes any given infinite binary sequence into a Martin-Löf random infinite binary sequence with redundancy g. This redundancy bound is known to be the best possible.
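As an illustrative aside (not taken from the paper itself): a standard example of a redundancy function satisfying the abstract's condition is g(i) = 2·⌈log₂ i⌉, since then 2^{-g(i)} ≤ 1/i², so the partial sums of ∑_i 2^{-g(i)} stay bounded. The function name `g` and the specific choice below are assumptions made for the sketch, which just verifies the boundedness numerically.

```python
import math

def g(i):
    # Hypothetical example of a computable, non-decreasing redundancy
    # function: g(i) = 2 * ceil(log2(i)), so 2^{-g(i)} <= 1/i^2.
    return 2 * math.ceil(math.log2(i))

# Partial sums of sum_i 2^{-g(i)}: bounded (the limit here is 1.5,
# comfortably below the 1/i^2 bound pi^2/6 ~ 1.645).
partial = sum(2.0 ** -g(i) for i in range(1, 100_001))
print(partial)
```

A logarithmic g like this one is exactly the regime the abstract calls "optimal logarithmic redundancy": slower-growing g (e.g. g(i) = log₂ i exactly) would make the sum diverge, and the paper states that this summability bound cannot be improved.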
Related Topics
Physical Sciences and Engineering
Computer Science
Computational Theory and Mathematics
Authors
George Barmpalias, Andrew Lewis-Pye