Article ID: 9657173
Journal: Journal of Computer and System Sciences
Published Year: 2005
Pages: 16
File Type: PDF
Abstract
The notions of predictive complexity and of the corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity that bounds the ability of any algorithm to predict the elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions that generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an "expanding property": with positive probability, it transforms sequences of a given predictive complexity into sequences of substantially larger predictive complexity. A concept of the amount of predictive information IG(y:x) is studied. We show that this information is noncommutative in a very strong sense and present asymptotic relations between the values IG(y:x), IG(x:y), KG(x), and KG(y).
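As a rough sketch of the quantities named in the abstract (the formulas below follow the standard predictive-complexity setting and are stated here as assumptions, not taken from the abstract itself): in the on-line prediction game the learner announces a prediction $\gamma_t$ before the outcome $\omega_t \in \{0,1\}$ is revealed and suffers a loss $\lambda(\omega_t, \gamma_t)$, for example the square loss
$$\lambda(\omega, \gamma) = (\omega - \gamma)^2 .$$
The predictive complexity $KG(x)$ is a minimal (up to an additive constant) upper bound on the cumulative loss that any prediction strategy can guarantee on the sequence $x$, and the conditional version $KG(x \mid y)$ allows the strategy to use $y$ as side information. By analogy with algorithmic information, the amount of predictive information is then presumably
$$IG(y:x) = KG(x) - KG(x \mid y),$$
and the paper's noncommutativity result concerns the asymptotic gap between $IG(y:x)$ and $IG(x:y)$.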
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors