Article ID Journal Published Year Pages File Type
416129 Computational Statistics & Data Analysis 2009 15 Pages PDF
Abstract

The importance of predictors is characterized by the extent to which their use reduces uncertainty about predicting the response variable, termed their information importance. The uncertainty associated with a probability distribution is a concave function of the density that attains its global maximum at the uniform distribution, which reflects the most difficult prediction situation. Shannon entropy is used to operationalize the concept. For nonstochastic predictors, maximum entropy characterization of probability distributions provides measures of information importance. For stochastic predictors, the expected entropy difference gives measures of information importance, which are invariant under one-to-one transformations of the variables. Applications to various data types lead to familiar statistical quantities for various models, yet with the unified interpretation of uncertainty reduction. Bayesian inference procedures for the importance and relative importance of predictors are developed. Three examples show applications to normal regression, contingency table, and logit analyses.
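For a discrete stochastic predictor, the expected entropy difference described in the abstract, H(Y) − E_X[H(Y | X)], coincides with the mutual information between predictor and response. A minimal sketch of that computation for a contingency table of counts (the function names and the example counts here are illustrative, not taken from the paper):

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability vector, with 0*log(0) taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def information_importance(joint_counts):
    """Expected entropy reduction H(Y) - E_X[H(Y | X)] for a joint
    distribution over (X, Y); equals the mutual information I(X; Y).
    Rows index the predictor X, columns the response Y."""
    joint = np.asarray(joint_counts, dtype=float)
    joint = joint / joint.sum()          # normalize counts to probabilities
    px = joint.sum(axis=1)               # marginal of X
    py = joint.sum(axis=0)               # marginal of Y
    h_y = entropy(py)                    # prior uncertainty about Y
    # expected conditional entropy E_X[H(Y | X)]
    h_y_given_x = sum(
        px[i] * entropy(joint[i] / px[i])
        for i in range(len(px)) if px[i] > 0
    )
    return h_y - h_y_given_x

# Hypothetical 2x2 table: a predictor strongly associated with the response.
counts = [[40, 10], [10, 40]]
imp = information_importance(counts)
```

The measure is zero when predictor and response are independent and, being built from entropies of probability distributions, is unchanged by any one-to-one relabeling of the categories, matching the invariance property stated in the abstract.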
