| Article ID | Journal ID | Publication year | English article | Full-text version |
|---|---|---|---|---|
| 1145521 | 1489669 | 2014 | 19-page PDF | Free download |
• We show that the popular association indices fail to detect and rank dependence.
• The popular association indices under-represent the dependence of elliptical models.
• The mutual information detects and ranks dependence of absolutely continuous models.
• The mutual information measures the utility of dependence between random variables.
• We use a generalized information index to rank dependence of singular distributions.
This paper first illustrates that a mutual information index detects and ranks dependence of a wide variety of absolutely continuous families, but the popular association and variance reduction indices fail to serve as such “common metrics”. We then elaborate on some theoretical merits of the mutual information and give several results. The mutual information provides a notion of the utility of dependence for predicting random variables and quantifies how much the joint distribution is more informative about the variables than the independent model. We present insightful partitions of dependence among the components of a random vector, for a class of models recently proposed for dependence of uncorrelated variables, and for the elliptical families. We also recall that the mutual information is not applicable to singular distributions and give some results for a generalized information index for these models. The generalized index is derived for the Marshall–Olkin copula and for a new singular copula that represents the dependence of the consecutive terms of the exponential autoregressive and related processes.
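The abstract's central contrast — that correlation-type association indices can miss dependence that mutual information detects and ranks — can be illustrated with a small numerical sketch. This is not code from the paper; the closed form `I(X, Y) = -(1/2) log(1 - ρ²)` for the bivariate normal is standard, and the histogram plug-in estimator below is only a crude illustrative device (the bin count and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mi(rho):
    # Bivariate normal: mutual information (in nats) is a closed form
    # that increases monotonically with |rho| and is 0 iff rho = 0,
    # so it ranks dependence within this family.
    return -0.5 * np.log(1.0 - rho**2)

def mi_histogram(x, y, bins=30):
    # Crude plug-in estimate of I(X, Y) from a 2-D histogram;
    # sufficient here just to *detect* dependence, not to estimate it well.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Uncorrelated but strongly dependent pair: Y = X**2 with X standard normal.
x = rng.standard_normal(200_000)
y = x**2
corr = np.corrcoef(x, y)[0, 1]   # near 0: correlation misses the dependence
mi = mi_histogram(x, y)          # clearly positive: mutual information detects it
```

The `Y = X²` pair has zero correlation by symmetry yet is fully dependent, which is exactly the kind of case where association indices fail as a "common metric" while an information index does not.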
Journal: Journal of Multivariate Analysis - Volume 131, October 2014, Pages 32–50