Article ID: 1146367
Journal: Journal of Multivariate Analysis
Published Year: 2011
Pages: 12
File Type: PDF
Abstract

This paper addresses the problem of estimating the density of a future outcome from a multivariate normal model. We propose a class of empirical Bayes predictive densities and evaluate their performance under Kullback–Leibler (KL) divergence. We show that, under some general conditions, these empirical Bayes predictive densities dominate the Bayesian predictive density under the uniform prior and are therefore minimax. We also establish the asymptotic optimality of these empirical Bayes predictive densities in infinite-dimensional parameter spaces through an oracle inequality.
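As a rough illustration of the evaluation criterion only (not the paper's estimator), the following sketch computes the KL divergence between two multivariate normal densities using the standard closed form; the function name `kl_mvn` and the toy inputs are assumptions for this example.

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL(N(mu0, S0) || N(mu1, S1)) for multivariate normals."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)      # trace term
                  + diff @ S1_inv @ diff     # Mahalanobis term
                  - d                        # dimension correction
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))  # log-det ratio

# Identical densities give zero divergence; any mismatch gives a positive value.
mu, S = np.zeros(3), np.eye(3)
print(kl_mvn(mu, S, mu, S))  # → 0.0
```

In the paper's setting, the predictive density's quality is measured by the KL divergence from the true future-outcome density, averaged over the sampling distribution; this snippet only shows the divergence itself for the Gaussian case.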

Related Topics
Physical Sciences and Engineering › Mathematics › Numerical Analysis