Article ID: 1154450
Journal: Statistics & Probability Letters
Published Year: 2007
Pages: 13
File Type: PDF
Abstract

Suppose that X is a random vector with probability distribution P, and suppose that 𝒫 denotes a proposed model that involves interesting parameters and relationships between variables. We consider statistical inference procedures for the case where P ∉ 𝒫, constructed as follows: let θ(P) denote the parameter of the distribution Q ∈ 𝒫 that minimizes a Kullback–Leibler (K–L)-type discrepancy K(Q, P) between Q and P. We take θ(P) to be the parameter of interest. The estimate of θ(P), when it exists, is defined by θ^ = θ(P^), where P^ is the empirical probability. We call θ(P^) a Kullback–Leibler empirical projection (KLEP). When θ(P^) does not exist, we extend the concept of a K–L discrepancy to limits of empirical likelihoods to obtain KLEP procedures. Properties of inference procedures based on θ^ are considered when P ∉ 𝒫. In particular, we compare the naive procedure that uses the standard error applicable when P ∈ 𝒫, the sandwich-formula standard error, and the bootstrap standard error, using asymptotic methods and Monte Carlo simulation. For regression experiments with a model based on transforming both the response and the covariates, we use results of Hernandez and Johnson [1980. The large-sample behavior of transformations to normality. J. Amer. Statist. Assoc. 75, 855–861] to derive KLEP procedures.
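The projection parameter described above is θ(P) = argmin_{Q ∈ 𝒫} K(Q, P), estimated by the plug-in θ^ = θ(P^). The abstract's three-way comparison of standard errors under misspecification can be illustrated with a minimal simulation sketch; the code below is not from the paper. It uses a standard instance of a K–L projection: ordinary least squares, which projects the true law onto a homoscedastic linear-Gaussian working model, fitted to data with heteroscedastic errors so that P ∉ 𝒫. All function names and the data-generating process are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ols(x, y):
    """OLS fit: the plug-in K-L projection of P^ onto the
    linear homoscedastic-Gaussian working model."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ beta
    return X, beta, resid

def naive_se(X, resid):
    """Standard error that is valid only if P is in the model
    (i.e., errors really are homoscedastic)."""
    n, p = X.shape
    s2 = resid @ resid / (n - p)
    return np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

def sandwich_se(X, resid):
    """Huber-White sandwich standard error: bread * meat * bread,
    consistent for the projection parameter even when P is not in 𝒫."""
    bread = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * resid[:, None] ** 2)   # sum of x_i x_i' e_i^2
    return np.sqrt(np.diag(bread @ meat @ bread))

def bootstrap_se(x, y, B=500):
    """Nonparametric (pairs) bootstrap standard error."""
    n = len(x)
    betas = []
    for _ in range(B):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        _, b, _ = fit_ols(x[idx], y[idx])
        betas.append(b)
    return np.std(betas, axis=0, ddof=1)

# Deliberately misspecified experiment: error sd grows with x,
# so the homoscedastic working model does not contain P.
n = 400
x = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 + x, n)

X, beta, resid = fit_ols(x, y)
print("slope estimate:", beta[1])
print("naive SE    :", naive_se(X, resid)[1])
print("sandwich SE :", sandwich_se(X, resid)[1])
print("bootstrap SE:", bootstrap_se(x, y)[1])
```

In this setting the naive standard error is generally inconsistent for the slope's true sampling variability, while the sandwich and bootstrap standard errors remain consistent for the projection parameter and tend to agree for large n, mirroring the comparison the abstract describes.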

Related Topics
Physical Sciences and Engineering › Mathematics › Statistics and Probability
Authors