Article ID: 10526267
Journal: Statistics & Probability Letters
Published Year: 2005
Pages: 11
File Type: PDF
Abstract
Bootstrap methods are attractive empirical procedures for assessment of errors in problems of statistical estimation, and allow highly accurate inference in a vast range of parametric problems. Conventional parametric bootstrapping involves sampling from a fitted parametric model, obtained by substituting the maximum likelihood estimator for the unknown population parameter. Recently, attention has focussed on modified bootstrap methods which alter the sampling model used in the bootstrap calculation, in a systematic way that is dependent on the parameter of interest. Typically, inference is required for the interest parameter in the presence of a nuisance parameter, in which case the issue of how best to handle the nuisance parameter in the bootstrap inference arises. In this paper, we provide a general analysis of the error reduction properties of the parametric bootstrap. We show that conventional parametric bootstrapping succeeds in reducing error quite generally, when applied to an asymptotically normal pivot, and demonstrate further that systematic improvements are obtained by a particular form of modified scheme, in which the nuisance parameter is substituted by its constrained maximum likelihood estimator, for a given value of the parameter of interest.
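As a concrete illustration (not drawn from the paper itself), the sketch below contrasts the two schemes described above for the simple case of a normal mean with the variance as nuisance parameter: conventional parametric bootstrapping samples from the model fitted by the unconstrained maximum likelihood estimator, while the modified scheme fixes the interest parameter at its hypothesised value and substitutes the constrained maximum likelihood estimator of the variance. The model, function names, and data are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)


def studentized_pivot(x, mu):
    """Asymptotically normal pivot: sqrt(n) * (xbar - mu) / s."""
    n = len(x)
    return np.sqrt(n) * (x.mean() - mu) / x.std(ddof=1)


def bootstrap_pvalue(x, mu0, B=2000, constrained=True):
    """Parametric bootstrap p-value for H0: mu = mu0 under a normal model.

    constrained=True substitutes the nuisance parameter (the variance) by its
    constrained MLE given mu = mu0 (the modified scheme); constrained=False
    uses the unconstrained MLE, i.e. conventional parametric bootstrapping.
    """
    n = len(x)
    t_obs = studentized_pivot(x, mu0)
    if constrained:
        # constrained MLE of sigma^2 with the interest parameter fixed at mu0
        sigma2 = np.mean((x - mu0) ** 2)
    else:
        # unconstrained MLE of sigma^2 (conventional scheme)
        sigma2 = np.mean((x - x.mean()) ** 2)
    # resample from the fitted parametric model and recompute the pivot
    t_star = np.empty(B)
    for b in range(B):
        x_star = rng.normal(mu0, np.sqrt(sigma2), size=n)
        t_star[b] = studentized_pivot(x_star, mu0)
    return np.mean(np.abs(t_star) >= np.abs(t_obs))


# Hypothetical usage: compare the two schemes on simulated data.
x = rng.normal(loc=0.3, scale=2.0, size=25)
print(bootstrap_pvalue(x, mu0=0.0, constrained=False))  # conventional bootstrap
print(bootstrap_pvalue(x, mu0=0.0, constrained=True))   # constrained-MLE scheme
```

In this toy setting the two schemes differ only in the variance used for resampling; the paper's analysis concerns the orders of error reduction such choices achieve in general parametric problems.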
Related Topics
Physical Sciences and Engineering; Mathematics; Statistics and Probability
Authors