Article ID: 416889
Journal: Computational Statistics & Data Analysis
Published Year: 2011
Pages: 17
File Type: PDF
Abstract

It is now well known that the minimum Hellinger distance (MHD) estimation approach introduced by Beran (Beran, R., 1977. Minimum Hellinger distance estimators for parametric models. Ann. Statist. 5, 445–463) produces estimators that are efficient at the model density and simultaneously have excellent robustness properties. However, computational difficulties and algorithmic convergence problems associated with this method have hampered its application in practice, particularly for models with high-dimensional parameter spaces. This paper investigates a one-step MHD procedure that overcomes the computational drawbacks of the fully iterative MHD method. The idea is to start with an initial estimator and then perform a single Newton–Raphson iteration on the Hellinger distance objective; the resulting estimator is the one-step MHD estimator. We show that the proposed one-step MHD estimator has the same asymptotic behavior as the fully iterative MHD estimator, provided the initial estimator is reasonably good. Furthermore, our theoretical and numerical studies demonstrate that the one-step MHD estimator retains the excellent robustness properties of MHD estimators. A real data example is analyzed as well.

► An efficient one-step minimum Hellinger distance (MHD) procedure is investigated. ► The asymptotic properties of the estimator are the same as those of the fully iterative one. ► The proposed estimator also retains excellent robustness properties. ► A numerical study and a real data example are given as well.
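
The one-step update described in the abstract can be sketched in a few lines. The following Python sketch assumes a normal location-scale model, a Gaussian kernel density estimate for the data, and finite-difference derivatives of the Hellinger objective; all names, the integration grid, and the step sizes are illustrative assumptions, not the authors' implementation.

# Hedged sketch of the one-step MHD update for a normal location-scale model.
# Everything here (function names, grid, bandwidth, step size eps) is an
# illustrative assumption, not the authors' implementation.
import numpy as np
from scipy.stats import norm, gaussian_kde

def hellinger_objective(theta, kde, grid):
    # Squared Hellinger distance between N(mu, sigma) and the kernel density
    # estimate, approximated by a Riemann sum on a fixed grid.
    mu, sigma = theta
    f = norm.pdf(grid, loc=mu, scale=sigma)   # parametric model density f_theta
    g = kde(grid)                             # nonparametric density estimate
    return np.sum((np.sqrt(f) - np.sqrt(g)) ** 2) * (grid[1] - grid[0])

def one_step_mhd(data, theta0, eps=1e-3):
    # One Newton-Raphson iteration of the Hellinger objective, started from a
    # (preferably robust) initial estimate theta0 = (mu0, sigma0).
    kde = gaussian_kde(data)
    grid = np.linspace(data.min() - 3 * data.std(), data.max() + 3 * data.std(), 400)
    obj = lambda t: hellinger_objective(t, kde, grid)
    d = len(theta0)
    grad = np.zeros(d)
    hess = np.zeros((d, d))
    for i in range(d):
        ei = np.eye(d)[i] * eps
        grad[i] = (obj(theta0 + ei) - obj(theta0 - ei)) / (2 * eps)
        for j in range(d):
            ej = np.eye(d)[j] * eps
            hess[i, j] = (obj(theta0 + ei + ej) - obj(theta0 + ei - ej)
                          - obj(theta0 - ei + ej) + obj(theta0 - ei - ej)) / (4 * eps ** 2)
    # Single Newton-Raphson update: theta1 = theta0 - H^{-1} grad.
    return theta0 - np.linalg.solve(hess, grad)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1, 95), rng.normal(8, 1, 5)])   # 5% outliers
    theta0 = np.array([np.median(x), 1.4826 * np.median(np.abs(x - np.median(x)))])
    print("initial (median, MAD):", theta0)
    print("one-step MHD estimate:", one_step_mhd(x, theta0))

Starting from the median and normalized MAD (robust but inefficient initial estimates), the single Newton–Raphson step should move the estimate toward the MHD solution while remaining insensitive to the injected outliers; this is only a toy illustration of the idea, not a reproduction of the paper's numerical study.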

Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics