Article ID: 10370398
Journal: Signal Processing
Published Year: 2005
Pages: 13
File Type: PDF
Abstract
In regression problems where the density f of the errors is not known, maximum likelihood is inapplicable, and the use of alternative techniques such as least squares or robust M-estimation generally implies inefficient estimation of the parameters. The search for adaptive estimators, that is, estimators that remain asymptotically efficient even when f is unknown, has received much attention; see in particular (Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, 1956, p. 187; Ann. Stat. 3(2) (1975) 267; Ann. Stat. 10 (1982) 647) and the review paper (Econometric Rev. 3(2) (1984) 145). This paper considers a minimum-entropy parametric estimator that minimizes an estimate of the entropy of the distribution of the residuals. A first construction connects the method with the Stone–Bickel approach, in which the estimation is decomposed into two steps. We then consider a direct approach that does not involve any preliminary √n-consistent estimator. Results are given that illustrate the good performance of minimum-entropy estimation for reasonable sample sizes compared with standard methods, in particular concerning robustness in the presence of outliers.
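As a rough illustration of the idea only (not the estimator studied in the paper), the sketch below fits linear regression parameters by minimizing a plug-in entropy estimate of the residuals; the Gaussian kernel, the fixed bandwidth, the least-squares starting point, and the Nelder-Mead optimizer are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def residual_entropy(theta, X, y, bandwidth=0.5):
    """Plug-in (kernel) estimate of the entropy of the residual distribution.

    The Gaussian kernel and fixed bandwidth are illustrative assumptions,
    not the entropy estimator analysed in the paper.
    """
    e = y - X @ theta                                  # residuals for candidate parameters
    diff = (e[:, None] - e[None, :]) / bandwidth
    dens = np.exp(-0.5 * diff**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))
    return -np.mean(np.log(dens + 1e-12))              # sample estimate of -E[log f(e)]

def minimum_entropy_fit(X, y):
    """Minimize the estimated residual entropy, starting from least squares."""
    theta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)   # preliminary estimate
    res = minimize(residual_entropy, theta_ls, args=(X, y), method="Nelder-Mead")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    theta_true = np.array([1.0, 2.0])
    errors = rng.standard_t(df=3, size=n)              # heavy-tailed, non-Gaussian errors
    y = X @ theta_true + errors
    print("minimum-entropy estimate:", minimum_entropy_fit(X, y))
```

With heavy-tailed errors such as the Student-t sample above, minimizing the residual entropy tends to downweight outlying observations relative to least squares, which is consistent with the robustness behaviour described in the abstract.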
Related Topics
Physical Sciences and Engineering; Computer Science; Signal Processing
Authors