Article ID: 418153
Journal: Computational Statistics & Data Analysis
Published Year: 2007
Pages: 10
File Type: PDF
Abstract

We consider Nadaraya–Watson type estimators for binary regression functions. We propose a method for improving the performance of such estimators by employing bias reduction techniques when estimating the constituent probability densities. Direct substitution of separately optimized density estimates into the regression function formula generates disappointing results in practice. However, adjusting the global smoothing parameter to optimize a performance criterion for the binary regression function itself is more promising. We focus on an implementation of this approach that uses a variable kernel technique to provide reduced-bias density estimates, and where the global bandwidth is selected by an appropriately tailored leave-one-out (cross-validation) method. Theory and numerical experiments show that this form of bias reduction improves performance substantially when the underlying regression function is highly non-linear, but is not beneficial when it is almost linear in form.
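The abstract does not include code, but the general construction it describes can be illustrated with a minimal Python sketch: the regression function P(Y=1 | X=x) is estimated as a ratio of class-conditional density estimates, the densities use an adaptive (variable) kernel as the bias-reduction device, and a single global bandwidth is chosen by a leave-one-out squared-error criterion on the regression function itself. This is only an illustration under stated assumptions (Gaussian kernel, Abramson's square-root variable-bandwidth rule, squared-error cross-validation); the function names are hypothetical and the details need not match the authors' implementation.

```python
import numpy as np

def gauss(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def pilot_density(x_eval, data, h):
    """Ordinary fixed-bandwidth kernel density estimate (the pilot)."""
    u = (x_eval[:, None] - data[None, :]) / h
    return gauss(u).mean(axis=1) / h

def adaptive_density(x_eval, data, h):
    """Variable-kernel density estimate with Abramson's square-root rule,
    a standard bias-reduction device (assumed here for illustration)."""
    f_pilot = np.maximum(pilot_density(data, data, h), 1e-12)
    g = np.exp(np.mean(np.log(f_pilot)))        # geometric-mean normalisation
    h_i = h * np.sqrt(g / f_pilot)              # per-point bandwidths
    u = (x_eval[:, None] - data[None, :]) / h_i[None, :]
    return (gauss(u) / h_i[None, :]).mean(axis=1)

def binary_regression(x_eval, x, y, h):
    """Estimate P(Y=1 | X=x) as a ratio of class-conditional density
    estimates weighted by the class proportions."""
    x1, x0 = x[y == 1], x[y == 0]
    p1, p0 = len(x1) / len(x), len(x0) / len(x)
    f1 = adaptive_density(x_eval, x1, h)
    f0 = adaptive_density(x_eval, x0, h)
    return p1 * f1 / (p1 * f1 + p0 * f0 + 1e-12)

def loo_cv_bandwidth(x, y, grid):
    """Choose the single global bandwidth by leave-one-out squared error
    computed on the regression function itself."""
    scores = []
    for h in grid:
        errs = []
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            p_i = binary_regression(x[i:i + 1], x[mask], y[mask], h)[0]
            errs.append((y[i] - p_i) ** 2)
        scores.append(np.mean(errs))
    return grid[int(np.argmin(scores))]
```

A typical use would draw covariates x with binary labels y, call loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 20)) to pick h, and then evaluate binary_regression on a grid of x values; the key point mirrored from the abstract is that the bandwidth is tuned for the regression function as a whole rather than for each density separately.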

Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors