Article ID: 4607902
Journal: Journal of Approximation Theory
Published Year: 2010
Pages: 16
File Type: PDF
Abstract

Moving least-squares (MLS) is an approximation method used in data interpolation, numerical analysis and statistics. In this paper we consider the MLS method in learning theory for the regression problem. Essential differences between MLS and other common learning algorithms are pointed out: the lack of a natural uniform bound for the estimators, and the pointwise definition of the method. The sample error is estimated in terms of the weight function and the finite-dimensional hypothesis space. The approximation error is treated for two special cases, for which convergence rates are provided for the total L2 error measuring the global approximation on the whole domain.
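
The abstract describes MLS regression as a pointwise-defined, locally weighted fit over a finite-dimensional hypothesis space. The following is a minimal sketch of that idea, not the paper's exact formulation: the Gaussian weight function, polynomial basis, bandwidth, and the function mls_estimate are illustrative assumptions.

```python
import numpy as np

def mls_estimate(x, X, Y, degree=1, bandwidth=0.2):
    """Moving least-squares estimate at a single query point x (1-D inputs).

    A sketch under assumed choices: Gaussian weights and a polynomial basis
    of the given degree as the finite-dimensional hypothesis space.
    """
    # Weight function centred at the query point (assumed Gaussian).
    w = np.exp(-((X - x) ** 2) / (2.0 * bandwidth ** 2))
    # Design matrix for the polynomial hypothesis space (increasing powers).
    B = np.vander(X, N=degree + 1, increasing=True)
    # Weighted least squares: minimise sum_i w_i (p(x_i) - y_i)^2.
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * B, sw * Y, rcond=None)
    # The estimator is defined pointwise: evaluate the local fit at x only.
    return np.polynomial.polynomial.polyval(x, coef)

# Usage: noisy samples of a sine function on [0, 1], evaluated on a grid.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 200)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(200)
grid = np.linspace(0.0, 1.0, 5)
print([round(mls_estimate(t, X, Y, degree=2), 3) for t in grid])
```

Because the fit is recomputed at every query point, the resulting estimator need not admit a natural uniform bound, which is one of the differences from standard kernel-based learning algorithms noted in the abstract.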

Related Topics
Physical Sciences and Engineering > Mathematics > Analysis