Article code: 1146124
Journal code: 1489694
Publication year: 2012
English article: 26-page PDF
Full text: free download
English title of the ISI article
Asymptotic normality of support vector machine variants and other regularized kernel methods
Related topics
Engineering and Basic Sciences, Mathematics, Numerical Analysis
English abstract

In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in theoretical and in applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions $L$, it is shown that the difference between the estimator, i.e. the empirical SVM $f_{L,D_n,\lambda_{D_n}}$, and the theoretical SVM $f_{L,P,\lambda_0}$ is asymptotically normal with rate $\sqrt{n}$. That is, $\sqrt{n}\,(f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0})$ converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter $\lambda_{D_n}$ in $f_{L,D_n,\lambda_{D_n}}$ may depend on the data. The proof is done by an application of the functional delta-method and by showing that the SVM functional $P \mapsto f_{L,P,\lambda}$ is suitably Hadamard-differentiable.
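
As a minimal sketch of the setup assumed by this notation (not spelled out in the listing itself), the LaTeX fragment below restates the usual definitions of the theoretical and empirical SVM and the asymptotic normality statement from the abstract. The names $H$ for the reproducing kernel Hilbert space, $D_n = ((x_1,y_1),\dots,(x_n,y_n))$ for the sample, and $\mathbb{H}_P$ for the limiting Gaussian process are introduced here for illustration only.

% Sketch of the standard definitions behind the abstract's notation.
% Assumed setup: H, D_n and \mathbb{H}_P are illustrative names, not taken from the listing.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Theoretical SVM: population regularized risk minimizer over the RKHS H
\[
  f_{L,P,\lambda}
    = \operatorname*{arg\,min}_{f \in H}
      \; \mathbb{E}_{P}\!\left[ L\bigl(X, Y, f(X)\bigr) \right]
      + \lambda \, \lVert f \rVert_{H}^{2}
\]

% Empirical SVM: the same criterion with P replaced by the empirical measure of D_n
\[
  f_{L,D_n,\lambda}
    = \operatorname*{arg\,min}_{f \in H}
      \; \frac{1}{n} \sum_{i=1}^{n} L\bigl(x_i, y_i, f(x_i)\bigr)
      + \lambda \, \lVert f \rVert_{H}^{2}
\]

% Asymptotic normality as stated in the abstract: weak convergence in H to a
% Gaussian process, with a possibly data-dependent regularization parameter
% \lambda_{D_n} and its limit \lambda_0.
\[
  \sqrt{n}\,\bigl( f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0} \bigr)
    \;\rightsquigarrow\; \mathbb{H}_{P}
  \quad \text{in } H .
\]

\end{document}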

Publisher
Database: Elsevier - ScienceDirect
Journal: Journal of Multivariate Analysis - Volume 106, April 2012, Pages 92–117
Authors