Article ID | Journal ID | Year | English article | Full text |
---|---|---|---|---|
4605635 | 1337588 | 2006 | 7-page PDF | Free download |

The convergence of the discrete graph Laplacian to the continuous manifold Laplacian, in the limit where the sample size N→∞ while the kernel bandwidth ε→0, is the justification for the success of Laplacian-based algorithms in machine learning, such as dimensionality reduction, semi-supervised learning, and spectral clustering. In this paper we improve the convergence rate of the variance term recently obtained by Hein et al. [From graphs to manifolds—Weak and strong pointwise consistency of graph Laplacians, in: P. Auer, R. Meir (Eds.), Proc. 18th Conf. Learning Theory (COLT), Lecture Notes Comput. Sci., vol. 3559, Springer-Verlag, Berlin, 2005, pp. 470–485], improve the bias term error, and derive an optimal criterion for choosing the parameter ε given N.
Journal: Applied and Computational Harmonic Analysis - Volume 21, Issue 1, July 2006, Pages 128-134
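The convergence described in the abstract can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's estimator or its rates: it builds a random-walk graph Laplacian from a Gaussian kernel on points of the unit circle (a deterministic grid, for clarity; the paper analyzes random samples, where the variance term appears) and checks that, for small ε, it approximates the Laplace–Beltrami operator applied to a smooth test function.

```python
import numpy as np

# Illustrative sketch (not the paper's exact construction): a random-walk
# graph Laplacian built from a Gaussian kernel approximates the
# Laplace-Beltrami operator on the unit circle. The grid, bandwidth, and
# kernel normalization below are assumptions chosen for the demo.
N = 1000
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])  # points on S^1 in R^2

eps = 0.01  # kernel bandwidth; the paper asks how best to choose eps given N
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared distances
W = np.exp(-d2 / (4.0 * eps))           # Gaussian affinities
P = W / W.sum(axis=1, keepdims=True)    # random-walk normalization D^{-1} W

f = np.cos(theta)                       # smooth test function on the circle
Lf = (P @ f - f) / eps                  # graph-Laplacian estimate of Delta f
true = -np.cos(theta)                   # (d^2/dtheta^2) cos(theta) = -cos(theta)

# Up to an O(eps) bias, Lf should track the true Laplacian closely:
corr = np.corrcoef(Lf, true)[0, 1]
print(round(corr, 4))
```

Shrinking ε reduces the bias visible here, while on random samples it inflates the variance term; the trade-off between the two is what fixes the optimal ε as a function of N.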