| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 10151151 | Neurocomputing | 2018 | 24 Pages | |
Abstract
The massive amount of data potentially available for discovering patterns in machine learning poses a challenge for kernel-based algorithms with respect to runtime and storage capacity. Local approaches can help to relieve these issues. From a statistical point of view, local approaches additionally allow different structures in the data to be handled in different ways. This paper analyses properties of localized kernel-based, non-parametric statistical machine learning methods, in particular of support vector machines (SVMs) and methods close to them. We show that locally learnt kernel methods are universally consistent. Furthermore, we give an upper bound for the maxbias in order to show the statistical robustness of the proposed method.
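To illustrate the general idea of local learning described in the abstract (not the authors' specific construction), the sketch below partitions the input space into regions and trains one SVM per region, so that each kernel machine only ever sees a subsample. The clustering step, the choice of k-means, and all parameter values are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# Hypothetical illustration of localized SVM learning: split the data into
# local regions, then train an independent SVM on each region's subsample.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (np.linalg.norm(X, axis=1) > 1.0).astype(int)  # synthetic labels

# Step 1: partition the input space (here via k-means; an assumption,
# not the partitioning scheme of the paper).
k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

# Step 2: fit one kernel SVM per region on that region's data only.
local_svms = {}
for j in range(k):
    mask = km.labels_ == j
    local_svms[j] = SVC(kernel="rbf").fit(X[mask], y[mask])

# Step 3: predict by routing each new point to the SVM of its region.
def predict(Xnew):
    regions = km.predict(Xnew)
    return np.array([local_svms[r].predict(x.reshape(1, -1))[0]
                     for r, x in zip(regions, Xnew)])

acc = (predict(X) == y).mean()
```

Because each SVM is fitted on roughly n/k points, the cost of kernel-matrix computation and storage drops from O(n^2) to about k·O((n/k)^2), which is the runtime/storage relief the abstract refers to.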
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Florian Dumpert, Andreas Christmann