Article ID: 535946
Journal: Pattern Recognition Letters
Published Year: 2011
Pages: 12
File Type: PDF
Abstract

A misconception sometimes encountered in the literature holds that for nearest-neighbors algorithms there is no fixed hypothesis class of limited Vapnik–Chervonenkis (VC) dimension. In this paper a simple reformulation (not a modification) of the nearest-neighbors algorithm is shown, in which, instead of a natural number k, a percentage α ∈ (0, 1) of nearest neighbors is used. Owing to this reformulation, one can construct sets of approximating functions, which we prove to have finite VC dimension. In a special (but practical) case this dimension equals ⌊2/α⌋. It is then also possible to form a sequence of sets of functions with increasing VC dimension, and to perform complexity selection via cross-validation or in a manner similar to the structural risk minimization framework. Results of such experiments are also presented.
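A minimal sketch may make the reformulation concrete: instead of a fixed integer k, the classifier derives the neighbor count from the fraction α and the current training-set size. The helper below is our illustration, not the paper's code; the function name, the Euclidean metric, and the rounding rule k = ⌈αn⌉ are assumptions made here for the sketch.

```python
import numpy as np

def alpha_nn_predict(X_train, y_train, x_query, alpha):
    """Classify x_query by majority vote among the ceil(alpha * n)
    nearest training points (hypothetical helper illustrating the
    alpha-NN idea: a fraction alpha in (0, 1) replaces a fixed k)."""
    n = len(X_train)
    k = max(1, int(np.ceil(alpha * n)))  # fraction -> neighbor count
    # Euclidean distances from the query to all training points
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]  # indices of the k nearest points
    # majority vote among the selected neighbors
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Usage: with four training points and alpha = 0.5, the vote is taken
# over the two nearest neighbors.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(alpha_nn_predict(X, y, np.array([0.2, 0.1]), alpha=0.5))  # -> 0
```

Note that decreasing α increases ⌊2/α⌋, so varying α yields a sequence of function sets of growing capacity, which is what enables the complexity selection described above.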

► Reformulation of the k-NN algorithm to the alpha-NN∗ algorithm (where alpha is a fraction).
► Establishing sets of functions for alpha-NN∗ with finite capacity.
► Pointing out degrees of freedom for these sets.
► Proving theorems about dichotomies and VC dimension for the proposed sets.

Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors