Article ID Journal Published Year Pages File Type
407020 Neurocomputing 2014 8 Pages PDF
Abstract

One-class classification problems typically contain a large number of target samples and very few outliers, so the sample-reduction methods designed for two-class or multi-class classification cannot be applied directly. This paper presents a novel method to reduce the size of the training set for one-class classification. The method keeps only the samples lying near the boundary of the data distribution, since these are the samples that may become support vectors. For each sample, it sums, over the sample's k nearest neighbors, the cosine of the angle between the vector from the sample to each neighbor and the vector from the sample to the mean of those neighbors. For a sample near the boundary this sum is close to k (the number of neighbors); for a sample inside the data distribution it is close to 0. Experimental results demonstrate that the proposed method reduces the size of the training set, yielding faster training and fewer support vectors in the one-class SVM model, without deteriorating performance.
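The cosine-sum criterion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the brute-force nearest-neighbor search, and the choice of NumPy are assumptions made for clarity.

```python
import numpy as np

def cosine_sum_scores(X, k):
    """Score each sample by summing, over its k nearest neighbors, the
    cosine of the angle between (neighbor - sample) and
    (mean of neighbors - sample).  A score near k suggests a boundary
    sample (neighbors lie on one side); near 0 suggests an interior
    sample (neighbors surround it)."""
    n = len(X)
    scores = np.empty(n)
    for i in range(n):
        # Brute-force k nearest neighbors, excluding the sample itself.
        dists = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dists)[1:k + 1]
        diffs = X[idx] - X[i]              # vectors sample -> neighbor
        m = diffs.mean(axis=0)             # vector sample -> neighbor mean
        m_norm = np.linalg.norm(m)
        if m_norm == 0:                    # neighbors cancel out: interior
            scores[i] = 0.0
            continue
        cosines = diffs @ m / (np.linalg.norm(diffs, axis=1) * m_norm)
        scores[i] = cosines.sum()
    return scores
```

Training-set reduction would then keep only samples whose score exceeds some threshold (e.g. a fraction of k) before fitting the one-class SVM; the abstract does not specify the threshold used.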

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence