Article ID: 403480
Journal: Knowledge-Based Systems
Published Year: 2015
Pages: 9
File Type: PDF
Abstract

• We propose two new SVDD models which improve robustness to noise.
• Cutoff distance-based local density can mitigate the effect of noise on SVDD.
• The tolerated gap of SVDD with the ε-insensitive loss can improve generalization performance.

As a one-class classification method, support vector data description (SVDD) offers an opportunity to improve outlier detection performance and to reduce the loss caused by outliers in many real-world applications. However, because outliers are scarce, the SVDD model is built using only normal data. In this situation, SVDD can easily overfit when the normal data contain noise or uncertainty. This paper presents two new SVDD methods, named R-SVDD and εNR-SVDD, which are constructed by introducing a cutoff distance-based local density for each data sample and an ε-insensitive loss function with negative samples, respectively. Extensive experiments on ten UCI datasets demonstrate that the proposed methods improve the robustness of SVDD for data with noise or uncertainty. The experimental results show that the proposed εNR-SVDD is superior to other existing outlier detection methods in terms of detection rate and false alarm rate. Meanwhile, the proposed R-SVDD also achieves better outlier detection performance using only normal data. Finally, the proposed methods are successfully applied to detect image-based conveyor belt faults.
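A minimal, hypothetical sketch of the cutoff distance-based local density idea follows. It uses scikit-learn's OneClassSVM with an RBF kernel as a common stand-in for SVDD rather than the paper's R-SVDD or εNR-SVDD formulations; the choice of cutoff distance d_c (a low percentile of pairwise distances), the density definition, and the density-to-weight mapping are illustrative assumptions, not the authors' method.

# Illustrative sketch only: density-weighted one-class fitting as a rough
# analogue of the R-SVDD idea. OneClassSVM with an RBF kernel stands in for
# SVDD; d_c, the density definition, and the weighting are assumptions.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                    # "normal" training data
X[:10] += rng.normal(scale=4.0, size=(10, 2))    # inject a few noisy samples

# Cutoff distance d_c: here taken as a low percentile of all pairwise distances.
D = pairwise_distances(X)
d_c = np.percentile(D[D > 0], 5)

# Cutoff distance-based local density: neighbors closer than d_c (self excluded).
rho = (D < d_c).sum(axis=1) - 1

# Low-density (likely noisy) samples get smaller weight in the fit.
w = (rho + 1) / (rho.max() + 1)

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
clf.fit(X, sample_weight=w)

X_test = rng.normal(size=(50, 2))
print("inlier fraction on clean test data:", (clf.predict(X_test) == 1).mean())

The εNR-SVDD variant additionally uses an ε-insensitive loss together with the available negative samples; that formulation is not reproduced in this sketch.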

Related Topics
Physical Sciences and Engineering / Computer Science / Artificial Intelligence