Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
4947631 | Neurocomputing | 2017 | 12 Pages |
Abstract
Support vector data description (SVDD) is a leading classification method for novelty detection, which minimizes the volume of a spherically shaped decision boundary around the normal class. While SVDD has achieved promising performance, it can lead to a loose boundary on multivariate datasets whose input dimensions are correlated. Inspired by the relationship between kernel principal component analysis (kernel PCA) and the best-fit ellipsoid for a dataset, this study proposes the ellipsoidal data description (ELPDD), which adaptively accounts for the feature variance of each dimension. A minimum volume enclosing ellipsoid (MVEE) is constructed around the target data in the kernel PCA subspace, and it can be learned via an SVM-like objective function with a log-determinant penalty. We also provide the Rademacher complexity bound for our model. Several related problems are investigated in detail. Experiments on artificial and real-world datasets validate the effectiveness of our method.
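The abstract's ELPDD learns the ellipsoid in a kernel PCA subspace via an SVM-like objective with a log-determinant penalty; that formulation is not reproduced here. As a simpler illustration of the underlying MVEE concept only, the following is a minimal sketch of Khachiyan's classical algorithm for the minimum volume enclosing ellipsoid in input space. The function name `mvee` and the tolerance parameter are our own choices, not from the paper.

```python
import numpy as np

def mvee(points, tol=1e-5):
    """Minimum volume enclosing ellipsoid via Khachiyan's algorithm (a sketch,
    not the paper's kernel-space ELPDD). Returns center c and shape matrix A
    such that (x - c)^T A (x - c) <= 1 for all rows x of `points`."""
    n, d = points.shape
    # Lift points to homogeneous coordinates: Q is (d+1, n).
    Q = np.vstack([points.T, np.ones(n)])
    u = np.ones(n) / n  # weights over points, updated multiplicatively
    err = tol + 1.0
    while err > tol:
        X = Q @ np.diag(u) @ Q.T                       # weighted scatter, (d+1, d+1)
        M = np.einsum('ij,ji->i', Q.T, np.linalg.solve(X, Q))  # q_i^T X^{-1} q_i
        j = int(np.argmax(M))                          # most "violating" point
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = points.T @ u                                   # ellipsoid center
    S = points.T @ np.diag(u) @ points - np.outer(c, c)
    A = np.linalg.inv(S) / d                           # shape matrix
    return c, A

# Usage: enclose a random 2-D point cloud and check containment.
rng = np.random.default_rng(0)
pts = rng.normal(size=(50, 2))
c, A = mvee(pts)
vals = np.einsum('ij,jk,ik->i', pts - c, A, pts - c)
print(vals.max())  # close to 1: the ellipsoid is tight around the data
```

The paper's contribution is to perform this kind of volume minimization in the kernel PCA subspace with an SVM-like soft-margin objective, which this input-space sketch does not capture.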
Related Topics: Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors
Kunzhe Wang, Huaitie Xiao