Article code | Journal code | Publication year | English article | Full text |
---|---|---|---|---|
534711 | 870283 | 2012 | 5-page PDF | Free download |

In this short note, we demonstrate the use of principal component analysis (PCA) as a dimension-reduction tool for the one-class support vector machine (one-class SVM). However, unlike almost all other uses of PCA, which extract the eigenvectors associated with the largest eigenvalues as the projection directions, here it is the eigenvectors associated with the small eigenvalues that are of interest, and in particular the null space, since the null space characterizes the features common to the training samples. Image-retrieval examples illustrate the effectiveness of this dimension reduction.
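A minimal sketch of the idea in Python with NumPy and scikit-learn. The toy data, the number of minor components `k`, and the SVM parameters (`nu`, `gamma`) are all illustrative assumptions, not the paper's settings: training samples share a common low-dimensional structure, so they project near the origin along the minor eigendirections, and a one-class SVM trained in that minor subspace flags points that break the common structure.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy data (assumed): 200 training samples sharing a common 2-D structure
# embedded in 10 dimensions, plus small isotropic noise.
basis = rng.normal(size=(2, 10))
X = rng.normal(size=(200, 2)) @ basis + 0.1 * rng.normal(size=(200, 10))

# PCA via an eigendecomposition of the sample covariance matrix.
mean = X.mean(axis=0)
Xc = X - mean
eigvals, eigvecs = np.linalg.eigh(Xc.T @ Xc / len(X))  # eigenvalues ascending

# Keep the k MINOR components (smallest eigenvalues) rather than the
# usual top ones; k = 8 is an assumed choice matching the toy data.
k = 8
minor = eigvecs[:, :k]

# Train the one-class SVM in the minor subspace, where normal samples
# cluster tightly around the origin.
Z = Xc @ minor
clf = OneClassSVM(nu=0.1, gamma="scale").fit(Z)

# A point displaced along a minor direction violates the common
# structure of the training samples and is flagged as an outlier.
x_out = mean + 5.0 * eigvecs[:, 0]
z_out = (x_out - mean).reshape(1, -1) @ minor
print(clf.predict(z_out))  # → [-1]
```

With `nu=0.1`, at most about 10% of training samples are treated as outliers, so most training points are accepted, while the displaced point is rejected.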
► We propose using PCA for dimension reduction before applying the one-class SVM (OCSVM).
► Projecting onto only the MINOR components can reduce error rates.
► Experiments demonstrate the effectiveness of the dimension reduction.
Journal: Pattern Recognition Letters - Volume 33, Issue 9, 1 July 2012, Pages 1027–1031