Article ID: 4969953
Journal: Pattern Recognition
Published Year: 2016
Pages: 40
File Type: PDF
Abstract
Support vector machines (SVMs) play a dominant role in data classification because of their good generalization performance. However, they suffer from high computational complexity in the classification stage when the number of support vectors (SVs) is large. Efficient classification-stage algorithms are therefore desirable for datasets obtained from real-time pattern recognition systems. To this end, we propose a novel classifier called HMLSVMs (Hierarchical Mixing Linear Support Vector Machines), which has a hierarchical structure with a mixing linear SVM classifier at each node and predicts the label of a sample using only a few hyperplanes. We also derive a generalization error bound for the class of locally linear SVMs (LLSVMs) based on Rademacher theory, which ensures that overfitting can be effectively avoided. Experimental evaluations show that the proposed classifier achieves high efficiency in the classification stage, while its classification performance approaches that of kernel SVMs.
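To illustrate the efficiency argument, here is a minimal sketch of hierarchical classification with one linear hyperplane per node. All names, the tree layout, and the hyperplanes are hypothetical, for illustration only; they are not the paper's actual HMLSVM training or mixing procedure. The point it demonstrates is the cost model: predicting a label takes one dot product per tree level rather than one kernel evaluation per support vector.

```python
import numpy as np

class Node:
    """One node of a hypothetical hierarchical linear-SVM tree.

    Each internal node holds a single hyperplane (w, b); the sign of
    w.x + b routes the sample to one of two children. Leaves store a
    class label. Structure is illustrative, not the paper's method.
    """
    def __init__(self, w=None, b=0.0, left=None, right=None, label=None):
        self.w, self.b = w, b
        self.left, self.right = left, right
        self.label = label  # set only at leaves

def predict(node, x):
    """Route x down the tree: one dot product per level, so prediction
    cost is O(depth) instead of O(#support vectors) for a kernel SVM."""
    while node.label is None:
        node = node.right if np.dot(node.w, x) + node.b >= 0 else node.left
    return node.label

# Tiny hand-built tree separating three axis-aligned regions in 2-D:
# x0 < 0.5 -> class 2; otherwise split again on x1 = 0.5.
inner = Node(w=np.array([0.0, 1.0]), b=-0.5,
             left=Node(label=0), right=Node(label=1))
root = Node(w=np.array([1.0, 0.0]), b=-0.5,
            left=Node(label=2), right=inner)

print(predict(root, np.array([0.9, 0.9])))  # two dot products, label 1
```

A kernel SVM would instead evaluate the kernel against every SV for the same query, which is what the abstract identifies as the classification-stage bottleneck.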
Related Topics
Physical Sciences and Engineering › Computer Science › Computer Vision and Pattern Recognition