Article ID: 404015
Journal: Neural Networks
Published Year: 2014
Pages: 8 Pages
File Type: PDF
Abstract

In this article, we address the problem of compressed classification learning. We establish a generalization bound for the support vector machine (SVM) compressed classification algorithm with uniformly ergodic Markov chain samples. This bound indicates that the accuracy of the SVM classifier in the compressed domain is close to that of the best classifier in the original data domain, which shows, in a sense, that compressed learning can avoid the curse of dimensionality in the learning process. In addition, we show that compressed classification learning reduces the learning time at the price of a decrease in classification accuracy, and that this decrease can be controlled. Numerical experiments further verify the results claimed in this article.
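A minimal sketch of the general idea described in the abstract, not the authors' implementation: data are mapped to a lower-dimensional space with a random projection, and a standard SVM is trained in the compressed domain; its accuracy is then compared with an SVM trained in the original data domain. The dataset, projection dimension, and kernel below are illustrative assumptions (the paper considers uniformly ergodic Markov chain samples, whereas this toy data is i.i.d.).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic high-dimensional data (i.i.d. here; the paper studies
# uniformly ergodic Markov chain samples instead).
X, y = make_classification(n_samples=2000, n_features=500,
                           n_informative=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVM trained in the original (data) domain.
svm_full = SVC(kernel="linear").fit(X_train, y_train)
acc_full = accuracy_score(y_test, svm_full.predict(X_test))

# SVM trained in the compressed domain after a Gaussian random projection.
proj = GaussianRandomProjection(n_components=50, random_state=0)
X_train_c = proj.fit_transform(X_train)
X_test_c = proj.transform(X_test)
svm_comp = SVC(kernel="linear").fit(X_train_c, y_train)
acc_comp = accuracy_score(y_test, svm_comp.predict(X_test_c))

print(f"data-domain accuracy:       {acc_full:.3f}")
print(f"compressed-domain accuracy: {acc_comp:.3f}")
```

On typical runs of a sketch like this, the compressed-domain SVM trains faster on the 50-dimensional projected data while losing only a modest amount of accuracy relative to the 500-dimensional original, which is the trade-off the article quantifies.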

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors