Article ID: 6938743
Journal: Pattern Recognition
Published Year: 2018
Pages: 13 Pages
File Type: PDF
Abstract
Cost-sensitive learning arises in many real-world applications and is an important learning paradigm in machine learning. The recently proposed cost-sensitive hinge loss support vector machine (CSHL-SVM) is consistent with the cost-sensitive Bayes risk and achieves better generalization accuracy than traditional cost-sensitive support vector machines. In practice, data often arrive as sequential chunks, a setting known as the on-line scenario. Conventional batch learning algorithms waste considerable time in this scenario because they must re-train the model from scratch whenever new data arrive. To make CSHL-SVM practical for the on-line scenario, we propose a chunk incremental learning algorithm for CSHL-SVM that updates a trained model without re-training from scratch when a chunk of new samples is incorporated. Our method is efficient because it can update the trained model with multiple samples at a time, not just one sample at a time. Experimental results on a variety of datasets confirm the effectiveness of CSHL-SVM and show that our method is faster than both the batch algorithm for CSHL-SVM and the single-sample incremental learning method for CSHL-SVM.
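To make the on-line scenario concrete, the sketch below contrasts chunk-wise incremental updates with batch re-training from scratch. It is not the paper's CSHL-SVM algorithm: as a stand-in it uses scikit-learn's SGDClassifier with a hinge loss and explicit class weights to mimic asymmetric misclassification costs; the chunk size, cost values, and toy dataset are all illustrative assumptions.

```python
# Sketch only: cost-weighted hinge loss via SGD, NOT the paper's CSHL-SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Imbalanced toy problem (assumed for illustration).
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
classes = np.unique(y)

# Asymmetric costs: errors on the rare class are 5x more expensive (assumed).
costs = {0: 1.0, 1: 5.0}

# Stream the data as 10 sequential chunks (the "on-line scenario").
chunks = np.array_split(np.arange(len(X)), 10)

# Incremental learner: update the existing model with each arriving chunk.
inc = SGDClassifier(loss="hinge", class_weight=costs, random_state=0)
for i, idx in enumerate(chunks):
    inc.partial_fit(X[idx], y[idx], classes=classes if i == 0 else None)

# Batch baseline: re-train from scratch every time a chunk arrives, so the
# training cost grows with the accumulated data.
seen = np.empty(0, dtype=int)
for idx in chunks:
    seen = np.concatenate([seen, idx])
    batch = SGDClassifier(loss="hinge", class_weight=costs, random_state=0)
    batch.fit(X[seen], y[seen])

print("incremental accuracy:", inc.score(X, y))
print("batch accuracy:      ", batch.score(X, y))
```

The point of the comparison is the work per update: the incremental learner touches only the new chunk, while the batch baseline revisits all previously seen samples on every arrival, which is the inefficiency the abstract attributes to conventional batch learning.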
Related Topics
Physical Sciences and Engineering Computer Science Computer Vision and Pattern Recognition
Authors