Article ID Journal Published Year Pages File Type
383077 Expert Systems with Applications 2014 7 Pages PDF
Abstract

•Our semi-supervised algorithm combines the benefits of both co-training and active learning.•The most reliable instances are selected according to high confidence and nearest-neighbor agreement.•We define contribution degree as the selection criterion for informative instances.•Our algorithm achieves more significant improvement for the same amount of human effort.•Compared with standard co-training, our algorithm works well on small labeled training sets.

Co-training is a popular paradigm of semi-supervised learning, which requires the data set to be described by two views of features. Many co-training algorithms share a notable characteristic: the selected unlabeled instances should be predicted with high confidence, since a high confidence score usually implies that the corresponding prediction is correct. Unfortunately, these high-confidence unlabeled instances are not always able to improve classification performance. In this paper, a new semi-supervised learning algorithm is proposed that combines the benefits of both co-training and active learning. The algorithm applies co-training to select the most reliable instances according to the two criteria of high confidence and nearest neighbor in order to boost the classifier, and also exploits the most informative instances, annotated by a human, to further improve classification performance. Experiments on several UCI data sets and a natural language processing task demonstrate that our method achieves more significant improvement for the same amount of human effort.
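The abstract's reliable-instance criterion (high confidence combined with nearest neighbor) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `select_reliable`, the confidence threshold, and the use of Euclidean distance for the nearest labeled neighbor are all assumptions for the sake of the example.

```python
import numpy as np

def select_reliable(probs, unlabeled_X, labeled_X, labeled_y,
                    conf_threshold=0.9):
    """Hypothetical sketch of the paper's two selection criteria:
    keep an unlabeled instance only if (1) its predicted label has
    high confidence and (2) the prediction agrees with the label of
    its nearest labeled neighbor (Euclidean distance assumed)."""
    preds = probs.argmax(axis=1)   # predicted class per instance
    conf = probs.max(axis=1)       # confidence of that prediction
    selected = []
    for i, x in enumerate(unlabeled_X):
        if conf[i] < conf_threshold:
            continue  # criterion 1: high confidence only
        # criterion 2: nearest labeled neighbor must agree
        nn = np.linalg.norm(labeled_X - x, axis=1).argmin()
        if labeled_y[nn] == preds[i]:
            selected.append((i, int(preds[i])))
    return selected
```

In a full co-training loop, each view's classifier would score the unlabeled pool, the instances passing both criteria would be added to the other view's training set, and the remaining low-confidence but informative instances (by the paper's contribution degree) would be sent for human annotation.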

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence