Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
534544 | 870265 | 2014 | 15-page PDF | Free download |
• We propose a boosting algorithm for multiclass semi-supervised learning, Multi-SemiAdaBoost.
• It minimizes the margin cost on labeled data and the inconsistency over labeled and unlabeled data (a generic objective of this shape is sketched after this list).
• Multi-SemiAdaBoost uses an exponential multiclass loss function for semi-supervised learning.
• Multi-SemiAdaBoost can boost any kind of base classifier.
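The exact loss used in the paper is not reproduced on this page. As a hedged illustration only, an objective of the general shape the highlights describe (an exponential multiclass margin cost on labeled data plus two consistency regularizers) can be written as below; the notation is not taken from the paper: $L$ and $U$ denote the labeled and unlabeled index sets, $F$ the ensemble classifier, $m(\cdot,\cdot)$ a multiclass margin, $d(\cdot,\cdot)$ a disagreement penalty, $S_{ij}$ a pairwise similarity, and $\lambda_1, \lambda_2$ trade-off weights.

```latex
% Illustrative shape of a semi-supervised multiclass boosting objective;
% NOT the exact Multi-SemiAdaBoost loss from the paper.
\min_{F}\;
\underbrace{\sum_{i \in L} \exp\!\big(-m(y_i, F(x_i))\big)}_{\text{multiclass margin cost on labeled data}}
\;+\;
\lambda_1 \underbrace{\sum_{i \in L}\sum_{j \in U} S_{ij}\, d\big(y_i, F(x_j)\big)}_{\text{labeled--unlabeled inconsistency}}
\;+\;
\lambda_2 \underbrace{\sum_{i,j \in U} S_{ij}\, d\big(F(x_i), F(x_j)\big)}_{\text{unlabeled--unlabeled inconsistency}}
```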
We present an algorithm for multiclass semi-supervised learning, that is, learning from a limited amount of labeled data together with a large amount of unlabeled data. Existing semi-supervised learning algorithms rely on reductions such as one-versus-all to convert the multiclass problem into several binary classification problems, which is suboptimal. We propose a multiclass semi-supervised boosting algorithm that solves the multiclass classification problem directly. The algorithm is based on a novel multiclass loss function consisting of the margin cost on labeled data and two regularization terms defined over labeled and unlabeled data. Experimental results on a number of benchmark and real-world datasets show that the proposed algorithm outperforms state-of-the-art boosting algorithms for multiclass semi-supervised learning, such as SemiBoost (Mallapragada et al., 2009) and RegBoost (Chen and Wang, 2011).
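To make the "boost any kind of base classifier" claim concrete, the sketch below shows a generic multiclass semi-supervised boosting loop in Python. It is not the paper's Multi-SemiAdaBoost: it combines SAMME-style multiclass reweighting on labeled data with confidence-thresholded pseudo-labeling of unlabeled data, and the function names, the confidence threshold, and the use of scikit-learn's `DecisionTreeClassifier` as the default base learner are all illustrative assumptions.

```python
# Illustrative sketch only: a generic multiclass semi-supervised boosting loop,
# NOT the Multi-SemiAdaBoost algorithm from the paper. Labels are assumed to be
# integers 0..K-1. Any scikit-learn-style base classifier can be plugged in.
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier


def semi_supervised_boost(X_l, y_l, X_u, base=None, rounds=10, conf_thresh=0.9):
    """Return a list of (alpha, fitted_classifier) pairs."""
    base = base if base is not None else DecisionTreeClassifier(max_depth=3)
    n_classes = len(np.unique(y_l))
    X, y = np.asarray(X_l), np.asarray(y_l)
    w = np.full(len(y), 1.0 / len(y))            # example weights
    ensemble = []
    for _ in range(rounds):
        clf = clone(base).fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.clip(np.average(pred != y, weights=w), 1e-10, 1 - 1e-10)
        alpha = np.log((1 - err) / err) + np.log(n_classes - 1)  # SAMME weight
        ensemble.append((alpha, clf))
        # Emphasize examples the current weak learner gets wrong.
        w *= np.exp(alpha * (pred != y))
        w /= w.sum()
        # Move confidently predicted unlabeled points into the training set.
        if len(X_u) > 0:
            proba = clf.predict_proba(X_u)
            keep = proba.max(axis=1) >= conf_thresh
            if keep.any():
                pseudo = clf.classes_[proba[keep].argmax(axis=1)]
                X = np.vstack([X, X_u[keep]])
                y = np.concatenate([y, pseudo])
                w = np.concatenate([w, np.full(keep.sum(), w.mean())])
                w /= w.sum()
                X_u = X_u[~keep]
    return ensemble


def predict(ensemble, X, n_classes):
    """Weighted vote over the boosted ensemble."""
    votes = np.zeros((len(X), n_classes))
    for alpha, clf in ensemble:
        votes[np.arange(len(X)), clf.predict(X)] += alpha
    return votes.argmax(axis=1)
```

Under these assumptions, `predict(semi_supervised_boost(X_l, y_l, X_u), X_test, n_classes)` yields class predictions from the weighted vote of the boosted base classifiers.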
Journal: Pattern Recognition Letters - Volume 37, 1 February 2014, Pages 63–77