Article Code | Journal Code | Publication Year | English Article | Full-Text Version |
---|---|---|---|---|
526878 | 869252 | 2014 | 8-page PDF | Free download |

• We propose a novel boosting algorithm that takes advantage of Universum data.
• A greedy, stagewise, functional gradient procedure is used to derive the method.
• Explicit weighting schemes for labeled and Universum samples are provided.
• Practical conditions for verifying the effectiveness of Universum learning are described.
• The algorithm outperforms standard AdaBoost when suitable Universum data are used.
Recently, Universum data, which does not belong to any class of the training data, has been used to train better classifiers. In this paper, we propose a novel boosting algorithm, called UUAdaBoost, that improves the classification performance of AdaBoost with Universum data. UUAdaBoost chooses a function by minimizing a loss over both the labeled data and the Universum data. The cost function is minimized by a greedy, stagewise, functional gradient procedure, so each training stage of UUAdaBoost is fast and efficient. Whereas standard AdaBoost weights only the labeled samples during training iterations, UUAdaBoost also provides an explicit weighting scheme for the Universum samples. In addition, this paper describes practical conditions for the effectiveness of Universum learning, based on an analysis of the distribution of ensemble predictions over the training samples. Experiments on handwritten digit classification and gender classification problems show that the proposed method achieves superior performance over standard AdaBoost when proper Universum data are selected.
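Since the abstract only outlines UUAdaBoost at a high level, the sketch below illustrates one way Universum samples could be folded into a stagewise, AdaBoost-style boosting loop. The particular Universum loss, the weighting rule for Universum points, and the trade-off parameter `c_u` are assumptions chosen for illustration; they are not the paper's exact update rules.

```python
# Illustrative sketch only: an AdaBoost-style loop augmented with Universum
# samples. The Universum weighting and target assignment below are assumed,
# not taken from the UUAdaBoost paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def universum_boost(X, y, X_univ, n_stages=50, c_u=0.5):
    """Greedy stagewise boosting: labeled samples follow the exponential
    loss, while Universum samples are pulled toward zero ensemble output."""
    F_lab = np.zeros(len(X))        # ensemble scores on labeled data
    F_uni = np.zeros(len(X_univ))   # ensemble scores on Universum data
    learners, alphas = [], []
    X_all = np.vstack([X, X_univ])

    for _ in range(n_stages):
        # Labeled weights: standard exponential-loss weighting.
        w_lab = np.exp(-y * F_lab)
        # Universum weights (assumption): grow with distance from zero score.
        w_uni = c_u * np.abs(np.tanh(F_uni))
        # Universum targets oppose the current score, pushing it back to 0.
        y_uni = -np.sign(F_uni)
        y_uni[y_uni == 0] = 1
        w_all = np.concatenate([w_lab, w_uni])
        y_all = np.concatenate([y, y_uni])
        w_all /= w_all.sum()

        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X_all, y_all, sample_weight=w_all)
        pred = stump.predict(X_all)

        # Weighted error and the classical AdaBoost step size.
        err = np.clip(np.sum(w_all * (pred != y_all)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        F_lab += alpha * pred[:len(X)]
        F_uni += alpha * pred[len(X):]
        learners.append(stump)
        alphas.append(alpha)

    return learners, alphas

def predict(learners, alphas, X):
    """Sign of the weighted vote of all weak learners."""
    score = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
    return np.sign(score)
```

The design choice of driving Universum scores toward zero reflects the common intuition that Universum points belong to neither class, so the ensemble should remain maximally uncertain on them; how UUAdaBoost itself encodes this is specified in the full paper.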
Journal: Image and Vision Computing - Volume 32, Issue 8, August 2014, Pages 550–557