Journal of the Korean Statistical Society, 2010, 9 pages
Abstract

In this paper, we propose a sparse semi-supervised learning method that combines the large margin approach with an L1 constraint. The main contribution of the paper is an efficient optimization algorithm. The objective function minimized in large margin semi-supervised learning is non-convex and non-differentiable, and hence requires special optimization techniques. For this purpose, we develop an algorithm that is a hybrid of the concave-convex procedure (CCCP) and the gradient LASSO algorithm. The advantage of the proposed method over existing semi-supervised learning methods is that it can identify a small number of relevant input variables while keeping prediction accuracy high. Moreover, the proposed algorithm is simple enough to be applied to various real problems without being much hampered by computational limitations. To confirm these advantages, we compare the proposed method with a standard semi-supervised method on simulated as well as real data sets.
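The CCCP/gradient-LASSO hybrid described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes a squared hinge loss on labeled data, a ramp-style loss max(0, 1 - |t|) on unlabeled scores decomposed as the convex max(1, |t|) minus the convex |t|, an outer CCCP loop that linearizes the concave part, and Frank-Wolfe-style gradient LASSO steps on the L1 ball for the inner convex subproblem. The toy data, loss choices, and all parameter values (`s`, `outer`, `inner`) are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: only the first 2 of 10 inputs are relevant.
n_lab, n_unl, p = 20, 80, 10
Xl = rng.normal(size=(n_lab, p))
yl = np.sign(Xl[:, 0] + Xl[:, 1])
Xu = rng.normal(size=(n_unl, p))

def convex_grad(w, lin):
    # Gradient of the convex part: squared hinge on labeled data,
    # plus max(1, |score|) on unlabeled scores (from the ramp
    # decomposition), plus the CCCP linearization term `lin`.
    m = yl * (Xl @ w)
    g = (-2 * (1 - m) * (m < 1) * yl) @ Xl / n_lab
    t = Xu @ w
    g += (np.sign(t) * (np.abs(t) > 1)) @ Xu / n_unl
    return g + lin

def fit_sparse_s3vm(s=2.0, outer=10, inner=200):
    # s is the L1-ball radius; iterates stay inside it by construction.
    w = np.zeros(p)
    kk = 0
    for _ in range(outer):                       # CCCP outer loop
        t = Xu @ w
        lin = -np.sign(t) @ Xu / n_unl           # gradient of concave part -|t|
        for _ in range(inner):                   # gradient LASSO inner loop
            kk += 1
            g = convex_grad(w, lin)
            j = np.argmax(np.abs(g))             # steepest coordinate
            v = np.zeros(p)
            v[j] = -s * np.sign(g[j])            # best vertex of the L1 ball
            lr = 2.0 / (kk + 2)                  # decaying Frank-Wolfe step
            w = (1 - lr) * w + lr * v            # convex combination: ||w||_1 <= s
    return w

w = fit_sparse_s3vm()
```

Because every update is a convex combination of the current iterate and a vertex of the L1 ball, the sparsity-inducing constraint ||w||_1 <= s holds at every step, which is the mechanism by which the method keeps only a small number of relevant inputs.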
