Article Code | Journal Code | Publication Year | English Article | Full-Text Version
---|---|---|---|---
1144954 | 957442 | 2010 | 9-page PDF | Free download

In this paper, we propose a sparse semi-supervised learning method that combines the large margin approach with an L1 constraint. The main contribution of the paper is the development of an efficient optimization algorithm. The objective function to be minimized in large margin semi-supervised learning is non-convex and non-differentiable, so special optimization algorithms are required. For this purpose, we develop an optimization algorithm that is a hybrid of the concave-convex procedure (CCCP) and the gradient LASSO algorithm. The advantage of the proposed method over existing semi-supervised learning methods is that it can identify a small number of relevant input variables while keeping prediction accuracy high. Moreover, the proposed algorithm is simple enough to be applied to various real problems without being much hampered by computational limitations. To confirm these advantages, we compare the proposed method with a standard semi-supervised method on simulated as well as real data sets.
Journal: Journal of the Korean Statistical Society - Volume 39, Issue 4, December 2010, Pages 479–487
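To make the optimization strategy concrete, the sketch below illustrates the general idea under simplified assumptions: an outer CCCP loop that linearizes the concave part of the unlabeled ramp loss (yielding pseudo-labels from the current fit), and an inner Frank-Wolfe-style update over the L1 ball as a simple stand-in for the gradient LASSO step. Function names, step sizes, and defaults (`cccp_sparse_s3vm`, `s`, `lam`, `n_cccp`, `n_inner`) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np


def cccp_sparse_s3vm(X_lab, y_lab, X_unl, s=5.0, lam=0.5,
                     n_cccp=10, n_inner=200):
    """Sketch of a CCCP + L1-constrained solver for a large-margin
    semi-supervised objective.  Names and defaults are hypothetical;
    the inner step is a Frank-Wolfe-type stand-in for gradient LASSO."""
    n, p = X_lab.shape
    w = np.zeros(p)

    for _ in range(n_cccp):
        # CCCP step: linearize the concave part of the ramp loss on the
        # unlabeled points at the current w, which amounts to fixing
        # pseudo-labels equal to the signs of the current predictions.
        y_eff = np.sign(X_unl @ w)
        y_eff[y_eff == 0] = 1.0

        # Inner convex problem: weighted hinge loss on labeled and
        # pseudo-labeled points, minimized over the L1 ball {||w||_1 <= s}.
        X_all = np.vstack([X_lab, X_unl])
        y_all = np.concatenate([y_lab, y_eff])
        c_all = np.concatenate([np.ones(n), lam * np.ones(len(X_unl))])

        for t in range(n_inner):
            viol = (y_all * (X_all @ w)) < 1.0                   # margin violations
            grad = -(c_all[viol] * y_all[viol]) @ X_all[viol]    # hinge subgradient
            j = np.argmax(np.abs(grad))                          # steepest coordinate
            vertex = np.zeros(p)
            vertex[j] = -s * np.sign(grad[j])                    # best L1-ball vertex
            step = 2.0 / (t + 2.0)                               # standard FW step size
            w = (1.0 - step) * w + step * vertex                 # stay inside the ball
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = 20                                        # only the first 2 inputs are relevant
    w_true = np.zeros(p)
    w_true[:2] = [2.0, -1.5]
    X = rng.normal(size=(200, p))
    y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
    X_lab, y_lab, X_unl = X[:20], y[:20], X[20:]  # 20 labeled, 180 unlabeled points
    w_hat = cccp_sparse_s3vm(X_lab, y_lab, X_unl)
    print("largest |w| coordinates:", np.argsort(-np.abs(w_hat))[:5])
```

The L1-ball constraint keeps every iterate a convex combination of one-coordinate vertices, which is what drives variable selection, while the pseudo-labeled unlabeled points supply the large-margin semi-supervised signal that CCCP makes tractable.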