Article ID | Journal ID | Publication year | English article | Full-text version |
---|---|---|---|---|
534557 | 870265 | 2014 | 9-page PDF | Free download |
In this paper, we propose a Laplacian minimax probability machine, a semi-supervised version of the minimax probability machine based on the manifold regularization framework. We also show that the proposed method can be kernelized for non-linear cases on the basis of a theorem similar to the representer theorem. Experiments on publicly available datasets from the UCI machine learning repository confirm that the proposed methods achieve results competitive with existing graph-based learning methods such as the Laplacian support vector machine and the Laplacian regularized least squares.
► Minimax probability machine (MPM) can be extended to its semi-supervised version.
► Block coordinate descent is used to solve the proposed optimization problem.
► The proposed methods achieve results competitive with existing semi-supervised methods.
► The proposed methods perform better in the linear case than in the non-linear case.
Journal: Pattern Recognition Letters - Volume 37, 1 February 2014, Pages 192–200
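To make the manifold regularization idea concrete, the sketch below implements linear Laplacian regularized least squares (one of the baselines named in the abstract, not the authors' MPM formulation): a graph Laplacian built from all points, labeled and unlabeled, is added as a smoothness penalty to a ridge-regularized squared loss. The function names, the k-NN graph construction, and the hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def knn_laplacian(X, k=5):
    # Build a symmetric k-nearest-neighbor graph over all samples and
    # return its unnormalized graph Laplacian L = D - W (an assumption;
    # the paper's exact graph construction may differ).
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # skip the point itself
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)  # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W

def laplacian_rls(X, y, labeled, gamma_A=1e-2, gamma_I=1e-1, k=5):
    # Minimize ||X_l w - y_l||^2 + gamma_A ||w||^2 + gamma_I w^T X^T L X w,
    # where the manifold term uses ALL samples (the semi-supervised part).
    # The closed-form solution is a regularized normal equation.
    L = knn_laplacian(X, k)
    Xl, yl = X[labeled], y[labeled]
    d = X.shape[1]
    A = Xl.T @ Xl + gamma_A * np.eye(d) + gamma_I * X.T @ L @ X
    return np.linalg.solve(A, Xl.T @ yl)
```

With two well-separated clusters and a single labeled point per class, the manifold term propagates the labels along the graph, so the resulting linear classifier separates both clusters.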