Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
534557 | Pattern Recognition Letters | 2014 | 9 Pages | |
In this paper, we propose a Laplacian minimax probability machine, a semi-supervised extension of the minimax probability machine based on the manifold regularization framework. We also show that the proposed method can be kernelized for the non-linear case via a theorem similar to the representer theorem. Experiments on publicly available datasets from the UCI machine learning repository confirm that the proposed methods achieve competitive results compared to existing graph-based learning methods such as the Laplacian support vector machine and Laplacian regularized least squares.
► The minimax probability machine (MPM) can be extended to a semi-supervised version. ► Block coordinate descent is used to solve the proposed optimization problem. ► The proposed methods achieve results comparable to existing semi-supervised methods. ► The proposed methods perform better in the linear case than in the non-linear case.
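The abstract compares against graph-based baselines such as Laplacian regularized least squares, which share the same manifold regularization idea: build a graph over labeled and unlabeled points, then penalize predictions that vary sharply along graph edges. The sketch below is a minimal linear Laplacian RLS baseline, not the paper's Laplacian MPM formulation; the k-NN graph construction, the synthetic two-cluster data, and the hyperparameters `gamma_A`, `gamma_I` are illustrative assumptions.

```python
import numpy as np

def knn_graph_laplacian(X, k=5):
    """Unnormalized graph Laplacian L = D - W of a symmetrized k-NN graph."""
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0  # k nearest, skipping self
    W = np.maximum(W, W.T)                      # symmetrize adjacency
    return np.diag(W.sum(axis=1)) - W

def lap_rls_fit(X, y_labeled, gamma_A=1e-2, gamma_I=1e-1, k=5):
    """Linear Laplacian RLS: minimize ||X_l w - y||^2 + gamma_A ||w||^2
    + gamma_I w^T X^T L X w, where L is built over ALL (labeled + unlabeled)
    points. The first len(y_labeled) rows of X are the labeled points."""
    n_l = len(y_labeled)
    L = knn_graph_laplacian(X, k)
    Xl = X[:n_l]
    A = Xl.T @ Xl + gamma_A * np.eye(X.shape[1]) + gamma_I * X.T @ L @ X
    return np.linalg.solve(A, Xl.T @ y_labeled)

# Illustrative data: two Gaussian clusters, one labeled point per class.
rng = np.random.default_rng(0)
Xa = rng.normal([-2.0, -2.0], 0.3, size=(20, 2))
Xb = rng.normal([2.0, 2.0], 0.3, size=(20, 2))
X = np.vstack([Xa[:1], Xb[:1], Xa[1:], Xb[1:]])  # labeled points come first
y_true = np.array([-1, 1] + [-1] * 19 + [1] * 19)
w = lap_rls_fit(X, y_true[:2].astype(float))     # only 2 labels are used
acc = np.mean(np.sign(X @ w) == y_true)          # accuracy over all points
```

The unlabeled points enter only through the Laplacian term, which is what lets the decision boundary follow the data manifold even with very few labels; the paper's method plugs the same `w^T X^T L X w` style penalty into the MPM objective instead of a least-squares loss.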