Article code | Journal code | Publication year | English paper | Full-text version |
---|---|---|---|---|
531819 | 869876 | 2016 | 13-page PDF | Free download |
• We impose column-orthogonality constraints on the mapping operator and propose the orthogonal optimal reverse prediction (OORP) algorithm, which outperforms the original algorithm (a sketch of the constrained objective follows this list).
• We apply the same orthogonality constraints to the kernelized optimal reverse prediction algorithm and propose kernelized orthogonal optimal reverse prediction.
• An optimization algorithm is designed to solve the resulting problem, and the model's classification performance is encouraging.
• We also extend the idea to Laplacian orthogonal optimal reverse prediction and obtain a further improvement in classification performance.
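The precise objective is defined in the paper; as a rough, hedged illustration of the general form such a constrained problem takes (the symbols X_L, Y_L, X_U, Z and U are our own notation, not necessarily the paper's), reverse prediction reconstructs the inputs from the labels, and the orthogonality constraint restricts the mapping operator to have orthonormal columns:

```latex
% Sketch of a reverse-prediction objective with a column-orthonormal mapping U
% (assumed notation): X_L labeled inputs, Y_L their labels, X_U unlabeled inputs,
% Z the unknown label assignments to be inferred, U in R^{d x k} the mapping operator.
\min_{U,\;Z}\; \lVert X_L - Y_L U^{\top} \rVert_F^2 \;+\; \lVert X_U - Z U^{\top} \rVert_F^2
\qquad \text{s.t.}\quad U^{\top} U = I_k
```

Under this sketch, the constraint U^T U = I_k makes the quadratic term in U constant, so the U-update reduces to an orthogonal Procrustes problem, which is consistent with the claim that the extra computation over ORP is limited.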
Optimal reverse prediction (ORP) has recently been proposed as a semi-supervised framework that unifies supervised and unsupervised training methods such as supervised least squares, principal component analysis (PCA), k-means clustering and normalized graph-cut. ORP can handle classification tasks in which the labeled data are insufficient, but the classification performance of ORP and its kernelized version remains unsatisfactory. To improve it, and motivated by the recently proposed orthogonal k-means clustering, in this paper we propose orthogonal optimal reverse prediction (OORP), together with its kernelized and Laplacian-regularized extensions. With only limited additional computation, our algorithms greatly enhance classification performance compared to the original ORP. Extensive experiments on synthetic and benchmark data collections consistently demonstrate the effectiveness and efficiency of OORP in comparison with several competing approaches.
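For concreteness, below is a minimal, hypothetical Python sketch of an alternating scheme for the objective sketched above: the mapping operator is updated by an orthogonal Procrustes step (an SVD), and the unlabeled assignments by a k-means-style nearest-column step. The function name `oorp_sketch` and all update rules here are illustrative assumptions, not the authors' exact algorithm.

```python
# Hypothetical sketch of alternating minimization for a reverse-prediction
# objective with a column-orthonormal mapping U (U^T U = I). This follows the
# generic orthogonal-Procrustes / orthogonal-k-means recipe, not necessarily
# the exact update rules of the OORP paper.
import numpy as np

def oorp_sketch(X_l, Y_l, X_u, k, n_iters=50, rng=None):
    """X_l: labeled inputs (n_l, d); Y_l: one-hot labels (n_l, k);
    X_u: unlabeled inputs (n_u, d). Returns U (d, k) and hard labels for X_u."""
    rng = np.random.default_rng(rng)
    n_u, _ = X_u.shape
    # Random initial one-hot assignments for the unlabeled points.
    Z = np.eye(k)[rng.integers(0, k, size=n_u)]
    for _ in range(n_iters):
        # U-step: min_U ||X_l - Y_l U^T||^2 + ||X_u - Z U^T||^2 s.t. U^T U = I,
        # an orthogonal Procrustes problem solved via an SVD.
        M = X_l.T @ Y_l + X_u.T @ Z                 # (d, k)
        P, _, Qt = np.linalg.svd(M, full_matrices=False)
        U = P @ Qt                                  # column-orthonormal (d, k)
        # Z-step: assign each unlabeled point to its nearest column of U.
        dists = ((X_u[:, None, :] - U.T[None, :, :]) ** 2).sum(axis=2)  # (n_u, k)
        Z_new = np.eye(k)[dists.argmin(axis=1)]
        if np.array_equal(Z_new, Z):
            break
        Z = Z_new
    return U, Z.argmax(axis=1)
```

A call such as `U, labels = oorp_sketch(X_l, Y_l, X_u, k=3)` on arrays of matching shapes would return the orthonormal mapping and hard class assignments for the unlabeled points; the kernelized and Laplacian-regularized variants described in the paper would modify the two update steps accordingly.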
Journal: Pattern Recognition - Volume 60, December 2016, Pages 908–920