Article ID: 6861282
Journal: Knowledge-Based Systems
Published Year: 2018
Pages: 15
File Type: PDF
Abstract
Multi-view learning (MVL) focuses on the problem of learning from data represented by multiple distinct feature sets. Various successful SVM-based multi-view learning models have been proposed to improve the performance of existing learning tasks. Since the nonparallel support vector machine (NPSVM) offers several distinct advantages over state-of-the-art classifiers, it is potentially beneficial to perform multi-view classification with NPSVM. In this paper, we build a new multi-view learning model based on the nonparallel support vector machine, termed MVNPSVM. By combining the large-margin mechanism with the consensus principle, MVNPSVM not only inherits the advantages of both NPSVM and multi-view learning, but also offers a new perspective on extending NPSVM to the multi-view learning field. To solve MVNPSVM efficiently, we adopt the alternating direction method of multipliers (ADMM) as the solver. We theoretically analyze the performance of MVNPSVM from the viewpoints of consensus analysis and comparisons with two similar methods, SVM-2K and multi-view twin support vector machines. Experimental results on 95 binary data sets confirm the effectiveness of the proposed method.
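To make the "consensus principle solved by ADMM" idea concrete, the following is a minimal illustrative sketch only: it fits one linear scorer per view and uses a global-consensus-style ADMM to pull the views' predicted scores toward agreement. A squared loss stands in for NPSVM's loss terms to keep the updates in closed form, and all names and parameters (fit_consensus_views, lam, rho, n_iter) are assumptions for illustration; this is not the paper's MVNPSVM objective or solver.

import numpy as np

def fit_consensus_views(views, y, lam=1.0, rho=1.0, n_iter=100):
    # views : list of (n_samples, d_v) arrays, one feature matrix per view
    # y     : (n_samples,) array of +/-1 labels (fit with squared loss here)
    n = y.shape[0]
    ws = [np.zeros(X.shape[1]) for X in views]   # per-view weight vectors
    us = [np.zeros(n) for _ in views]            # scaled dual variables
    s = np.zeros(n)                              # consensus score vector
    for _ in range(n_iter):
        # w_v-update: closed-form minimizer of
        #   0.5*||X_v w - y||^2 + 0.5*lam*||w||^2 + 0.5*rho*||X_v w - s + u_v||^2
        for v, X in enumerate(views):
            A = (1.0 + rho) * X.T @ X + lam * np.eye(X.shape[1])
            b = X.T @ (y + rho * (s - us[v]))
            ws[v] = np.linalg.solve(A, b)
        # s-update: average of the views' scores plus duals (the consensus step)
        s = np.mean([X @ ws[v] + us[v] for v, X in enumerate(views)], axis=0)
        # dual update: accumulate each view's disagreement with the consensus
        for v, X in enumerate(views):
            us[v] += X @ ws[v] - s
    return ws, s

# Toy usage: two synthetic views of the same 200 samples.
rng = np.random.default_rng(0)
y = np.sign(rng.standard_normal(200))
X1 = y[:, None] * rng.normal(1.0, 1.0, (200, 5))
X2 = y[:, None] * rng.normal(0.5, 1.0, (200, 8))
ws, s = fit_consensus_views([X1, X2], y)
print("train accuracy of consensus scores:", np.mean(np.sign(s) == y))

The design choice here mirrors the consensus principle described in the abstract: each view keeps its own parameters in its own feature space, while the shared score vector s and the dual updates enforce agreement between the views' predictions.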
Related Topics
Physical Sciences and Engineering; Computer Science; Artificial Intelligence