Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6862893 | Neural Networks | 2018 | 16 |
Abstract
Although the twin support vector regression (TSVR) method has been widely studied and various variants have been successfully developed, the structural risk minimization (SRM) principle and the model's sparseness have not been given sufficient consideration. In this paper, a novel nonparallel support vector regression (NPSVR) is proposed in the spirit of the nonparallel support vector machine (NPSVM); it outperforms existing TSVR methods in the following respects: (1) a regularization term is added to each primal problem in strict accordance with the SRM principle, so that the kernel trick can be applied directly to the dual problems in the nonlinear case, without resorting to an extra kernel-generated surface; (2) an ε-insensitive loss function is adopted to retain the inherent sparseness of the standard support vector regression (SVR); (3) the dual problems have the same formulation as that of the standard SVR, so matrix inversion is avoided entirely, and a sequential minimal optimization (SMO)-type solver is specially designed to accelerate training on large-scale datasets; (4) the primal problems degenerate approximately to those of existing TSVRs when the corresponding parameters are chosen appropriately. Numerical experiments on diverse datasets verify the effectiveness of the proposed NPSVR in terms of sparseness, generalization ability, and scalability.
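For readers unfamiliar with the ingredients the abstract names, here is a minimal sketch in standard SVR notation; the symbols $w$, $b$, $C$, and $\varepsilon$ are generic and not taken from the paper, and this is the textbook SVR form the abstract alludes to, not the exact NPSVR primal:

$$
L_\varepsilon\bigl(y, f(x)\bigr) = \max\bigl(0,\ \lvert y - f(x)\rvert - \varepsilon\bigr),
\qquad
\min_{w,\,b}\ \frac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{m} L_\varepsilon\bigl(y_i,\ w^\top x_i + b\bigr).
$$

The $\frac{1}{2}\lVert w\rVert^2$ term is the SRM regularizer whose absence in earlier TSVRs the authors criticize, and samples falling inside the $\varepsilon$-tube incur zero loss and drop out of the dual solution, which is the source of the sparseness claimed in point (2).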
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors
Long Tang, Yingjie Tian, Chunyan Yang