Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
408275 | Neurocomputing | 2011 | 10 | |
Abstract
In this paper, a fast method of selecting features for kernel minimum squared error (KMSE) is proposed to mitigate the computational burden when the number of training patterns is large. Compared with existing feature-selection algorithms for KMSE, the proposed iterative KMSE (IKMSE) improves computational efficiency without sacrificing generalization performance. Experiments on benchmark data sets, a nonlinear autoregressive model, and a real-world problem demonstrate the efficacy and feasibility of the proposed IKMSE. In addition, IKMSE can be easily extended to classification tasks.
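To make the idea concrete, below is a minimal sketch (not the authors' IKMSE algorithm) of KMSE regression with a naive greedy forward selection of basis patterns, illustrating how restricting the kernel expansion to a small subset of training patterns reduces the model size. All names (`rbf_kernel`, `fit_kmse`, `select_features`) and parameter choices are illustrative assumptions.

```python
# Hedged sketch: KMSE via ridge-regularized least squares on kernel columns,
# plus greedy forward selection of basis patterns. Not the paper's IKMSE.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kmse(K_sub, y, lam=1e-3):
    # Least-squares fit of y on the selected kernel columns plus a bias term,
    # with a small ridge term lam for numerical stability.
    Phi = np.hstack([K_sub, np.ones((K_sub.shape[0], 1))])
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
    return w  # last entry is the bias

def select_features(X, y, n_select=20, gamma=1.0, lam=1e-3):
    # Greedy forward selection: at each step add the training pattern whose
    # kernel column most reduces the squared training error of the KMSE model.
    K = rbf_kernel(X, X, gamma)
    selected, remaining = [], list(range(X.shape[0]))
    for _ in range(n_select):
        best_j, best_err = None, np.inf
        for j in remaining:
            cols = selected + [j]
            w = fit_kmse(K[:, cols], y, lam)
            Phi = np.hstack([K[:, cols], np.ones((len(y), 1))])
            err = np.sum((Phi @ w - y) ** 2)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
        remaining.remove(best_j)
    return selected, fit_kmse(K[:, selected], y, lam)

# Toy usage: approximate a noisy sine with a sparse KMSE expansion.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
idx, w = select_features(X, y, n_select=15, gamma=0.5)
print("selected patterns:", idx[:5], "... total", len(idx))
```

Note that this naive version refits the model from scratch for every candidate pattern, which is exactly the cost the abstract says the iterative IKMSE scheme is designed to avoid.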
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Yong-Ping Zhao, Zhong-Hua Du, Zhi-An Zhang, Hai-Bo Zhang