Article ID Journal Published Year Pages File Type
410572 Neurocomputing 2009 10 Pages PDF
Abstract

Input selection is advantageous in regression problems. It may, for example, decrease the training time of models, reduce measurement costs, and help circumvent problems of high dimensionality. Moreover, including useless inputs in the model increases the likelihood of overfitting. Neural networks generalize well in many cases, but their interpretability is usually limited. However, selecting a subset of variables and estimating their relative importance would be valuable in many real-world applications. In the present work, a simultaneous input and basis function selection method for a radial basis function (RBF) network is proposed. The selection is performed by minimizing a constrained optimization problem, in which the sparsity of the network is controlled by two continuous-valued shrinkage parameters. Each input dimension is weighted, and the constraints are imposed on these weights and on the output layer coefficients. Direct and alternating optimization (AO) procedures are presented to solve the problem. The proposed method is applied to simulated and benchmark data. In comparison with existing methods, the resulting RBF networks achieve similar prediction accuracy with fewer inputs and basis functions.
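The abstract's core construction can be illustrated with a minimal sketch, not the authors' exact algorithm: each input dimension receives a nonnegative weight that scales the distances inside Gaussian basis functions, and alternating proximal-gradient steps with L1-style shrinkage sparsify both the output coefficients (pruning basis functions) and the input weights (pruning inputs). All function names, the finite-difference weight update, and the parameter values below are illustrative assumptions.

```python
import numpy as np

def rbf_design(X, centers, input_weights, sigma=1.0):
    """Gaussian basis activations with per-dimension input weights.

    Dimension j of every distance is scaled by input_weights[j]; a zero
    weight removes that input from all basis functions.
    """
    diff2 = (X[:, None, :] - centers[None, :, :]) ** 2      # (n, m, d)
    d2 = (diff2 * input_weights ** 2).sum(axis=2)           # weighted sq. distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def soft_threshold(v, t):
    """Proximal operator of the L1 norm (induces sparsity)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fit_ao(X, y, centers, lam_alpha=0.01, lam_w=0.01, n_iter=30, lr=0.01):
    """Alternating-optimization sketch: an ISTA step on the output
    coefficients alpha, then a projected proximal-gradient step on the
    nonnegative input weights w. lam_alpha and lam_w play the role of the
    two continuous-valued shrinkage parameters."""
    n, d = X.shape
    w = np.ones(d)
    alpha = np.zeros(len(centers))
    for _ in range(n_iter):
        Phi = rbf_design(X, centers, w)
        # --- alpha step: ISTA on 0.5*||Phi a - y||^2 + lam_alpha*||a||_1
        L = np.linalg.norm(Phi, 2) ** 2 + 1e-8              # Lipschitz estimate
        grad_a = Phi.T @ (Phi @ alpha - y)
        alpha = soft_threshold(alpha - grad_a / L, lam_alpha / L)
        # --- w step: finite-difference gradient, shrink, project onto w >= 0
        def loss(wv):
            r = rbf_design(X, centers, wv) @ alpha - y
            return 0.5 * r @ r
        eps, base = 1e-5, loss(w)
        grad_w = np.zeros(d)
        for j in range(d):
            wp = w.copy()
            wp[j] += eps
            grad_w[j] = (loss(wp) - base) / eps
        w = np.maximum(soft_threshold(w - lr * grad_w, lr * lam_w), 0.0)
    return w, alpha
```

Inputs and basis functions whose weights or coefficients are driven exactly to zero by the soft thresholding can be dropped from the final network, which is how the two shrinkage parameters trade prediction accuracy against sparsity.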

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors