Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
413082 | Neurocomputing | 2006 | 27 |
A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability of orthogonal least squares model selection to produce a very sparse model with good generalization performance is greatly enhanced. Furthermore, with the assistance of local regularization, the criterion for terminating the model selection procedure becomes much clearer. A comparison with a state-of-the-art method for constructing sparse regression models, the relevance vector machine, is given. The proposed LROLS algorithm is shown to possess considerable computational advantages, including a well-conditioned solution and faster convergence.
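The sketch below illustrates the two ideas the abstract describes: regularized orthogonal forward selection, where each candidate orthogonal weight carries its own regularization parameter, and an evidence-style re-estimation of those per-weight parameters. It assumes a candidate regressor matrix `P` (N rows, M candidate columns) and target vector `y`; the function names, tolerances, and the exact re-estimation formula are illustrative assumptions, not taken from the paper itself.

```python
# Minimal sketch of locally regularized orthogonal least squares (LROLS).
# Assumptions: P is an (N x M) candidate regressor matrix, y an (N,) target.
# Names (`lrols`, `forward_select`) and the evidence-style lambda update are
# hypothetical illustrations of the idea, not the paper's exact procedure.
import numpy as np

def forward_select(P, y, lambdas, n_terms):
    """One pass of OLS forward selection with a per-weight regularizer."""
    N, M = P.shape
    R = P.copy()                     # candidate columns, orthogonalized in place
    yTy = float(y @ y)
    selected, W, g = [], [], []
    for _ in range(n_terms):
        best_k, best_err, best_g = None, -np.inf, 0.0
        for k in set(range(M)) - set(selected):
            w = R[:, k]
            wTw = float(w @ w)
            if wTw < 1e-12:          # column already (numerically) exhausted
                continue
            gk = float(w @ y) / (wTw + lambdas[k])
            # regularized error-reduction ratio of candidate k
            rerr = (wTw + lambdas[k]) * gk * gk / yTy
            if rerr > best_err:
                best_k, best_err, best_g = k, rerr, gk
        if best_k is None:
            break
        w = R[:, best_k].copy()
        selected.append(best_k); W.append(w); g.append(best_g)
        # modified Gram-Schmidt: deflate remaining candidate columns
        for j in set(range(M)) - set(selected):
            R[:, j] -= (w @ R[:, j]) / (w @ w) * w
    return selected, np.array(g), W

def lrols(P, y, n_terms, n_evidence_iters=5):
    """Alternate forward selection with evidence updates of each lambda."""
    N, M = P.shape
    lambdas = np.full(M, 1e-6)       # one regularizer per orthogonal weight
    for _ in range(n_evidence_iters):
        sel, g, W = forward_select(P, y, lambdas, n_terms)
        e = y - sum(gk * wk for gk, wk in zip(g, W))   # model residual
        eTe = float(e @ e)
        gammas = np.array([float(w @ w) / (lambdas[k] + float(w @ w))
                           for k, w in zip(sel, W)])
        denom = max(N - gammas.sum(), 1e-12)
        for k, gk, gamma in zip(sel, g, gammas):
            lambdas[k] = gamma * eTe / (denom * gk * gk + 1e-12)
    return sel, g, W, lambdas
```

As the abstract notes, the per-weight regularizers are what enforce sparsity: regularizers attached to redundant regressors grow under the re-estimation, driving the corresponding weights toward zero, so insignificant candidates stop being selected and the point at which to terminate the forward procedure becomes evident.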