
This paper proposes a new method for selecting regression models that, unlike traditional model selection procedures aimed at obtaining a single best model, may produce multiple models with sufficient explanatory power and parsimony. The method ensures interpretability of the resulting models even under strong multicollinearity. The algorithm proceeds in a forward stepwise manner, requiring the selected regression models to satisfy two criteria: goodness of fit and the magnitude of the update in the loss function. For the latter criterion, the standardized update is newly introduced; it is closely related to model selection criteria including Mallows' Cp, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Simulation studies demonstrate that the proposed algorithm works well with and without strong multicollinearity, even with many explanatory variables. An application to real data is also provided.
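To make the forward stepwise framework concrete, the following is a minimal sketch of a generic forward stepwise regression that adds, at each step, the variable giving the largest reduction in residual sum of squares and stops when a Gaussian AIC no longer improves. This uses plain AIC as the stopping rule; the paper's standardized-update criterion and its multiple-model output are not reproduced here, and the function name `forward_stepwise` is illustrative, not from the paper.

```python
import numpy as np

def aic(rss, n, k):
    # Gaussian AIC up to an additive constant: n * log(RSS / n) + 2k,
    # where k counts the estimated regression coefficients.
    return n * np.log(rss / n) + 2 * k

def forward_stepwise(X, y):
    """Greedy forward selection: at each step add the explanatory variable
    that most reduces the residual sum of squares; stop when AIC worsens."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    rss = float(np.sum((y - y.mean()) ** 2))  # intercept-only model
    best_aic = aic(rss, n, 1)
    while remaining:
        candidates = []
        for j in remaining:
            cols = np.column_stack(
                [np.ones(n)] + [X[:, k] for k in selected + [j]]
            )
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            r = float(np.sum((y - cols @ beta) ** 2))
            candidates.append((r, j))
        r_best, j_best = min(candidates)
        new_aic = aic(r_best, n, len(selected) + 2)  # intercept + new size
        if new_aic >= best_aic:
            break  # no AIC improvement: stop
        selected.append(j_best)
        remaining.remove(j_best)
        best_aic = new_aic
    return selected
```

In this sketch the stopping rule is the single-model AIC criterion the abstract compares against; the proposed method instead tracks a standardized update of the loss function, which lets it retain several competing models rather than committing to one.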
Journal: Computational Statistics & Data Analysis - Volume 63, July 2013, Pages 31–41