Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
408069 | 678242 | 2011 | 11-page PDF | Free download |
Optimizing the construction of an ensemble of many relatively small and simple classifiers may be more realistic than optimizing the design of a single large and complex classifier. Problems of local minima and slow convergence may be mitigated within an ensemble system, where the decisions obtained from locally optimal component classifiers are integrated. However, it is very difficult to design both the structure of the individual neural networks (NNs) in an ensemble and the architecture of the ensemble as a whole. In the n-Bits Binary Coding ICBP Ensemble System (nBBC-ICBP-ES), the only crucial parameter that must be set a priori is an appropriate number of hidden nodes for the corresponding improved circular back-propagation (ICBP) root model. From this, both the number of individual ICBPs and the architecture of each ICBP component in an nBBC-ICBP-ES can be determined directly. nBBC-ICBP-ES is computationally efficient, has relatively few user-specified parameters, and does not require any manual partitioning of the training data set for its construction. It is easy to understand and implement, while naturally inheriting the benefits of the ICBP root model. Simulation and t-test results on four large-scale benchmark classification data sets demonstrate that, in most cases, nBBC-ICBP-ES significantly outperforms two typical large single ICBPs, Same-2Nh-ICBP-ES (i.e., an ensemble system whose components are identical ICBPs with 2Nh hidden nodes each), and conventional Bagging and AdaBoost ensembles in classification and generalization performance. We conclude that, for NN applications in pattern recognition, assembling many small NNs may be better than using a single large one and, further, assembling many heterogeneous small NNs may be better than assembling many homogeneous ones. The proposed nBBC-ICBP-ES is simple yet efficient and effective, and potentially significant for NN ensemble applications.
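The core mechanism the abstract relies on — integrating the decisions of locally optimal component classifiers — can be illustrated with a minimal majority-vote sketch. This is not the paper's nBBC-ICBP-ES or the ICBP network itself; the toy 1-D threshold classifiers and names such as `majority_vote` and `ensemble_predict` are illustrative assumptions, standing in for heterogeneous small component NNs.

```python
# Illustrative sketch (not the paper's method): majority-vote integration
# of heterogeneous weak component classifiers.
from collections import Counter

def majority_vote(predictions):
    """Return the label that most component classifiers agree on."""
    return Counter(predictions).most_common(1)[0][0]

# Three heterogeneous "components": each applies a different
# threshold rule to a 2-feature input, mimicking locally optimal
# classifiers that individually see only part of the problem.
components = [
    lambda x: 1 if x[0] > 0.5 else 0,
    lambda x: 1 if x[1] > 0.3 else 0,
    lambda x: 1 if x[0] + x[1] > 0.9 else 0,
]

def ensemble_predict(x):
    # Collect each component's decision, then integrate by majority vote.
    return majority_vote([clf(x) for clf in components])

print(ensemble_predict((0.6, 0.2)))  # components vote 1, 0, 0 -> ensemble outputs 0
print(ensemble_predict((0.6, 0.4)))  # components vote 1, 1, 1 -> ensemble outputs 1
```

Even when one component misfires on a given input, the integrated decision can remain correct — the property the abstract credits with mitigating local minima in individual networks.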
Journal: Neurocomputing - Volume 74, Issue 17, October 2011, Pages 3509–3519