Article ID: 978645
Journal: Physica A: Statistical Mechanics and its Applications
Published Year: 2006
Pages: 10
File Type: PDF
Abstract

Network boosting (NB) is an ensemble learning method that combines weak learners over a network and can learn the target hypothesis asymptotically. Experimental results show that NB improves classification accuracy significantly compared with Bagging and AdaBoost. We compare the cumulative margin distributions of the three ensemble learning methods and find that NB inherits the merits of both Bagging and AdaBoost and exhibits higher generalization ability. To explore the influence of network topology on the performance of the algorithm, random graphs, small-world networks, and scale-free networks are employed. An analysis based on network synchronizability shows that the ensemble learned by scale-free-network-based NB is more correlated than the ensembles learned by NB on the other two topologies.
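
To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' exact NB algorithm) of an ensemble in which AdaBoost-style weak learners sit on the nodes of a graph and each node's sample weights are blended with its neighbours' weights after every round. The function name network_boosting, the mixing parameter mix, and the use of decision stumps, networkx topologies, and scikit-learn are assumptions introduced here purely for illustration.

import numpy as np
import networkx as nx
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier


def network_boosting(X, y, graph, rounds=10, mix=0.5, seed=0):
    """Hypothetical network-based ensemble: one weak learner per node per round,
    with each node's sample weights blended with its neighbours' after the round."""
    rng = np.random.default_rng(seed)
    n = len(y)
    y_signed = np.where(y == 1, 1, -1)
    nodes = list(graph.nodes)
    weights = {v: np.full(n, 1.0 / n) for v in nodes}  # per-node sample weights
    ensemble = []                                      # (alpha, learner) pairs

    for _ in range(rounds):
        new_weights = {}
        for v in nodes:
            stump = DecisionTreeClassifier(max_depth=1,
                                           random_state=int(rng.integers(10**9)))
            stump.fit(X, y, sample_weight=weights[v])
            pred = np.where(stump.predict(X) == 1, 1, -1)
            err = np.clip(np.dot(weights[v], pred != y_signed), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)      # AdaBoost-style learner weight
            ensemble.append((alpha, stump))
            # Reweight this node's samples, then blend with the neighbours' weights.
            w = weights[v] * np.exp(-alpha * y_signed * pred)
            w /= w.sum()
            nbrs = list(graph.neighbors(v)) or [v]
            w_nbr = np.mean([weights[u] for u in nbrs], axis=0)
            new_weights[v] = mix * w + (1 - mix) * w_nbr
        weights = new_weights

    def predict(X_query):
        score = sum(a * np.where(m.predict(X_query) == 1, 1, -1) for a, m in ensemble)
        return (score > 0).astype(int)

    return predict


X, y = make_classification(n_samples=500, n_features=20, random_state=0)
topologies = [("random graph", nx.erdos_renyi_graph(20, 0.2, seed=0)),
              ("small-world", nx.watts_strogatz_graph(20, 4, 0.1, seed=0)),
              ("scale-free", nx.barabasi_albert_graph(20, 2, seed=0))]
for name, g in topologies:
    predict = network_boosting(X, y, g, rounds=5)
    print(name, "training accuracy:", (predict(X) == y).mean())

In this sketch, mix = 1 makes every node behave like an independent boosting run, while smaller values let the weight distributions diffuse across the chosen topology, loosely mirroring the topology dependence studied in the abstract.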

Related Topics
Physical Sciences and Engineering > Mathematics > Mathematical Physics