Article ID: 391707
Journal: Information Sciences
Published Year: 2014
Pages: 22
File Type: PDF
Abstract

The Multi-Layer Perceptron (MLP), one of the most widely used Neural Networks (NNs), has been applied to many practical problems. An MLP must be trained for each specific application, a process that often suffers from entrapment in local minima, slow convergence, and sensitivity to initialization. This paper proposes using the recently developed Biogeography-Based Optimization (BBO) algorithm to train MLPs and thereby alleviate these problems. To investigate the efficiency of BBO in training MLPs, five classification datasets and six function-approximation datasets are employed. The results are compared with those of five well-known heuristic algorithms, Back Propagation (BP), and the Extreme Learning Machine (ELM) in terms of entrapment in local minima, accuracy of results, and convergence rate. The results show that training MLPs with BBO is significantly better than using the current heuristic learning algorithms or BP. Moreover, BBO provides very competitive results in comparison with ELM.
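The abstract describes letting BBO search the MLP's weight space directly instead of using gradient-based training. The sketch below illustrates that general idea only; the one-hidden-layer network, the toy sin(x) regression task, the population size, and all BBO parameters are illustrative assumptions, not the paper's experimental setup.

# Minimal sketch: train a one-hidden-layer MLP by optimizing its flattened
# weight vector with a simplified Biogeography-Based Optimization (BBO) loop.
# All sizes and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy function-approximation data: y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 60).reshape(-1, 1)
y = np.sin(X)

N_IN, N_HID, N_OUT = 1, 10, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights and biases

def decode(vec):
    # Unpack a flat vector into MLP weight matrices and bias vectors.
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID];                              i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(vec):
    # Fitness: mean squared error of the MLP encoded by `vec` on the toy data.
    W1, b1, W2, b2 = decode(vec)
    hidden = np.tanh(X @ W1 + b1)   # nonlinear hidden layer
    pred = hidden @ W2 + b2         # linear output layer
    return float(np.mean((pred - y) ** 2))

# Simplified BBO loop: each habitat is one candidate weight vector.
POP, GENS, P_MUT = 30, 200, 0.02
habitats = rng.uniform(-1, 1, size=(POP, DIM))

for gen in range(GENS):
    fitness = np.array([mse(h) for h in habitats])
    habitats = habitats[np.argsort(fitness)]   # best (lowest MSE) first
    ranks = np.arange(POP)
    mu = (POP - ranks) / POP                   # emigration rate: high for good habitats
    lam = 1.0 - mu                             # immigration rate: high for poor habitats

    new = habitats.copy()
    for i in range(1, POP):                    # keep the best habitat (elitism)
        for d in range(DIM):
            if rng.random() < lam[i]:
                # Immigrate feature d from a habitat chosen with probability
                # proportional to its emigration rate.
                src = rng.choice(POP, p=mu / mu.sum())
                new[i, d] = habitats[src, d]
            if rng.random() < P_MUT:
                new[i, d] += rng.normal(scale=0.1)
    habitats = new

best = min(habitats, key=mse)
print(f"final MSE: {mse(best):.4f}")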
