Article ID | Journal | Published Year | Pages
---|---|---|---
6855206 | Expert Systems with Applications | 2018 | 26
Abstract
The Extreme Learning Machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks (SLFNs) that trains effectively and quickly. During training, the ELM assigns the input weights and biases randomly and keeps them fixed throughout the process. Although the network works well, these random input-layer weights may degrade the algorithm's performance. Therefore, we propose a new approach, called RBM-ELM, that determines the input weights and biases for the ELM using a restricted Boltzmann machine (RBM). We compare this approach to the well-known ELM-AE and to the ELM-RO, a state-of-the-art algorithm for selecting the ELM's input weights. The experimental results show that the RBM-ELM achieves better performance than the standard ELM and outperforms both the ELM-AE and the ELM-RO.
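The abstract outlines the idea at a high level; the following is a minimal sketch of how an RBM-initialized ELM could be wired together, assuming scikit-learn's BernoulliRBM for the pretraining step and a sigmoid hidden layer solved by pseudoinverse. The library choice, hyperparameters, and function names are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch of the RBM-ELM idea: train an RBM on the inputs, reuse
# its learned weights and hidden biases as the ELM's input-layer parameters,
# then solve the output weights analytically with the Moore-Penrose pseudoinverse.
import numpy as np
from sklearn.neural_network import BernoulliRBM


def train_rbm_elm(X, y, n_hidden=100):
    """Fit a single-hidden-layer ELM whose input weights come from an RBM."""
    # 1) Train an RBM on the inputs (assumed scaled to [0, 1]).
    rbm = BernoulliRBM(n_components=n_hidden, learning_rate=0.05,
                       n_iter=20, random_state=0)
    rbm.fit(X)
    W = rbm.components_.T          # (n_features, n_hidden) input weights
    b = rbm.intercept_hidden_      # (n_hidden,) hidden biases

    # 2) Hidden-layer activations with a sigmoid, as in the standard ELM.
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # 3) Output weights via the Moore-Penrose pseudoinverse.
    beta = np.linalg.pinv(H) @ y
    return W, b, beta


def predict_rbm_elm(X, W, b, beta):
    """Forward pass of the trained network."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

In contrast, the standard ELM would draw W and b from a random distribution at step 1; the sketch simply swaps that step for RBM pretraining while keeping the rest of the pipeline unchanged.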
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Andre G.C. Pacheco, Renato A. Krohling, Carlos A.S. da Silva