Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6864933 | Neurocomputing | 2018 | 27 Pages |
Abstract
Currently, neural networks deliver state-of-the-art performance on many machine learning tasks, mainly because of their ability to learn features. However, the architecture of a neural network still requires problem-specific tuning, and long training times and hardware requirements remain an issue. In this work, the Multi-Scale Auto-Tuned Extreme Learning Machine (MSATELM) architecture is proposed. It requires no manual feature crafting or architecture tuning and automatically learns both structure and weights, using an auto-tuned Extreme Learning Machine (ELM) as its building block; the result is a simple model that achieves the required accuracy. The GPU implementation in OpenCL handles any number of samples while delivering portable code and high performance. Results on the MNIST, CIFAR-10 and UCI datasets demonstrate that this approach is competitive even though no problem-specific tuning is used.
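The building block named in the abstract, the Extreme Learning Machine, draws its hidden-layer weights at random and fits only the output weights in closed form via least squares. The following NumPy sketch illustrates that basic idea only; it is not the paper's auto-tuned, multi-scale variant, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    """Basic ELM: random fixed hidden layer + least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form fit of output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: approximate y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_train(X, y, n_hidden=50)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because only the linear output layer is solved, training reduces to a single least-squares problem, which is what makes ELMs attractive as fast building blocks compared with gradient-trained networks.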
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Douglas Coimbra de Andrade, Luís Gonzaga Trabasso