| Article code | Journal code | Publication year | Paper (English) | Full text |
---|---|---|---|---|
536296 | 870495 | 2015 | 7-page PDF | Free download |

• We show that kernel ELM classifiers are approximations of infinite SLFNs.
• KELM space is determined by using a low-rank decomposition of the kernel matrix.
• The adoption of this approach enhances the performance of ELM networks.
In this paper, we discuss the connection between the kernel versions of the ELM classifier and infinite Single-hidden Layer Feedforward Neural (SLFN) networks, and we show that the original ELM kernel definition can be adopted for the calculation of the ELM kernel matrix for two of the most common activation functions, namely the RBF and the sigmoid function. In addition, we show that a low-rank decomposition of the kernel matrix defined on the input training data can be exploited to determine an appropriate ELM space for mapping the input data. The ELM space obtained this way can subsequently be used for network training with the original ELM formulation. Experimental results indicate that determining the ELM space via this low-rank decomposition leads to enhanced performance compared to the standard choice, i.e., randomly generated input weights.
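The pipeline described above, computing a kernel matrix on the training data, taking a low-rank decomposition to define the ELM space, and then training the output weights with the usual regularized least-squares ELM solution, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it assumes an RBF kernel and uses an eigendecomposition as the low-rank factorization, and the function names `elm_space_lowrank` and `elm_train` are hypothetical.

```python
import numpy as np

def elm_space_lowrank(X, n_dims, gamma=1.0):
    """Map training data X (n_samples x n_features) to an n_dims-dimensional
    ELM space via a low-rank decomposition of the RBF kernel matrix.
    (Illustrative sketch, not the paper's exact procedure.)"""
    # RBF kernel matrix on the training data
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Eigendecomposition of the symmetric PSD kernel matrix;
    # keep the n_dims leading eigenpairs as the low-rank factor.
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:n_dims]
    w, V = w[idx], V[:, idx]
    # Feature map Phi with Phi @ Phi.T ≈ K (exact when n_dims = n_samples)
    Phi = V * np.sqrt(np.maximum(w, 0.0))
    return Phi

def elm_train(Phi, T, C=1.0):
    """Regularized least-squares output weights, as in the original ELM
    formulation, applied to the low-rank ELM-space representation Phi."""
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + np.eye(d) / C, Phi.T @ T)
```

With the full rank retained, the mapping reproduces the kernel matrix exactly (`Phi @ Phi.T == K` up to floating-point error); choosing `n_dims` smaller than the number of training samples gives the compact ELM space used for training.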
Journal: Pattern Recognition Letters - Volume 54, 1 March 2015, Pages 11–17