Article ID: 405963
Journal: Neurocomputing
Published Year: 2016
Pages: 19
File Type: PDF
Abstract

• We propose a novel hybridization of a higher order neural network (functional link artificial neural network) with self-adaptive harmony search (SAHS) based gradient descent learning (GDL) for non-linear data classification problems.
• The proposed approach exhibits better performance than the other alternative approaches.
• Statistical analysis is performed using several statistical methods (Friedman test, Holm and Hochberg procedures, Tukey test and Dunnett test) under the null hypothesis to show that the proposed method is statistically valid and superior.

In the data classification process involving higher order ANNs, determining the optimal ANN classification model is a herculean task owing to the non-linear nature of real-world datasets. To make matters worse, adjusting the set of weights of an ANN with an appropriate learning algorithm to obtain better classification accuracy is tedious. In this paper, an improved variant of harmony search (HS), called self-adaptive harmony search (SAHS), is combined with gradient descent learning and applied to a functional link artificial neural network (FLANN) for the task of classification in data mining. Using its past experience, SAHS adjusts harmonies according to the maximum and minimum values in the current harmony memory. The powerful combination of this unique strategy of SAHS and the searching capability of gradient descent is used to obtain an optimal set of weights for the FLANN. The proposed method (SAHS–FLANN) is implemented in MATLAB and its results are contrasted with other alternatives (FLANN, GA based FLANN, PSO based FLANN, HS based FLANN, improved HS based FLANN and TLBO based FLANN). To illustrate its effectiveness, SAHS–FLANN is tested on various benchmark datasets from the UCI machine learning repository using the 5-fold cross-validation technique. Under the null hypothesis, the proposed method is analyzed with various statistical tests to establish the statistical correctness of the results. The performance of SAHS–FLANN is found to be better and statistically significant in comparison with the other alternatives. SAHS–FLANN differs from HS–FLANN (HS based FLANN) in its elimination of the constant parameters (bandwidth and pitch adjustment rate). Furthermore, it simplifies the improvisation of weight-sets in IHS–FLANN (improved HS based FLANN) by adjusting new weight-sets according to the weight-sets with maximum and minimum fitness.
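The abstract outlines the mechanism only at a high level. The following is a minimal Python sketch of the general idea: a trigonometric FLANN whose weight-sets are improvised by SAHS (pitch adjustment driven by the maximum and minimum values in the current harmony memory, with no fixed bandwidth or pitch adjustment rate) and refined with a gradient descent step. The paper's implementation is in MATLAB; this sketch, including names such as expand, the harmony memory size, HMCR and the learning rate, is an illustrative assumption rather than the authors' code.

import numpy as np

def expand(x):
    # Trigonometric functional expansion of one input vector (a common FLANN choice).
    return np.concatenate([x, np.sin(np.pi * x), np.cos(np.pi * x)])

def predict(w, X):
    # Single-layer FLANN output with a sigmoid squashing function.
    Z = np.array([expand(x) for x in X])
    return 1.0 / (1.0 + np.exp(-Z @ w))

def mse(w, X, y):
    return np.mean((predict(w, X) - y) ** 2)

def sahs_flann(X, y, hms=20, hmcr=0.9, iters=500, lr=0.1, rng=np.random.default_rng(0)):
    # Harmony memory: each row is one candidate weight-set for the FLANN.
    dim = expand(X[0]).size
    hm = rng.uniform(-1, 1, size=(hms, dim))
    fit = np.array([mse(w, X, y) for w in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                    # memory consideration
                new[j] = hm[rng.integers(hms), j]
                # Self-adaptive pitch adjustment: move the value toward the max/min
                # of this variable in the current harmony memory, replacing the
                # fixed bandwidth and pitch adjustment rate of classical HS.
                if rng.random() < 0.5:
                    new[j] += rng.random() * (hm[:, j].max() - new[j])
                else:
                    new[j] -= rng.random() * (new[j] - hm[:, j].min())
            else:                                      # random re-initialisation
                new[j] = rng.uniform(-1, 1)
        # Gradient descent learning step on the improvised weight-set.
        Z = np.array([expand(x) for x in X])
        out = 1.0 / (1.0 + np.exp(-Z @ new))
        grad = Z.T @ ((out - y) * out * (1 - out)) / len(y)
        new -= lr * grad
        # Greedy replacement of the worst harmony if the new weight-set is better.
        worst = fit.argmax()
        f_new = mse(new, X, y)
        if f_new < fit[worst]:
            hm[worst], fit[worst] = new, f_new
    return hm[fit.argmin()]

In use, the returned weight vector would be evaluated with predict on held-out folds of a UCI dataset, mirroring the 5-fold cross-validation protocol described above.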

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors