Article ID: 406772
Journal: Neurocomputing
Published Year: 2014
Pages: 15
File Type: PDF
Abstract

This work explores annealed cooperative–competitive learning of multiple modules of Mahalanobis normalized radial basis functions (NRBF), with applications to nonlinear function approximation and the approximation of chaotic differential functions. A multilayer neural network is extended to comprise multiple Mahalanobis-NRBF modules. Each module computes normalized outputs of radial basis functions, with Mahalanobis radial distances determined by its own adaptable weight matrix. A cooperative scheme decomposes the task of learning the multi-module network into sub-tasks of learning individual modules. Adaptable network interconnections are updated asynchronously, module by module, by annealed cooperative–competitive learning under a physics-like mean-field annealing process. Numerical simulations show outstanding performance of annealed cooperative–competitive learning of a multi-module Mahalanobis-NRBF network for nonlinear function approximation and long-term look-ahead prediction of chaotic time series.
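
The abstract does not give the exact form of the Mahalanobis-NRBF activation or the annealing schedule, so the following Python sketch is only an illustrative assumption of how a single module might compute normalized RBF outputs with Mahalanobis distances defined by an adaptable weight matrix W; the cooperative–competitive update rules themselves are not shown, and all names (centers, W, coeffs) are hypothetical.

```python
import numpy as np

def mahalanobis_nrbf(x, centers, W):
    """Normalized RBF activations with Mahalanobis distances (illustrative sketch).

    x       : (d,) input vector
    centers : (K, d) RBF centers of one module
    W       : (d, d) module weight matrix; M = W.T @ W acts as a
              positive semi-definite Mahalanobis metric (assumed form)
    Returns a (K,) activation vector that sums to 1 (NRBF normalization).
    """
    diffs = centers - x                   # (K, d) differences to each center
    proj = diffs @ W.T                    # map differences through the module's W
    sq_dist = np.sum(proj ** 2, axis=1)   # squared Mahalanobis radial distances
    act = np.exp(-sq_dist)                # Gaussian radial basis responses
    return act / act.sum()                # normalization step of the NRBF module

def module_output(x, centers, W, coeffs):
    """Module output as an activation-weighted sum of per-basis coefficients."""
    a = mahalanobis_nrbf(x, centers, W)   # (K,) normalized activations
    return a @ coeffs                     # coeffs: (K,) output weights per basis
```

In a multi-module network of this kind, each module would hold its own centers and weight matrix, and the cooperative scheme described in the abstract would let modules be trained one at a time against the residual of the others.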

Related Topics
Physical Sciences and Engineering › Computer Science › Artificial Intelligence
Authors