Article ID: 4948050
Journal: Neurocomputing
Published Year: 2017
Pages: 11
File Type: PDF
Abstract
Although there are many fault tolerant algorithms for neural networks, they usually focus on only one kind of weight failure or node failure. This paper first proposes a unified fault model for describing the concurrent weight and node failure situation, in which open weight fault, open node fault, weight noise, and node noise can occur in a network simultaneously. We then analyze the training set error of radial basis function (RBF) networks under this concurrent weight and node failure situation. Based on this finding, we define an objective function for tolerating concurrent weight and node failures. We then develop two learning algorithms: one for batch mode learning and one for online mode learning. Furthermore, for online mode learning, we derive the convergence conditions for two cases: fixed learning rate and adaptive learning rate.
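The abstract describes a unified fault model in which open weight faults, open node faults, weight noise, and node noise can all strike an RBF network at once. The sketch below illustrates that fault model on a toy RBF regressor and estimates the training set error under concurrent faults by Monte Carlo simulation. It is a minimal sketch only: the fault rates and noise levels (p_open_w, p_open_n, sigma_w, sigma_n), the Gaussian multiplicative noise, and the plain least-squares baseline are illustrative assumptions, not the paper's notation; the paper instead derives the expected error analytically and trains with a dedicated fault-tolerant objective, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_hidden(X, centers, width):
    """Gaussian RBF hidden-layer outputs: phi_ij = exp(-||x_i - c_j||^2 / width)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / width)

def faulty_output(Phi, w, p_open_w, p_open_n, sigma_w, sigma_n, rng=rng):
    """One realization of the concurrent weight/node fault model (illustrative):
    - open weight fault: each output weight stuck at 0 with probability p_open_w
    - open node fault:   each hidden node output zeroed with probability p_open_n
    - weight noise:      multiplicative Gaussian noise with std sigma_w
    - node noise:        multiplicative Gaussian noise on node outputs, std sigma_n
    """
    w_f = w * (1.0 + sigma_w * rng.standard_normal(w.shape))        # weight noise
    w_f = w_f * (rng.random(w.shape) >= p_open_w)                   # open weight fault
    Phi_f = Phi * (1.0 + sigma_n * rng.standard_normal(Phi.shape))  # node noise
    Phi_f = Phi_f * (rng.random(Phi.shape[1]) >= p_open_n)          # open node fault
    return Phi_f @ w_f

# Toy regression problem: noisy sinc function.
X = np.linspace(-5, 5, 100)[:, None]
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)

centers = np.linspace(-5, 5, 20)[:, None]
Phi = rbf_hidden(X, centers, width=1.0)

# Fault-free least-squares weights (a baseline, not the paper's objective).
w_ls = np.linalg.lstsq(Phi, y, rcond=None)[0]

# Monte Carlo estimate of the training-set error under concurrent faults.
mse_faulty = np.mean([
    np.mean((y - faulty_output(Phi, w_ls,
                               p_open_w=0.05, p_open_n=0.05,
                               sigma_w=0.1, sigma_n=0.1)) ** 2)
    for _ in range(1000)
])
print(f"fault-free training MSE:            {np.mean((y - Phi @ w_ls) ** 2):.4f}")
print(f"training MSE under concurrent faults: {mse_faulty:.4f}")
```

Running the sketch shows the training set error rising once faults and noise are injected, which is the quantity a fault-tolerant objective would penalize during training instead of the fault-free error alone.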
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence