Article ID: 407914
Journal: Neurocomputing
Published Year: 2013
Pages: 9
File Type: PDF
Abstract

The sensitivity of a neural network's output to perturbations of its inputs is an important measure for evaluating the network's performance. To make sensitivity a practical tool for designing and implementing Multilayer Perceptrons (MLPs), this paper proposes a general approach to quantifying the sensitivity of MLPs. The sensitivity is defined as the mathematical expectation, taken over all possible inputs, of the absolute output deviation caused by input perturbations, and it is computed in a bottom-up manner: the sensitivity of a single neuron is derived first, and then that of the entire network. The main contribution of the approach is that it relies on only a weak assumption about the input, namely that the input elements are independent of one another, without being restricted to any particular type of distribution, which makes the approach more applicable to real-world problems. Experimental results on both artificial and real datasets demonstrate that the proposed approach is highly accurate.
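The paper's quantification is analytical and bottom-up; the abstract does not give the derivation, so the sketch below is only a brute-force Monte Carlo reference estimate of the same quantity, the expected absolute output deviation under input perturbations with independent input components. All concrete choices (network sizes, activation functions, the uniform input/perturbation distributions, the perturbation magnitude) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Monte Carlo estimate of MLP output sensitivity: E[|y(x + dx) - y(x)|]
# over inputs x with independent components. This is an illustrative
# reference estimate, not the paper's analytical bottom-up computation.

rng = np.random.default_rng(0)

def init_mlp(layer_sizes):
    """Randomly initialise weights and biases for a fully connected MLP."""
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_in, n_out))
        b = np.zeros(n_out)
        params.append((W, b))
    return params

def forward(params, x):
    """Forward pass: tanh hidden layers, sigmoid output layer."""
    a = x
    for W, b in params[:-1]:
        a = np.tanh(a @ W + b)
    W, b = params[-1]
    return 1.0 / (1.0 + np.exp(-(a @ W + b)))

def monte_carlo_sensitivity(params, input_dim, delta=0.05, n_samples=20000):
    """Estimate the expected absolute output deviation due to input perturbations.

    Input components are drawn independently (the only assumption the paper
    places on the input); uniform distributions are used here for concreteness.
    """
    x = rng.uniform(-1.0, 1.0, size=(n_samples, input_dim))
    dx = rng.uniform(-delta, delta, size=(n_samples, input_dim))
    y_clean = forward(params, x)
    y_perturbed = forward(params, x + dx)
    return np.mean(np.abs(y_perturbed - y_clean))

if __name__ == "__main__":
    mlp = init_mlp([4, 8, 1])  # 4 inputs, one hidden layer of 8 units, 1 output
    s = monte_carlo_sensitivity(mlp, input_dim=4)
    print(f"Estimated sensitivity: {s:.4f}")
```

Such a simulation is the kind of empirical baseline an analytical sensitivity formula can be checked against on artificial data, at the cost of many forward passes rather than a closed-form computation.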

Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors