Article ID: 1706559
Journal: Applied Mathematical Modelling
Published Year: 2010
Pages: 15
File Type: PDF
Abstract

Based on the usual approximation scheme, we construct a so-called second-order spiking network with renewal-process inputs, which employs both first- and second-order statistical representations, i.e., the means, variances, and correlations of the synaptic input. We then apply an error-minimization technique to train the network, deriving the corresponding backpropagation learning rule, and thereby present a more biologically plausible Second Order Spiking Perceptron. We show that such a perceptron, even a single neuron, is able to perform complex non-linear tasks such as the XOR problem, which classical single-layer perceptrons cannot perform. Beyond including second-order statistics in its computations, the perceptron offers a key advantage over its predecessors: by introducing a variance term into the error function, it can train not only the output means but also the output variances. As a result, training not only drives the network toward the desired output means but also reduces the output noise (variance), and the trade-off between output mean error and output variance can be tuned, according to the specific learning task, by adjusting the penalty factor in the error function.
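
To make the variance-penalized training idea concrete, below is a minimal Python sketch (not the paper's actual model) of a single "second-order" unit that propagates input means and variances through a sigmoid via a Gaussian delta-method approximation, and is trained on a loss of the assumed form E = (output mean - target)^2 + lambda * output variance, where lambda plays the role of the penalty factor. The nonlinearity, the moment mapping, the toy task, and all names are illustrative assumptions; the paper's perceptron is built on its spiking-neuron approximation scheme, and in particular this simplified unit does not reproduce the single-neuron XOR result.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(w, mu, var):
    # Pre-activation moments under independent inputs (assumed):
    # mean m = w . mu, variance s2 = sum_i w_i^2 * var_i.
    m = w @ mu
    s2 = (w ** 2) @ var
    y = sigmoid(m)
    d1 = y * (1.0 - y)                  # f'(m) for the sigmoid
    # Delta method: output mean ~ f(m), output variance ~ f'(m)^2 * s2.
    return y, d1 ** 2 * s2, s2, d1

def grad(w, mu, var, t, lam):
    # Gradient of E = (y - t)^2 + lam * y_var with respect to the weights.
    y, y_var, s2, d1 = forward(w, mu, var)
    d2 = d1 * (1.0 - 2.0 * y)           # f''(m) for the sigmoid
    g = (2.0 * (y - t) * d1 * mu
         + lam * (2.0 * d1 * d2 * s2 * mu + 2.0 * d1 ** 2 * w * var))
    return g, (y - t) ** 2, y_var

# Hypothetical toy task: OR-like targets on noisy binary input means,
# with a constant bias input appended as the third component.
MU = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
VAR = np.full(MU.shape, 0.05)
T = np.array([0., 1., 1., 1.])

for lam in (0.0, 2.0):
    w = rng.normal(scale=0.5, size=3)
    for _ in range(5000):
        g = sum(grad(w, MU[k], VAR[k], T[k], lam)[0] for k in range(4))
        w -= 0.1 * g                    # plain gradient descent on the summed loss
    stats = [grad(w, MU[k], VAR[k], T[k], lam)[1:] for k in range(4)]
    print("lambda=%.1f  mean sq. error=%.4f  mean output variance=%.4f"
          % (lam, np.mean([e for e, v in stats]), np.mean([v for e, v in stats])))

Running the sketch prints the mean squared error of the output means and the mean output variance for lambda = 0 and lambda = 2; increasing lambda trades a larger mean error for a smaller output variance, mirroring the penalty-factor adjustment described in the abstract.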

Related Topics
Physical Sciences and Engineering > Engineering > Computational Mechanics