Article ID: 410912
Journal: Neurocomputing
Published Year: 2006
Pages: 11
File Type: PDF
Abstract

The interpretation of the weights in a neural network is seldom straightforward. We recently showed that perceptron-based learning yields better brain-wave classification rates than learning based on averaging and optimal filtering. By virtue of our implementation, we are able to interpret the weights as a time series and relate them to prototypes generated by averaging. In this paper, we present results from four closely related linear models: averaging, averaging with filtering, Tikhonov regularization, and a single-layer neural network. We then introduce this time-series interpretation for a Tikhonov-regularized linear model and for a single-layer neural network with a linear transfer function. Using Tikhonov regularization as an example, we show how such an interpretation can be used to gain insight into the mechanisms of various perceptron-based methods.
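The contrast between averaging-based prototypes and regularized linear weights can be illustrated with a short sketch. This is not the paper's implementation; the data, sizes, and regularization constant below are hypothetical, and the point is only that both weight vectors are indexed by time and can therefore be read, and compared, as time series:

```python
# Sketch (assumed setup, not from the paper): two linear "prototypes"
# for binary brain-wave classification on synthetic single-trial data.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 64           # trials x time points (hypothetical sizes)
t = np.linspace(0.0, 1.0, n_samples)
template = np.sin(2 * np.pi * 3 * t)    # assumed underlying evoked response

y = rng.choice([-1.0, 1.0], size=n_trials)                             # class labels
X = np.outer(y, template) + rng.normal(0.0, 2.0, (n_trials, n_samples))  # noisy trials

# 1) Averaging: prototype = mean of class +1 trials minus mean of class -1 trials
w_avg = X[y > 0].mean(axis=0) - X[y < 0].mean(axis=0)

# 2) Tikhonov (ridge) regularization: w = (X^T X + lambda I)^{-1} X^T y
lam = 10.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(n_samples), X.T @ y)

# Both weight vectors are indexed by time point, so each can be plotted
# as a time series and compared against the averaged prototype.
corr = np.corrcoef(w_avg, w_ridge)[0, 1]
print(round(corr, 2))
```

With a strong evoked component in the data, the regularized weights correlate highly with the averaged prototype, which is the kind of relationship the time-series interpretation makes visible.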

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence