Article ID: 408605
Journal: Neurocomputing
Published Year: 2007
Pages: 15
File Type: PDF
Abstract

An extension of the well-known probabilistic neural network (PNN) to the generalized locally recurrent PNN (GLR PNN) is introduced. The GLR PNN is derived from the original PNN by incorporating a fully connected recurrent layer between the pattern and output layers. This extension renders the GLR PNN sensitive to the context in which events occur and therefore capable of identifying temporal and spatial correlations. In the present work, this capability is exploited to improve speaker verification performance. A fast three-step method for training GLR PNNs is proposed: the first two steps are identical to the training of the original PNN, while the third step is based on the differential evolution (DE) optimization method.
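The third training step relies on differential evolution, a population-based global optimizer. The sketch below is a generic DE/rand/1/bin minimizer, not the authors' exact scheme: the population size, F, CR, and the sphere objective are illustrative assumptions; in the paper's setting the objective would be a verification-error criterion over the recurrent-layer weights.

```python
import random

def differential_evolution(objective, dim, bounds=(-1.0, 1.0),
                           pop_size=20, F=0.8, CR=0.9, generations=100,
                           seed=0):
    """Minimal DE/rand/1/bin minimizer (generic sketch, hypothetical
    hyperparameters; not the authors' exact training procedure)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fitness = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation v = a + F*(b - c), then binomial crossover with i;
            # j_rand guarantees at least one mutated component.
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    trial.append(pop[a][j] + F * (pop[b][j] - pop[c][j]))
                else:
                    trial.append(pop[i][j])
            f_trial = objective(trial)
            # Greedy selection: the trial replaces the target if no worse.
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

# Toy stand-in objective: the sphere function (minimum 0 at the origin).
best_w, best_f = differential_evolution(lambda w: sum(x * x for x in w), dim=5)
```

DE needs only objective-function evaluations, no gradients, which is why it suits the recurrent-layer weights here: the verification error is not easily differentiable with respect to them.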

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence