Article ID: 6874412
Journal: Journal of Computational Science
Published Year: 2018
Pages: 42
File Type: PDF
Abstract
We present a new scalable Probabilistic Neural Network (PNN) construction method suited to data-neuron parallelism in a ring pipeline topology, which allows a large-scale distributed model to be trained on a large-scale distributed dataset. First, the recently proposed Kernel Gradient Subtractive Clustering (KG-SC) automatically selects representative exemplar centers, and their number, for the PNN kernels. Expectation Maximization (EM) then refines the PNN parameters. Experimental simulations compare the accuracy and performance of the proposed solution against PNNs produced by other state-of-the-art k-center clustering algorithms. The parallel and distributed implementations achieve near-linear speedups as the number of processors and the dataset size increase.
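As a rough illustration of the kind of model being built, the sketch below implements a plain Gaussian-kernel PNN classifier in Python. It does not reproduce the paper's KG-SC center selection, EM refinement, or ring-pipeline parallelism; the names (`pnn_predict`, `centers_by_class`, `sigma`) and the fixed bandwidth are illustrative assumptions, not the authors' API.

```python
import numpy as np

def pnn_predict(x, centers_by_class, sigma=0.5):
    """Score each class as the mean Gaussian kernel response of x
    over that class's exemplar centers; return the best class index."""
    scores = []
    for centers in centers_by_class:
        # Squared Euclidean distance from x to every center of this class.
        d2 = np.sum((centers - x) ** 2, axis=1)
        scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    return int(np.argmax(scores))

# Toy usage: two classes, each summarized by a handful of exemplar centers.
rng = np.random.default_rng(0)
class0 = rng.normal(loc=0.0, scale=0.3, size=(5, 2))
class1 = rng.normal(loc=2.0, scale=0.3, size=(5, 2))
print(pnn_predict(np.array([0.1, -0.2]), [class0, class1]))  # -> 0
print(pnn_predict(np.array([2.1, 1.9]), [class0, class1]))   # -> 1
```

In the method described above, the exemplar centers would instead come from KG-SC and the kernel parameters would be refined by EM; here both are fixed by hand for brevity.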
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors