Article ID: 487969
Journal: Procedia Computer Science
Published Year: 2013
Pages: 6
File Type: PDF
Abstract

In this paper we present an analysis of, and solutions to, problems related to the initial positioning of neurons in a classic self-organizing map (SOM) neural network. This means we are not concerned with the multitude of growing variants, where new neurons are inserted where needed. Instead, we consider placing the neurons on a Hilbert curve, since SOMs tend to converge toward self-similar curves. Another point of adjustment in a SOM is the initial number of neurons, which depends on the data set. Our investigations show that initializing the neurons on a self-similar curve such as the Hilbert curve provides quality coverage of the input topology in far fewer epochs than the usual random neuron placement. Quality here is measured by the absence of tangles in the network, a one-dimensional SOM trained with the traditional Kohonen algorithm. Tangling in a SOM creates the problem of topologically close neighbors that are actually far apart in the neuron chain of the 1D network, which in turn affects proper clustering and the analysis of cluster labels and classification. We also experiment with, and provide an analysis of, the choice of the number of neurons.
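As a rough illustration of the idea described above, the sketch below initializes a 1D SOM chain along a Hilbert curve in the unit square and then applies a classic Kohonen training loop. This is a minimal sketch under assumed conditions (2D inputs in [0,1]^2, a Gaussian neighborhood, linearly decaying learning rate and radius); the function names hilbert_d2xy, hilbert_init, and train_som_1d are illustrative and not taken from the paper.

```python
import numpy as np

def hilbert_d2xy(order, d):
    # Map index d along a Hilbert curve of the given order to (x, y)
    # coordinates on a (2**order) x (2**order) grid (standard iterative mapping).
    x = y = 0
    t, s, n = d, 1, 2 ** order
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate the quadrant when needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_init(num_neurons, order=5):
    # Place the neurons of a 1D SOM chain at equally spaced points along a
    # Hilbert curve filling the unit square, so chain order follows the curve.
    n = 2 ** order
    ids = np.linspace(0, n * n - 1, num_neurons).astype(int)
    pts = np.array([hilbert_d2xy(order, int(d)) for d in ids], dtype=float)
    return pts / (n - 1)                 # normalize coordinates to [0, 1]^2

def train_som_1d(weights, data, epochs=20, lr0=0.5, radius0=None, rng=None):
    # Classic Kohonen training for a 1D chain: find the best-matching unit,
    # then pull its chain neighbors toward the input with a Gaussian kernel.
    rng = np.random.default_rng() if rng is None else rng
    m = len(weights)
    radius0 = radius0 if radius0 is not None else m / 4.0
    chain = np.arange(m)
    for epoch in range(epochs):
        frac = epoch / max(epochs - 1, 1)
        lr = lr0 * (1.0 - frac)                       # decaying learning rate
        radius = max(radius0 * (1.0 - frac), 1.0)     # decaying neighborhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((chain - bmu) ** 2) / (2.0 * radius ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

# Usage sketch: uniform 2D data, 64 neurons initialized along a Hilbert curve.
data = np.random.default_rng(0).random((2000, 2))
weights = hilbert_init(64, order=5)
weights = train_som_1d(weights, data, epochs=10)
```

Because consecutive neurons start out adjacent on the Hilbert curve, the chain already respects 2D locality before training begins, which is the intuition behind the reduced tangling reported in the abstract.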
