Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
5002820 | IFAC-PapersOnLine | 2016 | 6 Pages |
Abstract
The choice of initial weights is an important aspect of the training mechanism for feedforward neural networks. This paper deals with a particular topology of feedforward neural network, in which symmetric linear saturated activation functions are used in the hidden layer. Training such a topology is a tricky procedure, since the activation functions are not differentiable everywhere. A proper initialization method is therefore even more important here than it is for neural networks with sigmoid activation functions. Several initialization possibilities are examined and tested, and as a result, particular initialization methods are recommended for application according to the class of task to be solved.
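The symmetric linear saturated activation mentioned in the abstract (known as `satlins` in some toolboxes) is the identity on [-1, 1] and saturates at ±1 outside that interval; the kinks at ±1 are exactly the non-differentiable points the abstract refers to. A minimal sketch of this function:

```python
import numpy as np

def satlins(x):
    """Symmetric linear saturated activation.

    Identity on [-1, 1], clipped to -1 below and +1 above.
    Piecewise linear, so not differentiable at x = -1 and x = +1.
    """
    return np.clip(x, -1.0, 1.0)

# Behavior in the three regions:
print(satlins(np.array([-3.0, 0.5, 2.0])))  # -> [-1.   0.5  1. ]
```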
Related Topics
Physical Sciences and Engineering
Engineering
Computational Mechanics
Authors
Petr Dolezel, Pavel Skrabanek, Lumir Gago