Article ID | Journal ID | Publication Year | English Paper | Full-Text Version |
---|---|---|---|---|
534556 | 870265 | 2014 | 14-page PDF | Free download |
• Neural net with adaptive activation functions of (virtually) any generic form.
• Activation functions are learned according to the nature of the data.
• Probabilistic weights are estimated and assigned to the activation functions.
• Experiments on regression and classification tasks.
• Positive results in terms of both performance and model complexity.
Standard feedforward neural networks benefit from the nice theoretical properties of mixtures of sigmoid activation functions, but they may fail in several practical learning tasks. Such tasks would be better addressed by relying on a more appropriate, problem-specific basis of activation functions. The paper presents a connectionist model that exploits adaptive activation functions. Each hidden unit in the network is associated with a specific pair (f(·), p(·)), where f(·) is the activation function and p(·) is the likelihood of the unit being relevant to the computation of the network output over the current input. The function f(·) is optimized in a supervised manner, while p(·) is realized via a statistical parametric model learned through unsupervised (or partially supervised) estimation. Since f(·) and p(·) influence each other's learning process, the overall machine is implicitly a co-trained coupled model and, in turn, a flexible, non-standard neural architecture. The feasibility of the approach is corroborated by empirical evidence from computer simulations involving regression and classification tasks.
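To make the (f(·), p(·)) pairing concrete, below is a minimal sketch of one way such a unit could be realized, assuming f(·) is a per-unit learnable combination of a fixed activation basis and p(·) is a per-unit Gaussian relevance model over the input. All identifiers (`AdaptiveActivationNet`, `coef`, `mu`, etc.) are illustrative, not the paper's own, and for brevity p(·) is trained jointly by backpropagation rather than by the (partially) unsupervised parametric estimation the paper describes.

```python
import torch
import torch.nn as nn

class AdaptiveActivationNet(nn.Module):
    """Sketch: each hidden unit i computes p_i(x) * f_i(a_i), where a_i is the
    unit's pre-activation, f_i is an adaptive activation, and p_i(x) is the
    likelihood that unit i is relevant to the current input x."""

    def __init__(self, n_in, n_hidden, n_out):
        super().__init__()
        self.fc_in = nn.Linear(n_in, n_hidden)
        self.fc_out = nn.Linear(n_hidden, n_out)
        # f(.): per-unit coefficients over a fixed basis {tanh, sigmoid, identity},
        # optimized in a supervised manner with the rest of the network.
        self.coef = nn.Parameter(torch.randn(n_hidden, 3) * 0.1)
        # p(.): per-unit spherical Gaussian over the input space (an assumed
        # stand-in for the paper's statistical parametric model).
        self.mu = nn.Parameter(torch.randn(n_hidden, n_in))
        self.log_sigma = nn.Parameter(torch.zeros(n_hidden))

    def forward(self, x):
        a = self.fc_in(x)                                  # (B, H) pre-activations
        basis = torch.stack(
            [torch.tanh(a), torch.sigmoid(a), a], dim=-1)  # (B, H, 3)
        f = (basis * self.coef).sum(-1)                    # adaptive f_i(a_i)
        # Unnormalized Gaussian relevance p_i(x) of each hidden unit.
        d2 = ((x.unsqueeze(1) - self.mu) ** 2).sum(-1)     # (B, H) squared distances
        p = torch.exp(-0.5 * d2 / torch.exp(self.log_sigma) ** 2)
        return self.fc_out(p * f)                          # gate units by relevance

net = AdaptiveActivationNet(n_in=4, n_hidden=16, n_out=1)
y = net(torch.randn(8, 4))  # -> shape (8, 1)
```

Because p(·) multiplicatively gates each unit's contribution, gradients through f(·) are scaled by the unit's relevance and vice versa, which is one way the two components can influence each other's learning as the abstract describes.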
Journal: Pattern Recognition Letters - Volume 37, 1 February 2014, Pages 178–191