
This paper investigates the relation between over-fitting and weight size in neural network regression, focusing on the over-fitting of a network to Gaussian noise. Through re-parametrization, the network function is written as a bounded function g multiplied by a coefficient c. The squared sum of the outputs of g at the given inputs is then bounded away from zero by a positive constant δ_n, which restricts the weight size of the network and allows a probabilistic upper bound on the degree of over-fitting to be derived. The analysis shows that the order of this upper bound can change depending on δ_n. Applying the bound to the over-fitting behavior of a single Gaussian unit shows that, when the sample size is large, the probability of obtaining an extremely small value of the width parameter during training is close to one.
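
The re-parametrization described above can be summarized as follows. This is a minimal sketch based only on the abstract: the symbols f, w, x_i, the normalization of g, and the exact form of the constraint are assumptions, not taken from the paper itself.

```latex
% Re-parametrized network function: a bounded function g scaled by a
% coefficient c (the bound sup |g| <= 1 is an assumed normalization).
\[
  f(x;\theta) \;=\; c\, g(x;w), \qquad \sup_x |g(x;w)| \le 1,
\]
% Constraint keeping the outputs of g at the sample inputs x_1,...,x_n
% away from zero, which in turn restricts the effective weight size c:
\[
  \sum_{i=1}^{n} g(x_i;w)^2 \;\ge\; \delta_n \;>\; 0 .
\]
```

The claim about a single Gaussian unit can also be illustrated numerically. The sketch below is not the paper's experiment; it fits one Gaussian unit to pure Gaussian noise by least squares and profiles the training error over the width parameter, showing that ever smaller widths keep lowering the training error, i.e. the degenerate-width over-fitting the abstract refers to.

```python
# Minimal numerical sketch (hypothetical, not from the paper): fit a single
# Gaussian unit  f(x) = c * exp(-(x - mu)^2 / (2 * sigma^2))  to pure
# Gaussian noise by least squares, profiling the training error over the
# width sigma. The center mu is restricted to the sample inputs and c is
# solved in closed form, so only sigma is scanned.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)          # fixed design points
y = rng.normal(0.0, 1.0, size=n)      # targets are pure Gaussian noise


def best_train_error(sigma):
    """Smallest training SSE over mu in {x_i}, with c solved in closed form."""
    best = np.inf
    for mu in x:
        g = np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))
        c = (g @ y) / (g @ g)          # least-squares coefficient for this (mu, sigma)
        best = min(best, np.sum((y - c * g) ** 2))
    return best


for sigma in [1.0, 0.1, 0.01, 0.001]:
    print(f"sigma = {sigma:7.3f}   training SSE = {best_train_error(sigma):.4f}")

# Typically the training error keeps decreasing as sigma shrinks: the unit
# collapses onto a single noise point, which is the extremely small width
# behavior described in the abstract.
```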
Journal: Neural Networks - Volume 21, Issue 1, January 2008, Pages 48–58