Article ID: 4946988
Journal: Neurocomputing
Published Year: 2017
Pages: 26
File Type: PDF
Abstract
A new approach based on wavelet sampling is proposed to overcome overfitting in neural networks. Our approach optimizes the input weights and network structure according to the empirical distribution of the input training data, so only the output weights are adjusted from training-data errors. Because the algorithm trains input and output weights in independent procedures, our theorems show that it converges rapidly and globally. More importantly, we define a new norm on the l2 space, which corresponds to a useful new cost function; with this cost function, the algorithm improves the network's ability to distinguish the target function from noise. In fact, we prove that the algorithm allows neural networks to act as wavelet filters, yielding good generalization, approximation, and anti-noise capacities. Our simulations verify these theoretical results and show that the algorithm is robust to noise.
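The core idea — fix the input weights up front and solve only the output weights from the training errors — can be illustrated with a minimal sketch. This is not the paper's algorithm: here the input weights and biases are drawn randomly rather than by the paper's empirical-distribution wavelet sampling, and the Ricker ("Mexican hat") wavelet is an assumed choice of hidden activation. With the hidden layer frozen, fitting reduces to a single regularized least-squares solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: noisy samples of a target function.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Input weights and biases are fixed before training (randomly here;
# the paper instead samples them from the empirical distribution of
# the training inputs via wavelet sampling).
n_hidden = 50
W = rng.standard_normal((X.shape[1], n_hidden))
b = rng.uniform(-3, 3, n_hidden)

def ricker(t):
    # Ricker ("Mexican hat") wavelet used as the hidden activation.
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

H = ricker(X @ W + b)  # hidden-layer outputs, shape (200, n_hidden)

# Only the output weights are trained, by regularized least squares:
# a single linear solve, so convergence is immediate and global.
lam = 1e-3
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

y_hat = H @ beta
mse = np.mean((y - y_hat) ** 2)
```

Because the hidden layer never changes, there is no backpropagation and no risk of getting stuck in a local minimum of the output-weight fit; the ridge term `lam` stands in for the role the paper's redefined l2-norm cost plays in suppressing noise.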
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors