Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
410791 | Neurocomputing | 2008 | 6 Pages | 
Abstract
We present an investigation of the generalization ability of finite-size perceptrons with binary couplings. The results for the expected generalization error provide a guide for practical applications by establishing limits on the learning capacity of finite systems. Solutions were found with a genetic algorithm, which proved efficient even for values of α larger than the Gardner–Derrida storage capacity α_GD = 1.245, beyond which the number of solutions is greatly reduced. We show that the generalization error of finite-size networks for α up to α_GD coincides with the value calculated through the statistical-mechanics analysis in the thermodynamic limit.
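The setting in the abstract — a student perceptron with binary (±1) couplings trained by a genetic algorithm to reproduce a teacher's labels — can be sketched as below. This is a minimal illustrative toy, not the authors' implementation: the population size, mutation rate, tournament selection, and one-point crossover are all assumed choices, and the system size N = 21 and P = 26 examples (α ≈ 1.24, near α_GD) are purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 21     # number of binary couplings (odd, so the local field is never zero)
P = 26     # training examples; alpha = P / N ~ 1.24 (illustrative, near alpha_GD)
POP = 60   # assumed population size
GENS = 200 # assumed generation budget

# Teacher perceptron with binary couplings; students must reproduce its labels.
teacher = rng.choice([-1, 1], size=N)
X = rng.choice([-1, 1], size=(P, N))
y = np.sign(X @ teacher)

def fitness(w):
    """Fraction of training examples classified exactly as the teacher does."""
    return np.mean(np.sign(X @ w) == y)

pop = rng.choice([-1, 1], size=(POP, N))
for gen in range(GENS):
    scores = np.array([fitness(w) for w in pop])
    if scores.max() == 1.0:     # a student compatible with all examples was found
        break
    # Tournament selection: each child slot gets the fitter of two random members.
    idx = rng.integers(0, POP, size=(POP, 2))
    winners = np.where(scores[idx[:, 0]] >= scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # One-point crossover between each parent and a randomly permuted mate.
    mates = parents[rng.permutation(POP)]
    cut = rng.integers(1, N, size=POP)
    mask = np.arange(N)[None, :] < cut[:, None]
    children = np.where(mask, parents, mates)
    # Mutation: flip each coupling with a small assumed probability.
    flips = rng.random((POP, N)) < 0.02
    pop = np.where(flips, -children, children)

best = pop[np.argmax([fitness(w) for w in pop])]

# Estimate the generalization error on fresh random inputs.
X_test = rng.choice([-1, 1], size=(2000, N))
gen_error = np.mean(np.sign(X_test @ best) != np.sign(X_test @ teacher))
print(f"training fitness = {fitness(best):.3f}, est. generalization error = {gen_error:.3f}")
```

In the paper's setup the quantity of interest is the average of this generalization error over many teacher/example realizations, compared against the thermodynamic-limit prediction; the sketch shows a single realization only.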
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
D.M.L. Barbato, J.J. De Groote