Article ID: 9653422
Journal: Neurocomputing
Published Year: 2005
Pages: 16
File Type: PDF
Abstract
Many hierarchical learning machines, such as neural networks and normal mixtures, are singular learning machines. In such a machine, the likelihood function cannot be approximated by any quadratic form, so conventional statistical theory does not hold. This paper proves a symmetry between the generalization and training errors using an algebraic-geometrical method. First, a new parameterization is introduced by applying the resolution of singularities. Second, the asymptotic behavior of the likelihood function is clarified using empirical process theory. Finally, the asymptotic forms of the generalization and training errors are derived. The result provides a mathematical foundation for model selection and hypothesis testing in singular learning machines.
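For context only (this baseline comes from standard regular-model asymptotics, not from the paper itself): when a regular model with d parameters is fitted by maximum likelihood, the expected generalization error G_n and training error T_n, measured as Kullback-Leibler divergences from the true distribution, form a symmetric pair. The abstract's claim is that an analogous symmetry survives in singular machines, where this quadratic-approximation argument breaks down. A minimal sketch of the regular baseline, written in LaTeX with hedged comments:

% Regular-model baseline (standard asymptotics; the paper's singular-case
% constants are NOT reproduced here). d = parameter dimension, n = sample size.
\begin{align*}
  \mathbb{E}[G_n] &= \frac{d}{2n} + o\!\left(\frac{1}{n}\right), \\
  \mathbb{E}[T_n] &= -\frac{d}{2n} + o\!\left(\frac{1}{n}\right).
\end{align*}
% Hence E[G_n] + E[T_n] = o(1/n): the two errors are asymptotically symmetric
% about zero. In singular machines the constant d/2 is replaced by a
% model-dependent quantity obtained via the resolution of singularities,
% as described in the abstract above.

The symmetry is what makes quantities computable from training data useful for model selection: the gap between training and generalization error has a predictable asymptotic size, which is the reasoning behind information criteria in the regular case.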
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors