Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
10334728 | Theoretical Computer Science | 2005 | 33 |
Abstract
Afterwards, we consider the objective of minimizing the failure ratio in the presence of misclassification errors. We show that it is NP-hard to approximate the failure ratio within any positive constant for a multilayered threshold network with varying input dimension and a fixed number of neurons in the hidden layer, provided the thresholds of the neurons in the first hidden layer are zero. Furthermore, even obtaining weak approximations is almost NP-hard in the same situation.
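For readers unfamiliar with the term, the failure ratio used in this line of work is the number of sample points misclassified by the returned network divided by the smallest number of misclassifications achievable by any network of the prescribed architecture. The sketch below records this definition together with a two-layer threshold architecture whose first-layer thresholds are zero; the notation (sample S, hidden-unit count k, weights w_i and v_i, output threshold theta) is illustrative and not taken from the paper itself.

```latex
% Illustrative notation only; the symbols S, k, w_i, v_i, theta are assumptions,
% not taken from the paper.
% Two-layer threshold network on inputs x in R^n (n varying, k fixed),
% with the first-layer thresholds set to zero:
\[
  N(x) \;=\; \mathrm{sgn}\Bigl( \sum_{i=1}^{k} v_i \, \mathrm{sgn}(w_i \cdot x) \;-\; \theta \Bigr).
\]
% Failure ratio of a hypothesis N on a labeled sample S, measured against the
% best network N' of the same architecture:
\[
  \mathrm{fr}(N, S) \;=\;
  \frac{\bigl|\{ (x, y) \in S : N(x) \neq y \}\bigr|}
       {\min_{N'} \bigl|\{ (x, y) \in S : N'(x) \neq y \}\bigr|}.
\]
```

On this reading, "NP-hard to approximate the failure ratio within any positive constant" means that for every constant c, no polynomial-time learner is guaranteed to output a network with fr(N, S) at most c, unless P = NP.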
Related Topics
Physical Sciences and Engineering > Computer Science > Computational Theory and Mathematics
Authors
Bhaskar DasGupta, Barbara Hammer