Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
10326498 | Neural Networks | 2011 | 7 Pages | |
Abstract
We consider the optimal rate of approximation by single hidden layer feed-forward neural networks on the unit sphere. It is proved that there exists a neural network with n neurons and an analytic, strictly increasing, sigmoidal activation function such that the deviation of the Sobolev class $W_2^{2r}(\mathbb{S}^d)$ from the class of neural networks $\Phi_{n,\varphi}$ behaves asymptotically as $n^{-2r/(d-1)}$. Namely, we prove that the essential rate of approximation by spherical neural networks is $n^{-2r/(d-1)}$.
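A schematic restatement of the abstract's claim may help; here dist denotes the worst-case $L_2(\mathbb{S}^d)$ deviation of the Sobolev ball from the network class (this notation is an assumption for illustration, not taken from the paper):

```latex
% Sketch of the stated result (notation assumed): the deviation of the Sobolev
% class W_2^{2r}(S^d) from the class Phi_{n,phi} of single-hidden-layer networks
% with n neurons and activation phi is of order n^{-2r/(d-1)}, up to constants.
\[
  \operatorname{dist}\bigl(W_2^{2r}(\mathbb{S}^d),\,\Phi_{n,\varphi}\bigr)_{L_2}
  \;:=\;
  \sup_{f \in W_2^{2r}(\mathbb{S}^d)}\;
  \inf_{g \in \Phi_{n,\varphi}}\;
  \|f - g\|_{L_2(\mathbb{S}^d)}
  \;\asymp\;
  n^{-\frac{2r}{d-1}}.
\]
```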
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Shaobo Lin, Feilong Cao, Zongben Xu