Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
4946651 | Neural Networks | 2017 | 17 Pages | |
Abstract
In this paper, we analyze the approximation abilities of shallow networks in reproducing kernel Hilbert spaces (RKHSs). We prove that there exists a probability measure under which the achievable lower bound for approximation by shallow nets is attained, with high probability, for all functions in balls of an RKHS, which differs from the classical minimax approximation error estimates. Together with existing approximation results for deep nets, this result reveals the limitations of shallow nets and provides a theoretical explanation of why deep nets perform better than shallow nets.
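To clarify the contrast the abstract draws, the classical minimax estimate bounds only the worst-case error over the function class, whereas the paper's result concerns the error of each individual function. A sketch of the distinction, with generic notation (the RKHS ball $B$, the set $\mathcal{M}_n$ of shallow nets with $n$ neurons, and the rate $\varepsilon_n$ are illustrative placeholders, not the paper's exact symbols):

```latex
% Classical minimax error: worst case over the ball B
\mathrm{dist}(B, \mathcal{M}_n)
  = \sup_{f \in B} \inf_{g \in \mathcal{M}_n} \|f - g\|
  \geq \varepsilon_n .

% Individual lower bound (the paper's flavor of result):
% under some measure, with high probability EVERY f in B satisfies
\inf_{g \in \mathcal{M}_n} \|f - g\| \geq c\,\varepsilon_n
  \quad \text{for all } f \in B .
```

The first statement only guarantees that *some* hard function attains the rate; the second says the lower bound holds simultaneously for (almost) all functions in the ball, which is the stronger limitation on shallow nets.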
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Shao-Bo Lin,