Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
407437 | 678140 | 2016 | 7-page PDF | Free download |
Model complexities of shallow (i.e., one-hidden-layer) networks representing highly varying multivariable {−1,1}-valued functions are studied in terms of variational norms tailored to dictionaries of network units. It is shown that bounds on these norms define classes of functions computable by networks with constrained numbers of hidden units and sizes of output weights. Estimates of probabilistic distributions of values of variational norms with respect to typical computational units, such as perceptrons and Gaussian kernel units, are derived via a geometric characterization of variational norms combined with the probabilistic Chernoff bound. It is shown that almost any randomly chosen {−1,1}-valued function on a sufficiently large d-dimensional domain has variation with respect to perceptrons that grows exponentially with d.
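The following is a minimal numerical sketch, not taken from the paper, of the type of argument the abstract describes: using the standard lower bound on variation, ‖f‖_G ≥ ‖f‖² / max_{g∈G} |⟨f, g⟩|, a random {−1,1}-valued function whose inner products with all dictionary elements concentrate near zero (a Chernoff-type effect) must have large variation. The domain {0,1}^d, the counting-measure inner product, the sampled signum perceptrons standing in for the full dictionary, and all sizes and seeds are illustrative assumptions.

```python
import numpy as np
from itertools import product

# Illustrative sketch (assumptions, not the paper's computation):
# estimate a lower bound on the variation of a random {-1,1}-valued
# function with respect to a *sample* of signum perceptrons, using
#   ||f||_G >= ||f||^2 / max_{g in G} |<f, g>|.

rng = np.random.default_rng(0)

d = 10                                        # assumed input dimension
X = np.array(list(product([0.0, 1.0], repeat=d)))   # Boolean cube {0,1}^d
m = X.shape[0]                                # number of domain points, 2**d

f = rng.choice([-1.0, 1.0], size=m)           # random {-1,1}-valued function on X

# Randomly sampled signum perceptrons g(x) = sign(w.x + b) as a stand-in
# for the perceptron dictionary (a finite sample, so the bound is heuristic).
n_units = 5000
W = rng.standard_normal((n_units, d))
b = rng.standard_normal(n_units)
G = np.sign(X @ W.T + b)                      # shape (m, n_units)
G[G == 0] = 1.0                               # resolve ties to +1

# Normalized inner products <f, g> / m; Chernoff-type concentration keeps
# these small with high probability, which forces the variation to be large.
inner = np.abs(f @ G) / m
lower_bound = 1.0 / inner.max()               # = ||f||^2 / max |<f, g>| under counting measure

print(f"max normalized |<f, g>| over sampled perceptrons: {inner.max():.4f}")
print(f"implied lower bound on perceptron-variation of f: {lower_bound:.1f}")
```

Because only a finite sample of perceptrons is inspected, the printed value only suggests the order of magnitude of the bound; the paper's estimates quantify this concentration over the whole dictionary and show how the resulting lower bound scales with d.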
Journal: Neurocomputing - Volume 171, 1 January 2016, Pages 598–604