Article ID: 10326108
Journal: Neural Networks
Published Year: 2005
Pages: 16
File Type: PDF
Abstract
In this paper, we study a natural extension of multi-layer perceptrons (MLPs) to functional inputs. We show that fundamental results for classical MLPs carry over to functional MLPs. We obtain universal approximation results showing that the expressive power of functional MLPs is comparable to that of numerical MLPs, and consistency results implying that the estimation of optimal parameters for functional MLPs is statistically well defined. Finally, we show on simulated and real-world data that the proposed model performs very satisfactorily.
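To make the idea concrete, below is a minimal sketch of one common way to realize an MLP on functional inputs: each first-layer neuron computes sigma(b + integral of w(t) x(t) dt), with the weight function w expanded on a fixed basis and the integral approximated by quadrature on the sampling grid. This is an illustrative reconstruction, not the paper's exact formulation; all names (t_grid, n_basis, W_fun, etc.) are hypothetical.

```python
# Sketch of a functional MLP with one functional hidden layer,
# assuming inputs are functions observed on a common sampling grid.
import numpy as np

rng = np.random.default_rng(0)

# Sampling grid on [0, 1] and a batch of 32 functional inputs observed on it.
t_grid = np.linspace(0.0, 1.0, 101)
X = np.sin(2 * np.pi * np.outer(rng.uniform(0.5, 2.0, size=32), t_grid))

# Weight functions of the 5 first-layer neurons, expanded on a small
# polynomial basis: w_j(t) = sum_k C[j, k] * t**k.
n_hidden, n_basis = 5, 4
C = rng.normal(scale=0.5, size=(n_hidden, n_basis))
basis = np.stack([t_grid**k for k in range(n_basis)])   # (n_basis, n_grid)
W_fun = C @ basis                                        # (n_hidden, n_grid)
b1 = rng.normal(size=n_hidden)

# First layer: trapezoidal approximation of integral w_j(t) x_i(t) dt.
inner = np.trapz(X[:, None, :] * W_fun[None, :, :], t_grid, axis=-1)
H = np.tanh(inner + b1)                                  # hidden activations

# Second (purely numerical) layer producing one scalar output per function.
w2, b2 = rng.normal(size=n_hidden), 0.0
y = H @ w2 + b2
print(y.shape)  # (32,)
```

Because the basis coefficients C (together with b1, w2, b2) are ordinary finite-dimensional parameters, such a model can be trained by the same gradient-based estimation used for numerical MLPs, which is what makes the consistency question mentioned in the abstract well posed.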
Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence
Authors