Article ID Journal Published Year Pages File Type
430202 Journal of Computer and System Sciences 2016 19 Pages PDF
Abstract

• We characterize the attractor-based expressive power of several models of recurrent neural networks.
• The deterministic rational-weighted networks are Muller Turing machine equivalent.
• The deterministic real-weighted and evolving networks recognize the class BC(Π⁰₂) of neural ω-languages.
• The nondeterministic rational- and real-weighted networks recognize the class Σ¹₁ of neural ω-languages.

We provide a characterization of the expressive power of several models of deterministic and nondeterministic first-order recurrent neural networks according to their attractor dynamics. The expressive power of a neural network is measured as the topological complexity of its underlying neural ω-language, and reflects the network's ability to perform more or less complicated classification tasks through the manifestation of specific attractor dynamics. In this context, we prove that most of the neural models under consideration are strictly more powerful than Muller Turing machines. These results yield new insights into the computational capabilities of recurrent neural networks.
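To make the attractor-based acceptance notion concrete, here is a minimal illustrative sketch (not code from the paper): a deterministic recurrent network with Boolean activations and rational weights is run on an eventually periodic input stream, the set of states visited infinitely often (its "attractor") is computed, and the stream is accepted iff that set belongs to a given Muller table. The specific network, weights, and Muller table below are hypothetical toy choices for illustration only.

```python
from fractions import Fraction

def step(state, bit, W_in, W_rec):
    """One synchronous update with rational weights and a hard threshold at 1."""
    n = len(state)
    return tuple(
        1 if sum(W_rec[i][j] * state[j] for j in range(n))
             + W_in[i] * bit >= 1 else 0
        for i in range(n)
    )

def attractor(prefix, period, W_in, W_rec, n):
    """Set of states visited infinitely often on the ω-word prefix·period^ω.

    Because the state space is finite and the input is eventually periodic,
    the run of (state, position-in-period) pairs must enter a cycle; the
    states on that cycle are exactly those visited infinitely often.
    """
    state = (0,) * n
    for bit in prefix:
        state = step(state, bit, W_in, W_rec)
    seen, trace, pos = {}, [], 0
    while (state, pos) not in seen:
        seen[(state, pos)] = len(trace)
        state = step(state, period[pos], W_in, W_rec)
        pos = (pos + 1) % len(period)
        trace.append(state)
    start = seen[(state, pos)]
    return frozenset(trace[start:])  # states on the cycle = the attractor

# Toy 2-neuron net: neuron 0 copies the current input bit,
# neuron 1 latches on once neuron 0 has ever fired.
W_in  = [Fraction(1), Fraction(0)]
W_rec = [[Fraction(0), Fraction(0)],
         [Fraction(1), Fraction(1)]]

# Hypothetical Muller table: accept iff both neurons fire cofinally.
muller_table = {frozenset({(1, 1)})}

print(attractor([0, 1], [1], W_in, W_rec, n=2) in muller_table)  # → True
print(attractor([], [0], W_in, W_rec, n=2) in muller_table)      # → False
```

On the all-ones tail the run settles into the single state (1, 1), which is in the table, so the stream is accepted; on the all-zero stream the attractor is {(0, 0)} and the stream is rejected. The paper's real- and evolving-weight models go beyond this finite-state picture, which is why their recognized classes exceed the Muller Turing machine level.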

Related Topics
Physical Sciences and Engineering Computer Science Computational Theory and Mathematics
Authors