| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 6866439 | Neurocomputing | 2014 | 35 | |
Abstract
Reservoir Computing (RC) is an effective approach to designing and training recurrent neural networks, and it has been applied successfully and widely to real-valued time-series modeling tasks. However, RC has been criticized for not being principled enough: in particular, the reservoir is unlikely to be optimal because its connectivity and weight structure are generated randomly. A Simple Cycle Reservoir Network (SCRN), whose connectivity and weight structure are constructed deterministically, can yield performance competitive with the standard Echo State Network (ESN). To determine a proper reservoir size and improve the generalization ability of the SCRN, a Sensitive Iterated Pruning Algorithm (SIPA) is proposed to optimize the reservoir size and weights: a larger-than-necessary reservoir is employed first, and its size is then reduced by pruning out the least sensitive internal units. A system identification task and two time-series benchmark tasks are used to demonstrate the feasibility and superiority of SIPA. The results show that SIPA significantly outperforms a Least Angle Regression (LAR) method and is able to improve the generalization performance of the SCRN. In addition, two well-known reservoir characterizations, the pseudo-Lyapunov exponent of the reservoir dynamics and the Memory Capacity, are investigated, together with the impact of SIPA on both.
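As a rough illustration of the ideas summarized above, the Python sketch below builds a simple cycle reservoir (a single ring of units with equal cycle weights and input weights of fixed magnitude) and performs iterated sensitivity-based pruning with readout retraining. The sensitivity proxy used here (absolute readout weight times activation standard deviation), the pseudo-random input-sign pattern, and the ring re-wiring after pruning are assumptions made for illustration, not the paper's exact SIPA procedure; all function names are hypothetical.

```python
import numpy as np

def build_scr(n_units, n_in, r=0.9, v=0.5):
    """Simple Cycle Reservoir: units on a ring, every cycle weight equal
    to r, input weights of fixed magnitude v. The sign pattern here is
    pseudo-random; the paper's deterministic scheme may differ."""
    W = np.zeros((n_units, n_units))
    for i in range(n_units):
        W[(i + 1) % n_units, i] = r               # single cycle i -> i+1
    signs = np.sign(np.random.default_rng(0).standard_normal((n_units, n_in)))
    return W, v * signs

def run_reservoir(W, V, inputs):
    """Collect tanh reservoir states for an input sequence of shape (T, n_in)."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W @ x + V @ u)
        states.append(x.copy())
    return np.asarray(states)

def ridge_readout(states, targets, lam=1e-6):
    """Train the linear readout by ridge regression."""
    X = states
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ targets)

def prune_least_sensitive(W, V, states, w_out, r, n_prune=1):
    """One SIPA-style iteration under an assumed sensitivity proxy:
    score each unit by |readout weight| * activation std, drop the
    lowest-scoring units, and rebuild the cycle over the survivors."""
    sens = np.abs(w_out).ravel() * states.std(axis=0)
    keep = np.sort(np.argsort(sens)[n_prune:])    # surviving unit indices
    W_new, _ = build_scr(len(keep), V.shape[1], r=r)  # re-close the ring
    return W_new, V[keep], keep

# Hypothetical usage: start larger than necessary, prune iteratively,
# retraining the readout after each removal.
T = 2000
u = np.sin(0.2 * np.arange(T))[:, None]           # toy input sequence
y = np.roll(u[:, 0], -1)                          # one-step-ahead target
W, V = build_scr(n_units=100, n_in=1, r=0.9, v=0.5)
for _ in range(50):                               # shrink 100 -> 50 units
    S = run_reservoir(W, V, u)
    w_out = ridge_readout(S, y)
    W, V, _ = prune_least_sensitive(W, V, S, w_out, r=0.9)
```

In practice one would monitor validation error after each pruning step and stop shrinking once it begins to rise, which is how a pruning scheme of this kind can settle on a proper reservoir size rather than a preset one.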
Related Topics
Physical Sciences and Engineering
Computer Science
Artificial Intelligence
Authors
Heshan Wang, Xuefeng Yan