Article ID: 428620
Journal: Information Processing Letters
Published Year: 2011
Pages: 7 Pages
File Type: PDF
Abstract

By adopting different activation functions, a class of gradient-based neural networks (GNN) is developed and presented for the online solution of the Lyapunov matrix equation. Theoretical analysis shows that any monotonically increasing odd activation function can be used to construct such neural networks, and that the improved neural models are globally convergent. For the convenience of hardware realization, a schematic circuit is given for the improved neural solvers. Computer-simulation results further substantiate that the improved neural networks solve the Lyapunov matrix equation accurately and effectively. Moreover, when power-sigmoid activation functions are used, the improved neural networks converge faster than their linear counterparts.
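The gradient-based approach described above can be sketched in code. The residual of the Lyapunov equation A^T X + X A + C = 0 defines an energy ‖E‖²_F/2, and the network evolves X along the negative gradient with an elementwise activation applied to E. The sketch below is an illustration under stated assumptions, not the paper's exact model: the power-sigmoid parameters (p, ξ), the learning rate gamma, and the explicit Euler discretization are all choices made here for demonstration.

```python
import numpy as np

def power_sigmoid(E, p=3, xi=4.0):
    # One common form of the elementwise power-sigmoid activation
    # (monotonically increasing and odd); p and xi are assumed values.
    S = (1 + np.exp(-xi)) / (1 - np.exp(-xi)) \
        * (1 - np.exp(-xi * E)) / (1 + np.exp(-xi * E))
    return np.where(np.abs(E) >= 1, E**p, S)

def gnn_lyapunov(A, C, gamma=10.0, dt=1e-3, steps=5000):
    """Gradient neural-network sketch for A^T X + X A + C = 0.

    Integrates dX/dt = -gamma * (A F(E) + F(E) A^T) with explicit Euler,
    where E = A^T X + X A + C and F is the activation function.
    """
    n = A.shape[0]
    X = np.zeros((n, n))
    for _ in range(steps):
        E = A.T @ X + X @ A + C          # equation residual
        FE = power_sigmoid(E)            # activated residual
        X -= gamma * dt * (A @ FE + FE @ A.T)  # negative-gradient step
    return X
```

For a stable matrix A (all eigenvalues in the open left half-plane) and symmetric C, the iteration drives the residual toward zero; for example, with A = [[-2, 0], [1, -3]] and C = I the state X converges to the symmetric solution of the Lyapunov equation.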

► Different activation functions are investigated for the improved GNN models.
► An improved GNN model is exploited for the Lyapunov matrix equation.
► The improved GNN models are theoretically proved to be exponentially convergent.
► The block diagram and its schematic circuit are drawn for such GNN models.

Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics
Authors