Article ID: 427134
Journal: Information Processing Letters
Published Year: 2013
Pages: 6
File Type: PDF
Abstract

• An improved GNN model is exploited for the Lyapunov matrix equation.
• The Lyapunov equation can be decomposed into sub-equations solved by the presented GNN.
• An illustrative example is presented to verify the improved GNN.

By using the hierarchical identification principle and based on conventional gradient search, two neural subsystems are developed and investigated for the online solution of the well-known Lyapunov matrix equation. Theoretical analysis shows that, with any monotonically increasing odd activation function, the gradient-based neural network (GNN) models solve the Lyapunov equation exactly and efficiently. Computer simulation results confirm that the states of the presented GNN models globally converge to the solution of the Lyapunov matrix equation. Moreover, with power-sigmoid activation functions, the GNN models exhibit superior convergence compared to those using linear activation functions.
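As a rough illustration of the approach described in the abstract, the sketch below integrates a gradient-based neural flow dX/dt = -γ(AΦ(E) + Φ(E)Aᵀ) with residual E = AᵀX + XA + C and a power-sigmoid activation Φ, using simple Euler steps. This is a minimal assumption-based sketch, not the paper's actual formulation: the hierarchical decomposition into two neural subsystems is not reproduced, and the names and parameters (gnn_lyapunov, power_sigmoid, gamma, dt, steps) are illustrative only.

import numpy as np

def power_sigmoid(e, p=3, xi=4.0):
    # Power-sigmoid activation (elementwise, illustrative form): odd power for
    # |e| >= 1, scaled sigmoid for |e| < 1; monotonically increasing and odd.
    sig = ((1 + np.exp(-xi)) / (1 - np.exp(-xi))
           * (1 - np.exp(-xi * e)) / (1 + np.exp(-xi * e)))
    return np.where(np.abs(e) >= 1.0, e ** p, sig)

def gnn_lyapunov(A, C, gamma=5.0, dt=1e-3, steps=20000):
    """Euler-discretised gradient-style neural flow for A^T X + X A + C = 0."""
    X = np.zeros_like(A, dtype=float)
    for _ in range(steps):
        E = A.T @ X + X @ A + C            # residual of the Lyapunov equation
        PE = power_sigmoid(E)              # activation-modulated residual
        X -= gamma * dt * (A @ PE + PE @ A.T)  # descent direction of 0.5*||E||_F^2
    return X

if __name__ == "__main__":
    A = np.array([[-2.0, 1.0],
                  [0.0, -3.0]])            # stable A, so the solution is unique
    C = np.eye(2)
    X = gnn_lyapunov(A, C)
    print("residual norm:", np.linalg.norm(A.T @ X + X @ A + C))

Running the script drives the residual norm toward zero, mirroring the global convergence behaviour the abstract attributes to the GNN models; larger design parameter gamma (with a correspondingly small step dt) speeds up convergence.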

Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics
Authors