Article ID: 408953
Journal: Neurocomputing
Published Year: 2016
Pages: 10
File Type: PDF
Abstract

This paper focuses on finite-time recurrent neural networks with continuous but non-smooth activation functions for solving nonlinearly constrained optimization problems. First, the definition of finite-time stability and finite-time convergence criteria are reviewed. Second, a finite-time recurrent neural network is proposed to solve the nonlinear optimization problem. It is shown that the proposed recurrent neural network is globally finite-time stable under the condition that the Hessian matrix of the associated Lagrangian function is positive definite; its output converges globally to a minimum solution in finite time, which means that the actual minimum solution can be obtained within a finite time period. In addition, the recurrent neural network is applied to a hydrothermal scheduling problem, where, compared with other methods, a lower-consumption scheme is derived within a finite time interval. Finally, numerical simulations on nonlinear optimization problems with inequality constraints demonstrate the effectiveness and superiority of the proposed neural network.
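The abstract does not give the concrete network model, so the following is only a minimal sketch of the general idea: a primal-dual (projection-type) recurrent neural dynamics for an inequality-constrained convex problem, driven through a signed-power activation |z|^α·sign(z) with 0 < α < 1, which is continuous but non-smooth, the property typically used to obtain finite-time rather than merely asymptotic convergence. The dynamics, the test problem, and all function names are illustrative assumptions, not the paper's model.

```python
# Hedged sketch (assumptions, not the paper's model): a projection-type
# recurrent neural network with a signed-power, continuous but non-smooth
# activation, simulated by forward Euler on a small convex test problem.
import numpy as np

def sig_pow(z, alpha=0.5):
    """Component-wise signed power |z|^alpha * sign(z); non-smooth at z = 0."""
    return np.sign(z) * np.abs(z) ** alpha

# Assumed test problem: minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2
# subject to   g(x) = x1 + x2 - 2 <= 0.
grad_f = lambda x: 2.0 * (x - np.array([1.0, 2.0]))
g      = lambda x: np.array([x[0] + x[1] - 2.0])
grad_g = lambda x: np.array([[1.0, 1.0]])          # Jacobian of g, shape (m, n)

def simulate(x0, lam0, dt=1e-3, steps=20000, alpha=0.5):
    """Euler simulation of the assumed neural dynamics
         dx/dt   = -sig_pow( grad_x L(x, lam) )
         dlam/dt =  sig_pow( max(lam + g(x), 0) - lam )
       with Lagrangian L(x, lam) = f(x) + lam^T g(x)."""
    x, lam = x0.astype(float), lam0.astype(float)
    for _ in range(steps):
        grad_x_L = grad_f(x) + grad_g(x).T @ lam
        x   += dt * (-sig_pow(grad_x_L, alpha))
        lam += dt * sig_pow(np.maximum(lam + g(x), 0.0) - lam, alpha)
    return x, lam

x_star, lam_star = simulate(np.zeros(2), np.zeros(1))
print("approximate minimizer:", x_star)   # KKT point of the assumed problem is (0.5, 1.5)
print("multiplier:", lam_star)            # corresponding multiplier is 1
```

For this assumed problem the KKT conditions give the minimizer (0.5, 1.5) with multiplier 1, so the simulated state should settle near that point; the signed-power term only illustrates the role of a non-smooth activation and does not reproduce the specific finite-time stability analysis of the paper.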

Related Topics
Physical Sciences and Engineering; Computer Science; Artificial Intelligence
Authors