Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
406990 | Neurocomputing | 2013 | 9 |
In this letter, by constructing novel Lyapunov–Krasovskii functional (LKF) terms and employing effective analytical techniques, two sufficient conditions are established that guarantee the exponential stability of a class of discrete-time delayed neural networks with distributed delay, in which linear fractional uncertainties are involved and the available information on the time delay is fully utilized. By employing the reciprocal convex technique, some previously ignored terms are retained when estimating the forward difference of the LKF, and the resulting criteria are expressed as linear matrix inequalities (LMIs) whose feasibility depends directly on the data of the addressed systems. Finally, three numerical examples and comparisons with existing results show that the obtained conditions can be less conservative than some reported in the literature.
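As a rough illustration (not taken from the paper), the class of systems addressed — discrete-time neural networks with both a discrete delay and a distributed delay — can be simulated directly. All matrices, the activation function, the delay values, and the distributed-delay kernel below are hypothetical placeholders chosen so the trajectory contracts; the paper's LMI criteria certify exponential stability analytically rather than by simulation:

```python
import numpy as np

# Hypothetical discrete-time delayed neural network (all data illustrative):
#   x(k+1) = A x(k) + W0 f(x(k)) + W1 f(x(k-d)) + W2 * sum_{i=1}^{dbar} mu_i f(x(k-i))
# with sector-bounded activation f = tanh, discrete delay d,
# and a summable distributed-delay kernel mu.
A  = np.diag([0.4, 0.3])                       # Schur-stable state matrix
W0 = np.array([[0.10, -0.05], [0.02, 0.10]])   # instantaneous connection weights
W1 = np.array([[0.05,  0.00], [0.00, 0.05]])   # discrete-delay weights
W2 = np.array([[0.02,  0.00], [0.00, 0.02]])   # distributed-delay weights
d, dbar = 3, 5                                  # discrete delay and distributed bound
mu = np.array([0.5 ** i for i in range(1, dbar + 1)])  # summable kernel
f = np.tanh                                     # activation function

# Simulate from a nonzero constant initial history and record the state norm.
hist = [np.array([1.0, -1.0])] * (max(d, dbar) + 1)
norms = []
for k in range(60):
    x = hist[-1]
    dist = sum(mu[i - 1] * f(hist[-1 - i]) for i in range(1, dbar + 1))
    x_next = A @ x + W0 @ f(x) + W1 @ f(hist[-1 - d]) + W2 @ dist
    hist.append(x_next)
    norms.append(np.linalg.norm(x_next))

# Monotone-looking exponential decay of the trajectory is consistent with
# (though much weaker than) an LMI-certified exponential stability result.
```

Observing decay along one trajectory is of course no proof; the point of LMI-based criteria like those in the letter is that feasibility of a finite set of matrix inequalities certifies exponential stability for the whole uncertain system class.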