Article code: 472743
Journal code: 698744
Publication year: 2007
English article: 13-page PDF
Full text: free download
English title of the ISI article
A recurrent neural network computing the largest imaginary or real part of eigenvalues of real matrices
Related subjects
Engineering and Basic Sciences · Computer Engineering · Computer Science (General)
English abstract

Because the efficient calculation of eigenpairs of a matrix, especially of a general real matrix, is important in engineering, and because neural networks run asynchronously and can compute with high performance, this paper introduces a recurrent neural network (RNN) to extract an eigenpair. The RNN, whose connection weights depend on the matrix, can be transformed into a complex differential system whose variable z(t) is a complex vector. Using the analytic expression of |z(t)|², the convergence properties of the RNN are analyzed in detail. With a generic nonzero complex initial vector, the RNN yields the largest imaginary part among all eigenvalues; by a rearrangement of the connection matrix, the largest real part is obtained. An experiment on a 7×7 matrix demonstrates the validity of the method, and two matrices of dimension 50 and 100 are used to test its efficiency as the dimension grows. The results indicate that the number of iterations at which the network reaches its equilibrium state is not sensitive to the dimensionality. The RNN can also be used to estimate the largest modulus of the eigenvalues. Compared with other neural networks designed for similar purposes, this RNN is applicable to general real matrices.
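The abstract does not reproduce the network's dynamics, so the following Python sketch only illustrates the underlying principle it describes: reading eigen-information off the asymptotic growth of |z(t)|² under a matrix-dependent flow started from a generic nonzero complex vector, with the imaginary-part case handled through a block rearrangement of the connection matrix. The flow dz/dt = Az, the Euler step size, the block construction [[0, −A], [A, 0]], and the function names are illustrative assumptions, not the paper's RNN.

import numpy as np


def largest_real_part(A, dt=1e-3, steps=50_000, seed=0):
    """Estimate max Re(lambda) of a real matrix A from the asymptotic growth
    rate of |z(t)| under the linear flow dz/dt = A z (a power-method-style
    sketch of the principle, not the paper's specific network)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # Generic nonzero complex initial vector, as in the abstract.
    z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    z /= np.linalg.norm(z)
    growth = 0.0
    for _ in range(steps):
        z = z + dt * (A @ z)          # explicit Euler step of dz/dt = A z
        norm = np.linalg.norm(z)
        growth = np.log(norm) / dt    # instantaneous growth rate of log|z(t)|
        z /= norm                     # renormalise to avoid over/underflow
    return growth


def largest_imag_part(A, **kw):
    """Largest imaginary part of the eigenvalues of A, via the real block
    rearrangement B = [[0, -A], [A, 0]], whose spectrum is {+i*lambda, -i*lambda};
    the largest real part of B's eigenvalues then equals max Im(lambda) of A."""
    n = A.shape[0]
    Z = np.zeros((n, n))
    B = np.block([[Z, -A], [A, Z]])
    return largest_real_part(B, **kw)


if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((7, 7))
    print(largest_real_part(A), np.linalg.eigvals(A).real.max())
    print(largest_imag_part(A), np.linalg.eigvals(A).imag.max())

The growth-rate estimate converges because, after repeated renormalisation, z(t) aligns with the eigendirection whose eigenvalue has the largest real part, so log|z(t)| grows at that rate; the block rearrangement works because the eigenvalues of [[0, −A], [A, 0]] are ±iλ for each eigenvalue λ of A.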

Publisher
Database: Elsevier - ScienceDirect
Journal: Computers & Mathematics with Applications - Volume 53, Issue 1, January 2007, Pages 41–53