Article ID: 566319
Journal: Signal Processing
Published Year: 2015
Pages: 11 Pages
File Type: PDF
Abstract

• Development of the q-gradient.
• Development of the q-LMS algorithm.
• Derivation of closed-form expressions for the mean-square error of the proposed algorithm.
• Extensive simulations are carried out to corroborate the theoretical findings.

The Least Mean Square (LMS) algorithm suffers from slow convergence due to its dependence on the eigenvalue spread of the input correlation matrix. In this work, we address this problem by developing a novel variant of the LMS algorithm based on the q-derivative concept. The q-gradient extends the classical gradient vector using Jackson's derivative. Here, we propose to minimize the LMS cost function by employing the q-derivative instead of the conventional derivative. Because the q-derivative takes larger steps in the search direction, evaluating the secant of the cost function rather than the tangent (as a conventional derivative does), we show that it yields faster convergence for q > 1 than the conventional derivative. We then present a thorough investigation of the convergence behavior of the proposed q-LMS algorithm and carry out different analyses to assess its performance. Consequently, new explicit closed-form expressions for the mean-square-error (MSE) behavior are derived. Simulation results are presented to corroborate our theoretical findings.
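The idea in the abstract can be illustrated with a short sketch. This is not the paper's exact derivation: the `q_derivative` function is Jackson's q-derivative as stated, while the `q_lms` filter uses a hypothetical simplification in which a single scalar q scales the conventional LMS update by (q+1)/2 (the factor the q-derivative produces for a quadratic cost, since the q-derivative of x² is (q+1)x rather than 2x), so q > 1 yields larger steps. Function names, step size, and filter length are illustrative choices.

```python
import numpy as np

def q_derivative(f, x, q):
    """Jackson's q-derivative: the secant slope of f between x and q*x,
    D_q f(x) = (f(q x) - f(x)) / ((q - 1) x)."""
    return (f(q * x) - f(x)) / ((q - 1) * x)

# For f(x) = x^2 the q-derivative is (q + 1) * x, which exceeds the
# conventional derivative 2 * x whenever q > 1 -- the "larger steps"
# the abstract refers to.

def q_lms(x, d, q=1.5, mu=0.01, taps=4):
    """Sketch of a q-LMS adaptive filter (simplified, scalar q).

    Replacing the conventional gradient of the squared error with the
    q-gradient scales the update direction by (q + 1) / 2 under the
    quadratic-cost simplification above.
    """
    w = np.zeros(taps)                   # adaptive filter weights
    err = np.zeros(len(x))               # a priori error history
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]  # most recent input samples
        err[n] = d[n] - w @ u            # a priori estimation error
        w += mu * ((q + 1) / 2) * err[n] * u  # q-gradient step
    return w, err
```

For q = 1 the update reduces to the conventional LMS recursion, which matches the claim that the q-derivative generalizes the ordinary derivative; the simulations in the paper compare the two regimes directly.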

Related Topics
Physical Sciences and Engineering Computer Science Signal Processing