Article ID: 558927
Journal: Digital Signal Processing
Published Year: 2009
Pages: 12 Pages
File Type: PDF
Abstract

A novel class of stochastic gradient descent algorithms is introduced, based on the minimisation of convex cost functions with an exponential dependence on the adaptation error, rather than the conventional linear combinations of even error moments. The derivation is supported by a rigorous analysis of the necessary conditions for convergence; the steady-state mean square error is calculated, and the optimal solutions in the least exponential sense are derived. Normalisation of the associated step size is also considered, in order to fully exploit the dynamics of the input signal. Simulation results support the analysis.
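The abstract does not give the exact cost function, but a minimal sketch of the idea, assuming a hypothetical exponential cost J(e) = exp(λe²) on the adaptation error e and a step size normalised by the regressor energy, might look as follows (the function name, the cost, and all parameter values are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def exp_cost_nlms(x, d, order=4, mu=0.1, lam=0.5, eps=1e-8):
    """Illustrative adaptive FIR filter trained by stochastic gradient
    descent on the hypothetical exponential cost J(e) = exp(lam * e^2),
    with the step size normalised by the input (regressor) energy.

    x : input signal, d : desired signal.
    Returns the final weights and the a priori error history.
    """
    w = np.zeros(order)
    e_hist = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-order+1]]
        e = d[n] - w @ u                   # adaptation (a priori) error
        # Gradient of exp(lam*e^2) w.r.t. w is -2*lam*e*exp(lam*e^2)*u;
        # constants are absorbed into mu, and the update is normalised
        # by the instantaneous regressor energy (NLMS-style).
        w += mu * lam * e * np.exp(lam * e**2) * u / (eps + u @ u)
        e_hist[n] = e
    return w, e_hist
```

A quick system-identification run (random input filtered by a short FIR system) shows the error decaying as the weights converge; the exponential factor exp(λe²) makes the effective step size grow with the instantaneous error, which is the intuitive effect of such costs compared with plain even-moment criteria.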

Related Topics
Physical Sciences and Engineering › Computer Science › Signal Processing