Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
563762 | 1451963 | 2014 | 10-page PDF | Free download |

• A reweighted l1-norm penalized least mean square (LMS) algorithm is developed.
• The convergence rate and excess mean square error of the proposed LMS are derived.
• The dependence of the proposed LMS's performance on the channel sparsity level is investigated.
• A comparison with other state-of-the-art sparsity-aware LMS algorithms is given.
A new reweighted l1-norm penalized least mean square (LMS) algorithm for sparse channel estimation is proposed and studied in this paper. Since the standard LMS algorithm does not take into account the sparsity of the channel impulse response (CIR), sparsity-aware modifications of the LMS algorithm aim to outperform the standard LMS by adding to the standard LMS cost function a penalty term that forces the solution to be sparse. Our reweighted l1-norm penalized LMS algorithm additionally introduces a reweighting of the CIR coefficient estimates to promote a sparse solution even more strongly and to approximate the l0-pseudo-norm more closely. We provide an in-depth quantitative analysis of the reweighted l1-norm penalized LMS algorithm. An expression for the excess mean square error (MSE) of the algorithm is also derived, which suggests that, under the right conditions, the reweighted l1-norm penalized LMS algorithm outperforms the standard LMS, as expected. Moreover, our quantitative analysis answers the question of what the maximum sparsity level in the channel is for which the reweighted l1-norm penalized LMS algorithm remains better than the standard LMS. Simulation results showing the better performance of the reweighted l1-norm penalized LMS algorithm compared to other existing LMS-type algorithms are given.
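The abstract does not spell out the tap-update equation, so the following minimal Python sketch only illustrates the general form of a reweighted l1-norm penalized LMS update. It assumes a zero-attraction term of the form sign(w)/(delta + |w|), and the parameter names `mu`, `gamma`, and `delta` are hypothetical; the exact reweighting used in the paper (e.g., based on the previous iteration's estimates) may differ.

```python
import numpy as np

def reweighted_l1_lms(x, d, N, mu=0.01, gamma=5e-4, delta=0.05):
    """Sketch of sparse channel estimation with a reweighted l1-penalized LMS update.

    x     : input (training) signal, 1-D array
    d     : desired/received signal, same length as x
    N     : assumed channel length (number of taps)
    mu    : LMS step size (hypothetical default)
    gamma : weight of the sparsity penalty (hypothetical default)
    delta : small constant in the reweighting denominator (hypothetical default)
    """
    w = np.zeros(N)                      # CIR estimate
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]     # regressor: most recent N input samples
        e = d[n] - w @ u                 # a priori estimation error
        # Standard LMS gradient step plus a reweighted zero-attraction term:
        # the attraction is strong for small taps and weak for large ones,
        # which approximates an l0-pseudo-norm penalty more closely than plain l1.
        w += mu * e * u - gamma * np.sign(w) / (delta + np.abs(w))
    return w
```

As a quick check, one can generate a long training sequence, pass it through a channel with only a few nonzero taps plus noise, and compare the steady-state MSE of this update against plain LMS (obtained by setting `gamma = 0`).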
Journal: Signal Processing - Volume 104, November 2014, Pages 70–79