|Article code||Journal code||Year||English paper||Persian translation||Full text|
|4977442||1451924||2018||10-page PDF||Order||Download|
- We clustered the samples and then learned a pair of under-determined dictionaries for each cluster; this pair represents the samples in its cluster more accurately.
- Our dictionaries contain fewer atoms than over-complete dictionaries, which significantly reduces the computational complexity of the reconstruction process.
- We updated the dictionaries with the QR decomposition. For the coefficient matrices, we used an ℓ1 penalty to approximate sparsity and optimized with soft-thresholding.
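The coefficient update mentioned in the highlights rests on a standard fact: the proximal operator of the ℓ1 penalty is element-wise soft-thresholding. A minimal sketch (not the paper's code; the threshold `tau` is a hypothetical penalty weight):

```python
import numpy as np

def soft_threshold(x, tau):
    """Prox of tau * ||x||_1: shrink each entry toward zero by tau."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Entries smaller than tau are zeroed (giving sparsity); larger ones shrink by tau.
X = np.array([[0.3, -2.0], [1.5, -0.1]])
print(soft_threshold(X, 0.5))
```

Applied to a coefficient matrix, this zeroes small entries outright, which is what produces the sparse representations the algorithm needs.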
With the extensive application of sparse representation, dictionary learning methods have received widespread attention. In this paper, we propose a multi-separable dictionary learning (MSeDiL) algorithm for sparse representation, based on the Lagrange multiplier and the QR decomposition. Unlike traditional dictionary learning methods, the training samples are first clustered; the separable dictionaries for each cluster are then optimized by the QR decomposition. Because the dictionaries for each cluster are under-determined, the efficiency of the reconstruction process is improved. Experimental results show that, at similar PSNR (Peak Signal to Noise Ratio) and SSIM (Structural Similarity Index), our algorithm reconstructs much faster than other dictionary learning methods, especially when the sample size is large.
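As a concrete illustration of the QR-based step the abstract describes, the following sketch (toy sizes and random data, assumed for illustration; not the published MSeDiL implementation) shows how a pair of separable dictionaries (A, B) gives a least-squares coefficient update for a 2-D sample Y using two thin QR factorizations instead of explicit normal equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions, not from the paper): 8x8 patches, 6 atoms per side,
# so each dictionary is 8x6 -- "under-determined" means fewer atoms than pixels.
A = rng.standard_normal((8, 6))   # left separable dictionary
B = rng.standard_normal((8, 6))   # right separable dictionary
Y = rng.standard_normal((8, 8))   # one 2-D training sample from some cluster

# Least-squares coefficients for min_X ||Y - A X B^T||_F^2 via thin QR:
# with A = Qa Ra and B = Qb Rb, the minimizer is X = Ra^{-1} Qa^T Y Qb Rb^{-T},
# computed here with triangular solves only.
Qa, Ra = np.linalg.qr(A)
Qb, Rb = np.linalg.qr(B)
M = np.linalg.solve(Ra, Qa.T @ Y @ Qb)   # Ra^{-1} (Qa^T Y Qb)
X = np.linalg.solve(Rb, M.T).T           # right-multiply by Rb^{-T}
```

In the under-determined case the triangular systems are small (6×6 here), which is where the reconstruction speed-up over over-complete dictionaries comes from.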
Journal: Signal Processing - Volume 143, February 2018, Pages 354-363