Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
405794 | 678031 | 2016 | 9-page PDF | Free download |

Hyperspectral unmixing is a hot topic in signal and image processing. Non-negative matrix factorization (NMF) decomposes a set of high-dimensional data matrices into two sets of non-negative low-dimensional matrices. However, because the objective function is non-convex, the algorithm has many local solutions. Some algorithms address this problem by adding auxiliary constraints, such as sparsity. Sparse NMF performs well, but its results are unstable and sensitive to noise. Using structural information in the unmixing approach can make the decomposition stable. Previous work used clustering based on Euclidean distance to guide the decomposition and obtained good performance. However, the Euclidean distance only measures the straight-line distance between two points, whereas ground objects usually obey certain statistical distributions, and it is difficult to measure the difference between statistical distributions comprehensively with the Euclidean distance. Kullback–Leibler divergence (KL divergence) is a better metric. In this paper, we propose a new approach, named KL divergence constrained NMF, which measures the statistical distribution difference using KL divergence instead of the Euclidean distance. Using the KL divergence in the algorithm improves the accuracy of the structural information. Experimental results on synthetic and real hyperspectral data show the superiority of the proposed algorithm over other state-of-the-art algorithms.
Journal: Neurocomputing - Volume 204, 5 September 2016, Pages 153–161
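The abstract contrasts Euclidean distance with KL divergence as a way to compare pixel spectra. The following minimal Python sketch illustrates that contrast only; it is not the authors' algorithm, and the toy spectra and function names are hypothetical. Spectra are normalized to sum to one so they can be treated as discrete distributions, which is how KL divergence captures differences in statistical shape rather than raw magnitude.

```python
# Illustrative sketch (assumed example, not the paper's method): compare the
# Euclidean distance with the KL divergence between toy pixel spectra.
import numpy as np

def euclidean_distance(x, y):
    """Straight-line distance between two spectra."""
    return float(np.linalg.norm(x - y))

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two non-negative spectra,
    each normalized to sum to one so it is treated as a distribution."""
    p = np.clip(p / p.sum(), eps, None)
    q = np.clip(q / q.sum(), eps, None)
    return float(np.sum(p * np.log(p / q)))

if __name__ == "__main__":
    a = np.array([0.2, 0.4, 0.3, 0.1])   # toy spectrum
    b = np.array([0.4, 0.8, 0.6, 0.2])   # same shape as a, scaled by 2
    c = np.array([0.1, 0.1, 0.1, 0.7])   # different statistical shape

    print("Euclidean a-b:", euclidean_distance(a, b))  # nonzero despite identical shape
    print("Euclidean a-c:", euclidean_distance(a, c))
    print("KL a||b:", kl_divergence(a, b))             # ~0: same distribution
    print("KL a||c:", kl_divergence(a, c))             # large: different distribution
```

Here the Euclidean distance between `a` and `b` is nonzero even though they describe the same distribution up to scale, whereas the KL divergence is essentially zero; this is the kind of distributional sensitivity the abstract attributes to the KL-divergence-based constraint.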