Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
4947995 | 1439605 | 2017 | 28-page PDF | Free download |
English title of the ISI paper
Kernel-based dimensionality reduction using Renyi's α-entropy measures of similarity
Related topics
Engineering and basic sciences
Computer engineering
Artificial intelligence
English abstract
Dimensionality reduction (DR) aims to reveal salient properties of high-dimensional (HD) data in a low-dimensional (LD) representation space. Two elements determine the success of a DR approach: the definition of a notion of pairwise relations in the HD and LD spaces, and the measurement of the mismatch between these relationships in the HD and LD representations of the data. This paper introduces a new DR method, termed Kernel-based entropy dimensionality reduction (KEDR), which measures embedding quality based on stochastic neighborhood preservation and involves a Gram-matrix estimation of Renyi's α-entropy. The proposed approach is a data-driven framework for information-theoretic learning, based on infinitely divisible matrices. Rather than relying on regular Renyi entropies alone, KEDR also computes the embedding mismatch through a parameterized mixture of divergences, resulting in improved preservation of both the local and global data structure. Our approach is validated on both synthetic and real-world datasets and compared against several state-of-the-art algorithms, including Stochastic Neighbor Embedding-like techniques, of which our approach is a data-driven extension (from the perspective of kernel-based Gram matrices). In terms of visual inspection and quantitative evaluation of neighborhood preservation, the obtained results show that KEDR is a competitive and promising DR method.
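The abstract refers to a Gram-matrix (kernel) estimate of Renyi's α-entropy computed over pairwise similarities. The sketch below illustrates one common eigenvalue-based form of such an estimator on a unit-trace kernel matrix; it is a minimal illustration only, not the authors' KEDR implementation, and the function names, RBF kernel choice, bandwidth, and α value are assumptions made for the example.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """RBF (Gaussian) Gram matrix of pairwise similarities (illustrative kernel choice)."""
    sq_norms = np.sum(X**2, axis=1)
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def renyi_alpha_entropy(K, alpha=2.0):
    """Matrix-based Renyi α-entropy of a PSD Gram matrix K:
    normalize K to unit trace and evaluate log2(sum_i λ_i**α) / (1 - α)."""
    A = K / np.trace(K)                    # unit-trace normalization
    eigvals = np.linalg.eigvalsh(A)        # eigenvalues of the symmetric matrix A
    eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negative values
    return np.log2(np.sum(eigvals**alpha)) / (1.0 - alpha)

# Example: entropy of the RBF similarities of a small random point cloud
X = np.random.default_rng(0).normal(size=(50, 3))
print(renyi_alpha_entropy(rbf_gram(X, sigma=1.0), alpha=2.0))
```

A DR objective in this spirit would compare such entropy (or derived divergence) values between the HD and LD similarity matrices; the sketch only shows the entropy estimate itself.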
Publisher
Database: Elsevier - ScienceDirect
Journal: Neurocomputing - Volume 222, 26 January 2017, Pages 36-46
Authors
A.M. Alvarez-Meza, J.A. Lee, Michel Verleysen, G. Castellanos-Dominguez