Article ID | Journal ID | Publication Year | English Paper | Full-Text Version |
---|---|---|---|---|
4969623 | 1449975 | 2017 | 13-page PDF | Free download |
English Title of the ISI Paper
Sparse Relational Topical Coding on multi-modal data
Related Topics
Engineering and Basic Sciences
Computer Engineering
Computer Vision and Pattern Recognition

English Abstract
Multi-modal data modeling has lately been an active research area in the pattern recognition community. Existing studies mainly focus on modeling the content of multi-modal documents, whilst the links amongst documents are commonly ignored. However, link information has been shown to be of key importance in many applications, such as document navigation, classification, and clustering. In this paper, we present a non-probabilistic formulation of the Relational Topic Model (RTM), namely Sparse Relational Multi-Modal Topical Coding (SRMMTC), to model both multi-modal documents and the corresponding link information. SRMMTC has three appealing properties: i) it can effectively produce sparse latent representations by directly imposing sparsity-inducing regularizers; ii) it handles imbalance issues on multi-modal data collections by introducing separate regularization parameters for positive and negative links; iii) it can be solved by an efficient coordinate descent algorithm. We also explore a generalized version of SRMMTC that captures pairwise interactions amongst topics. Our methods can also perform link prediction for documents, as well as prediction of annotation words for the images attached to documents. Empirical studies on a set of benchmark datasets show that our proposed models significantly outperform many state-of-the-art methods.
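The abstract highlights two ingredients of SRMMTC: sparsity-inducing regularizers that yield sparse latent codes, and a coordinate descent solver. As a rough illustration only (not the paper's actual model or objective), the sketch below shows how coordinate descent with an L1 penalty produces a sparse code for a document vector against a fixed topic dictionary; the function names, the dictionary `D`, and the penalty weight `lam` are hypothetical choices for this example.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: the closed-form coordinate update
    under an L1 (sparsity-inducing) penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code(x, D, lam=0.1, n_iters=50):
    """Coordinate descent for min_s 0.5*||x - D s||^2 + lam*||s||_1.

    x : (V,) observation, e.g. a document's word-count vector
    D : (V, K) fixed dictionary of K topic bases
    Returns a sparse K-dimensional code s.
    """
    V, K = D.shape
    s = np.zeros(K)
    col_norms = (D ** 2).sum(axis=0)  # precompute ||d_k||^2
    r = x - D @ s                     # current residual
    for _ in range(n_iters):
        for k in range(K):
            if col_norms[k] == 0:
                continue
            r += D[:, k] * s[k]       # remove coordinate k's contribution
            z = D[:, k] @ r           # correlation with partial residual
            s[k] = soft_threshold(z, lam) / col_norms[k]
            r -= D[:, k] * s[k]       # restore with updated value
    return s
```

Because each coordinate update has a closed form, the solver is cheap per sweep; larger `lam` drives more coefficients exactly to zero, which is the mechanism behind "directly imposing sparsity-inducing regularizers" mentioned in the abstract.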
Publisher
Database: Elsevier - ScienceDirect
Journal: Pattern Recognition - Volume 72, December 2017, Pages 368-380
Authors
Lingyun Song, Jun Liu, Minnan Luo, Buyue Qian, Kuan Yang,