Article code: 10998011
Journal code: 1365117
Publication year: 2018
English article: 14-page PDF, free download
English title of the ISI article
Exploring coherent topics by topic modeling with term weighting
Persian translation of the title
بررسی موضوعات منسجم با مدل‌سازی موضوع به‌همراه وزن‌دهی واژه
Keywords
Term-weighting topic modeling, informative word, conditional entropy
Related subjects
Engineering and Basic Sciences > Computer Engineering > Computer Science Software
English abstract
Topic models often produce unexplainable topics filled with noisy words, because words in topic modeling carry equal weights: high-frequency words dominate the top topic-word lists, yet most of them are meaningless, e.g., domain-specific stopwords. To address this issue, in this paper we investigate how to weight words, and develop a straightforward but effective term weighting scheme, namely entropy weighting (EW). The proposed EW scheme is based on conditional entropy measured from word co-occurrences. Compared with existing term weighting schemes, the highlight of EW is that it automatically rewards informative words. For more robust word weights, we further suggest a combined form of EW (CEW) with two existing weighting schemes. Basically, our CEW assigns meaningless words lower weights and informative words higher weights, leading to more coherent topics during topic modeling inference. We apply CEW to the Dirichlet multinomial mixture and latent Dirichlet allocation models, and evaluate it through topic quality, document clustering, and classification tasks on 8 real-world data sets. Experimental results show that weighting words effectively improves topic modeling performance over both short texts and normal long texts. More importantly, the proposed CEW significantly outperforms existing term weighting schemes, since it further considers which words are informative.
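The co-occurrence-based weighting idea in the abstract can be sketched roughly as follows. The abstract does not give the exact EW formula, so everything here is an illustrative assumption: the function name `entropy_weights`, the use of document-level co-occurrence counts, and the mapping `exp(-H)` from conditional co-occurrence entropy to a weight. The intuition it illustrates matches the abstract's stated goal: words with diffuse co-occurrence contexts (e.g., domain-specific stopwords) get high entropy and thus low weight, while words with focused contexts get higher weight.

```python
import math
from collections import Counter, defaultdict

def entropy_weights(docs):
    """Illustrative sketch (not the paper's exact EW formula):
    score each word by the entropy of the distribution over words
    it co-occurs with, then map low entropy -> high weight."""
    co = defaultdict(Counter)  # co[w][v] = number of docs where w and v co-occur
    for doc in docs:
        words = set(doc.split())
        for w in words:
            for v in words:
                if v != w:
                    co[w][v] += 1
    weights = {}
    for w, ctr in co.items():
        total = sum(ctr.values())
        # Shannon entropy of w's co-occurrence distribution
        h = -sum((c / total) * math.log2(c / total) for c in ctr.values())
        # Assumed mapping: focused contexts (low H) -> weight near 1,
        # diffuse contexts (high H) -> weight near 0.
        weights[w] = math.exp(-h)
    return weights
```

In a real pipeline these weights would then scale each word's contribution during topic-model inference (e.g., in the Gibbs sampling counts of DMM or LDA), so that informative words drive the topic-word distributions more than noisy ones.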
Publisher
Database: Elsevier - ScienceDirect
Journal: Information Processing & Management - Volume 54, Issue 6, November 2018, Pages 1345-1358
Authors