Article code: 4944609
Journal code: 1438006
Year of publication: 2017
English article: 35 pages, PDF
Full-text version: Free download
English title of the ISI article
Consensus algorithms for biased labeling in crowdsourcing
Related subjects
Engineering and Basic Sciences, Computer Engineering, Artificial Intelligence
English abstract
Although it has become a widely accepted view that non-expert annotators often exhibit biases when labeling objects through crowdsourcing systems, this claim lacks sufficient observational evidence and systematic empirical study. This paper first analyzes eight real-world datasets from different domains whose class labels were collected from crowdsourcing systems. Our analyses show that biased labeling is a systematic tendency in binary categorization; in other words, for a large number of annotators, labeling quality on the negative class (assumed to be the majority) is significantly higher than that on the positive class (the minority). The paper then empirically studies the performance of four existing EM-based consensus algorithms, DS, GLAD, RY, and ZenCrowd, on these datasets. Our investigation shows that all of these state-of-the-art algorithms ignore the potential bias characteristics of the datasets and perform poorly even though they model the complexity of the systems. To address the issue of handling biased labeling, the paper further proposes a novel consensus algorithm, adaptive weighted majority voting (AWMV), based on the statistical difference between the labeling qualities of the two classes. AWMV uses the frequency of positive labels in the multiple noisy label set of each example to obtain a bias rate and then assigns weights derived from this bias rate to negative and positive labels. Comparison results among the five consensus algorithms (AWMV and the four existing ones) show that the proposed AWMV algorithm has the best overall performance. Finally, the paper notes some related topics for future study.
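To make the idea behind AWMV more concrete, below is a minimal Python sketch of bias-aware weighted majority voting as described in the abstract: a bias rate is estimated from the frequency of positive labels in the noisy label sets, and positive/negative labels are then weighted accordingly. The function name awmv_sketch and the specific weighting scheme (w_pos = 1 - bias_rate, w_neg = bias_rate) are illustrative assumptions, not the exact formulas from the paper.

```python
from typing import List


def awmv_sketch(noisy_labels: List[List[int]]) -> List[int]:
    """Illustrative sketch of adaptive weighted majority voting (AWMV).

    noisy_labels[i] is the multiple noisy label set of example i,
    with labels encoded as 1 (positive) and 0 (negative).

    NOTE: the weighting below is a hypothetical placeholder; the paper
    derives its weights from a bias rate, but the exact expressions are
    not given in the abstract.
    """
    # Estimate a global bias rate from the frequency of positive labels
    # across all noisy label sets (the positive class is the minority).
    total = sum(len(labels) for labels in noisy_labels)
    positives = sum(sum(labels) for labels in noisy_labels)
    bias_rate = positives / total if total else 0.5

    # Assumed weighting: up-weight the rarer positive labels and
    # down-weight the over-represented negative labels.
    w_pos = 1.0 - bias_rate
    w_neg = bias_rate

    integrated = []
    for labels in noisy_labels:
        pos_score = w_pos * sum(1 for l in labels if l == 1)
        neg_score = w_neg * sum(1 for l in labels if l == 0)
        integrated.append(1 if pos_score > neg_score else 0)
    return integrated


if __name__ == "__main__":
    # Three examples, each with five noisy labels from different annotators.
    print(awmv_sketch([[1, 0, 0, 1, 0], [0, 0, 0, 0, 1], [1, 1, 0, 1, 0]]))
```

In this sketch the bias rate is estimated globally for simplicity; the key point it illustrates is that, under biased labeling, plain majority voting would favor the over-reported negative class, whereas weighting by the bias rate lets a minority of positive labels outvote it.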
Publisher
Database: Elsevier - ScienceDirect
Journal: Information Sciences - Volumes 382–383, March 2017, Pages 254-273
Authors