| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 6882973 | Computer Networks | 2015 | 11 | |
Abstract
Experiments for the subjective evaluation of multimedia presentations and content are traditionally conducted in a laboratory environment, and common procedures for the evaluation of teleconference systems are no different. The strictly controlled laboratory environment, however, often gives a rather poor representation of the actual use case. In this study, we therefore crowdsourced the evaluation of a teleconference system in order to perform the evaluation in a real-life environment. Moreover, we used the unique possibilities of crowdsourcing to employ two different demographics by hiring workers from Germany on the one hand and from the US and Great Britain on the other. The goal of this experiment was to assess the perceived Quality of Experience (QoE) during a listening test and to compare the results with those from a similar listening test conducted in a controlled laboratory environment. In doing so, we observed intriguing differences not only in the collected QoE ratings between the laboratory and crowdsourcing experiments, but also between the two worker demographics in terms of reliability, availability, and efficiency.
Related Topics
Physical Sciences and Engineering
Computer Science
Computer Networks and Communications
Authors
Thomas Volk, Christian Keimel, Michael Moosmeier, Klaus Diepold