• Agree–disagree questions about social media use are prone to acquiescence bias.
• Acquiescence inflates reliabilities and factor loadings, and alters correlations.
• Both better design and statistical tools can correct for these biases.
• Each corrective approach has distinct benefits and limits.
• Research would be improved by avoiding agree–disagree questions.
Social media measurement relies heavily on self-report survey research. Hence, known biases in how individuals answer survey questions can introduce systematic errors into the social media literature. In particular, many common social media measures are prone to acquiescence response bias, an error that occurs due to individuals' tendency to agree with agree–disagree questions. The current study tests a series of techniques to both detect and overcome acquiescence bias in the context of Facebook measurement. Controlling for individuals' tendency to agree with agree–disagree questions, we find evidence that acquiescence has inflated the reliabilities and factor loadings of many Facebook use scales, and has altered correlations both among Facebook use measures and between those measures and related covariates. Further, when the individual-level tendency to agree with questions is controlled, Facebook measures demonstrate greater criterion validity in their relations to items that do not use agree–disagree scales. Having identified the presence of acquiescent responding, we test three methods for mitigating this response bias: the use of balanced scales, item-specific questions, and statistical correctives. All three methods appear to reduce the bias introduced by acquiescence. Thus, the results provide comparative evidence on strategies to alleviate the consistent impact of an important method bias in social media measurement and thereby contribute to improving the validity of social media research at large.
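The statistical corrective described above can be illustrated with a toy simulation. The sketch below is not the paper's actual procedure; it assumes a simple setup in which two unrelated constructs are measured with agree–disagree scales that share a per-person acquiescence tendency, and it partials that tendency out of both scale scores before correlating them. In practice the tendency would be estimated (e.g., from a heterogeneous set of agree–disagree items) rather than known directly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
acq = rng.normal(0, 0.6, n)   # per-person acquiescence tendency (simulated)
use = rng.normal(0, 1, n)     # true "Facebook use" construct
out = rng.normal(0, 1, n)     # unrelated covariate (true correlation = 0)

# Two 4-item agree-disagree scales, both contaminated by the same
# acquiescence tendency plus item-level noise.
scale_use = (use[:, None] + acq[:, None] + rng.normal(0, 1, (n, 4))).mean(axis=1)
scale_out = (out[:, None] + acq[:, None] + rng.normal(0, 1, (n, 4))).mean(axis=1)

def partial_out(y, x):
    """Residualize y on x with a simple least-squares fit."""
    b = np.polyfit(x, y, 1)
    return y - np.polyval(b, x)

# Raw correlation is inflated by the shared acquiescence variance;
# partialing out the tendency removes most of that spurious association.
raw_r = np.corrcoef(scale_use, scale_out)[0, 1]
corrected_r = np.corrcoef(partial_out(scale_use, acq),
                          partial_out(scale_out, acq))[0, 1]
print(f"raw r = {raw_r:.3f}, corrected r = {corrected_r:.3f}")
```

Here the raw correlation is positive even though the underlying constructs are independent, while the corrected correlation falls back toward zero, mirroring the paper's finding that acquiescence alters correlations among agree–disagree measures.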
Journal: Computers in Human Behavior - Volume 57, April 2016, Pages 82–92