Article ID: 910749
Journal: Journal of Communication Disorders
Published Year: 2015
Pages: 14
File Type: PDF
Abstract

• Obtaining expert ratings of speech data can be a slow and challenging process.
• Ratings are easily obtained via crowdsourcing, e.g. Amazon Mechanical Turk (AMT).
• AMT listener samples were compared to a “gold standard” and an “industry standard.”
• Samples of ≥ 9 AMT listeners converged with “industry standard” rating behavior.
• Speech researchers could benefit from broader adoption of crowdsourcing methods.

Blinded listener ratings are essential for valid assessment of interventions for speech disorders, but collecting these ratings can be time-intensive and costly. This study evaluated the validity of speech ratings obtained through online crowdsourcing, a potentially more efficient approach. One hundred words from children with /r/ misarticulation were electronically presented for binary rating by 35 phonetically trained listeners and 205 naïve listeners recruited through the Amazon Mechanical Turk (AMT) crowdsourcing platform. Bootstrapping was used to compare different-sized samples of AMT listeners against a “gold standard” (the mode across all trained listeners) and an “industry standard” (the mode across bootstrapped samples of three trained listeners). There was strong overall agreement between trained and AMT listeners, and bootstrapped samples of n = 9 AMT listeners matched the “industry standard” level of performance. These results support the hypothesis that valid ratings of speech data can be obtained efficiently through AMT. Researchers in communication disorders could benefit from increased awareness of this method.

Learning outcomes: Readers will be able to (a) discuss advantages and disadvantages of data collection through the crowdsourcing platform Amazon Mechanical Turk (AMT), and (b) describe the results of a validity study comparing samples of AMT listeners with phonetically trained listeners in a speech-rating task.
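For readers unfamiliar with the bootstrapping approach the abstract describes, the following Python sketch illustrates the general logic under stated assumptions: binary ratings are aggregated by majority vote, a gold standard is taken as the mode across all trained listeners, and bootstrapped listener samples of increasing size are compared against an industry-standard benchmark. The array names, tie-breaking rule, and random placeholder data are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Placeholder binary ratings (1 = perceptually correct /r/, 0 = misarticulated)
# for 100 words; real data would come from the listening experiment.
N_WORDS = 100
trained = rng.integers(0, 2, size=(35, N_WORDS))   # 35 phonetically trained listeners
amt = rng.integers(0, 2, size=(205, N_WORDS))      # 205 AMT listeners

def modal_rating(sample):
    """Majority vote across listeners for each word (ties broken toward 1)."""
    return (sample.mean(axis=0) >= 0.5).astype(int)

# "Gold standard": mode across all trained listeners.
gold = modal_rating(trained)

def bootstrap_agreement(pool, n_listeners, reference, n_boot=1000):
    """Mean proportion of words on which bootstrapped samples of
    n_listeners listeners drawn from pool agree with the reference ratings."""
    totals = []
    for _ in range(n_boot):
        idx = rng.choice(pool.shape[0], size=n_listeners, replace=True)
        totals.append((modal_rating(pool[idx]) == reference).mean())
    return float(np.mean(totals))

# "Industry standard": agreement achieved by bootstrapped samples
# of three trained listeners.
industry = bootstrap_agreement(trained, 3, gold)

# Smallest AMT sample size whose agreement with the gold standard
# reaches the industry-standard benchmark (reported as n = 9 in the study).
for n in range(1, 21):
    if bootstrap_agreement(amt, n, gold) >= industry:
        print(f"Samples of {n} AMT listeners match the industry standard")
        break
```

With the actual rating data in place of the random placeholders, the final loop identifies the minimum AMT sample size that converges with industry-standard rating behavior, which the study reports as n = 9.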

Related Topics
Life Sciences › Neuroscience › Cognitive Neuroscience
Authors