Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
566034 | 875912 | 2008 | 18-page PDF | Free download |

In this paper, we study the impact of considering context information for the annotation of emotions. Concretely, we propose the inclusion of the history of user–system interaction and the neutral speaking style of users. A new method to automatically include both sources of information has been developed making use of novel techniques for acoustic normalization and dialogue context annotation. We have carried out experiments with a corpus extracted from real human interactions with a spoken dialogue system. Results show that the performance of non-expert human annotators and machine-learned classifications are both affected by contextual information. The proposed method allows the annotation of more non-neutral emotions and yields values closer to maximum agreement rates for non-expert human annotation. Moreover, automatic classification accuracy improves by 29.57% compared to the classical approach based only on acoustic features.
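The abstract does not give the normalization formula, but a common way to realize acoustic normalization against a user's neutral speaking style is speaker-dependent z-normalization: each utterance's acoustic features are rescaled by the mean and standard deviation of that speaker's neutral speech, so a classifier sees deviations from the speaker's own baseline rather than absolute values. A minimal sketch under that assumption (the function name and the example pitch/energy features are hypothetical):

```python
import numpy as np

def normalize_against_neutral(features, neutral_features):
    """Hypothetical speaker-dependent z-normalization: rescale an
    utterance's acoustic features by the mean and standard deviation
    of the same speaker's neutral speech, so downstream classifiers
    see deviation from that neutral baseline."""
    neutral = np.asarray(neutral_features, dtype=float)
    mu = neutral.mean(axis=0)
    sigma = neutral.std(axis=0) + 1e-8  # avoid division by zero
    return (np.asarray(features, dtype=float) - mu) / sigma

# Illustrative per-frame features (e.g. pitch in Hz, energy) for one speaker
neutral = np.array([[120.0, 0.50], [125.0, 0.55], [118.0, 0.48]])
utterance = np.array([[160.0, 0.90], [155.0, 0.85]])

normalized = normalize_against_neutral(utterance, neutral)
```

Here every value in `normalized` is positive, reflecting that the utterance sits well above the speaker's neutral pitch and energy; for another speaker whose neutral pitch is already 160 Hz, the same raw features would normalize near zero.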
Journal: Speech Communication - Volume 50, Issue 5, May 2008, Pages 416–433