Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
455742 | Computers & Electrical Engineering | 2013 | 8 | |
- Affect sensing in the user's own social settings is vital for human–machine interaction.
- Fusing different affective cues provides a better understanding of user engagement.
- Detecting the temporal dynamics of neuromuscular actions improves affect recognition.
As robots are increasingly viewed as social entities to be integrated into our daily lives, social perceptive abilities appear to be a necessary requirement for more natural interaction with human users. In this paper, we present an interaction scenario in which users play chess with an iCat robot, and we propose an affect recognition system that uses computational models to automatically extract visual features and detect the user's level of engagement with a social robot acting as a game companion. Experimental results show that the multimodal integration of head direction information with the facial expressions displayed by the user improves the recognition of the user's affective states.
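As a rough illustration of the kind of multimodal integration described above, the sketch below fuses head-direction and facial-expression features by simple concatenation and trains a standard classifier to predict engagement. The feature dimensions, the synthetic data, and the SVM classifier are assumptions for illustration only and do not reproduce the paper's pipeline.

```python
# Minimal sketch of feature-level (early) fusion for engagement detection.
# All data here is synthetic; dimensions and classifier choice are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical per-frame features: 3-D head direction (yaw, pitch, roll)
# and 10-D facial-expression descriptors (e.g., action-unit intensities).
head_direction = rng.normal(size=(n_samples, 3))
facial_expression = rng.normal(size=(n_samples, 10))

# Binary engagement labels (1 = engaged, 0 = not engaged), synthetic here.
labels = rng.integers(0, 2, size=n_samples)

# Multimodal fusion by feature concatenation, then a standard classifier.
fused = np.hstack([head_direction, facial_expression])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Feature concatenation is only one fusion strategy; decision-level (late) fusion, where separate classifiers per modality are combined, is a common alternative.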