Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
400835 | International Journal of Human-Computer Studies | 2015 | 16 | 
• Emotional intelligence system with expressive and perceptive modules is evaluated.
• Physical behavior is examined for emotional expression in a non-humanoid robot.
• Benefits are seen in dynamic motion and collocated presence as opposed to static pose.
• We present an architecture for synthetically generating expressive movement.
For social robots to respond to humans in an appropriate manner, they need to use apt affect displays, revealing underlying emotional intelligence. We present an artificial emotional intelligence system for robots, with both a generative and a perceptual aspect. On the generative side, we explore the expressive capabilities of an abstract, faceless, creature-like robot with very few degrees of freedom, lacking both facial expressions and the complex humanoid design often found in emotionally expressive robots. We validate our system in a series of experiments. In the first study, we find an advantage in classification accuracy for animated over static affect expressions, as well as advantages in valence and arousal estimation and in personal preference ratings both for animated over static and for physical over on-screen expressions. In a second experiment, we show that our parametrically generated expression variables correlate with the intended user affect perception. Combining the generative system with a perceptual component based on natural language sentiment analysis, we show in a third experiment that our automatically generated affect responses cause participants to show signs of increased engagement and enjoyment compared with arbitrarily chosen comparable motion parameters.
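The abstract describes mapping perceived affect (e.g., valence and arousal estimated from sentiment analysis) to parametric motion variables for a low-degree-of-freedom robot. The sketch below is purely illustrative and not the paper's implementation: the parameter names (speed, amplitude, smoothness), their ranges, and the specific mappings are all assumptions chosen to show the general idea of such an architecture.

```python
# Illustrative sketch only -- NOT the authors' system. All parameter
# names, ranges, and mappings here are assumptions for demonstration.
from dataclasses import dataclass


@dataclass
class MotionParams:
    speed: float       # gesture frequency in Hz (assumed range 0.5-2.0)
    amplitude: float   # fraction of joint range used (assumed 0.2-1.0)
    smoothness: float  # 0 = abrupt, 1 = fluid (assumed scale)


def affect_to_motion(valence: float, arousal: float) -> MotionParams:
    """Map an affect estimate in [-1, 1] x [-1, 1] to motion parameters.

    Hypothetical mapping: arousal drives speed and amplitude, while
    valence drives smoothness, so negative-valence expressions read
    as more abrupt and positive-valence ones as more fluid.
    """
    # Clamp inputs to the expected affect range.
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))

    # Rescale arousal from [-1, 1] to [0, 1], then into each range.
    arousal01 = (arousal + 1.0) / 2.0
    speed = 0.5 + 1.5 * arousal01        # 0.5 Hz .. 2.0 Hz
    amplitude = 0.2 + 0.8 * arousal01    # 0.2 .. 1.0
    smoothness = (valence + 1.0) / 2.0   # 0 .. 1
    return MotionParams(speed, amplitude, smoothness)
```

For example, a neutral estimate (`valence=0, arousal=0`) yields mid-range motion, while a high-arousal negative estimate produces fast, large, abrupt movement. A real system would, as the abstract notes, validate such mappings experimentally against users' perceived affect.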