Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6836872 | Computers in Human Behavior | 2016 | 9 | |
Abstract
Behavior-directed intentions can be revealed by certain biological signals that precede behavior. This study used eye movement data to infer human behavioral intentions. Participants viewed pictures while operating under different intentions that required cognitive search or affective appraisal. Intentions regarding the pictures were non-specific or specific; specific intentions were cognitive or affective; and affective intentions were to evaluate either the positive or the negative emotions expressed by the individuals depicted. The affective task group made more fixations and had a larger average pupil size than the cognitive task group. The positive appreciation group made more and, on average, shorter fixations than the negative appreciation group. However, support vector machine classifiers achieved only low classification accuracy, owing to large inter-individual variance and the psychological factors underlying intentions. Classification accuracy improved when individual repeated-measures data were used, which helped infer participants' self-selected intentions.
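A minimal sketch of the kind of analysis described above: classifying intention conditions from per-trial eye-movement features (fixation count, mean fixation duration, mean pupil size) with a support vector machine, and contrasting across-subject with within-subject (repeated-measures) cross-validation. All data here are simulated, and the feature choices, scikit-learn pipeline, and cross-validation scheme are illustrative assumptions rather than the authors' published pipeline.

```python
# Illustrative SVM classification of simulated eye-movement features.
import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_subjects = 200, 20

# Simulated per-trial features:
# [fixation count, mean fixation duration (ms), mean pupil size (mm)]
X = np.column_stack([
    rng.poisson(12, n_trials),
    rng.normal(250, 40, n_trials),
    rng.normal(3.5, 0.4, n_trials),
])
y = rng.integers(0, 2, n_trials)          # 0 = cognitive task, 1 = affective task
subjects = rng.integers(0, n_subjects, n_trials)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Across-subject classification: train and test on different participants.
# Large inter-individual variance tends to keep accuracy low in this setting.
across = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=subjects)
print(f"across-subject accuracy: {across.mean():.2f}")

# Within-subject classification on one participant's repeated measures,
# the setting in which the abstract reports improved accuracy.
for s in range(3):
    mask = subjects == s
    counts = np.bincount(y[mask], minlength=2)
    if counts.min() >= 3:                 # enough trials of each class for 3-fold CV
        within = cross_val_score(clf, X[mask], y[mask], cv=3)
        print(f"subject {s} within-subject accuracy: {within.mean():.2f}")
```

GroupKFold keeps each participant's trials in a single fold, so the across-subject score reflects generalization to new individuals, while the within-subject loop reuses a participant's own repeated measures.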
Related Topics
Physical Sciences and Engineering > Computer Science > Computer Science Applications
Authors
Hyeonggyu Park, Sangil Lee, Minho Lee, Mun-Seon Chang, Ho-Wan Kwak