Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
2576686 | 1561357 | 2007 | 4-page PDF | Free download |
Classic models of face perception propose that emotional expression and identity are processed in parallel and independently; however, some recent studies suggest that they interact. Neuroimaging shows that different cerebral areas are engaged as a function of emotional expression, and that processing of expressions begins by 100 ms. Processing of identity typically occurs later, but the distinction between the brain areas involved in these two types of information extracted from faces has not been established. We presented faces expressing 3 different emotions to 11 adults. Subjects responded 1) to faces that were repeated regardless of emotion (an identity-match task) and 2) to repeated emotions regardless of identity (an emotion-match task). MEG was recorded with a 151-sensor CTF/VSM system. Recognition of happy faces yielded faster RTs and higher accuracy than recognition of neutral or fearful faces. Three MEG components were measured; they showed early activation not only in visual cortex but also in frontal areas and the insula, and this activation was task- and emotion-specific.
Journal: International Congress Series - Volume 1300, June 2007, Pages 397–400