This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Human emotional expressions serve an important communicative role, allowing the rapid transmission of valence information among individuals. We aimed to explore the neural networks mediating the recognition of, and empathy with, human facial expressions of emotion.
A principal component analysis (PCA) was applied to event-related functional magnetic resonance imaging (fMRI) data from 14 right-handed healthy volunteers (mean age 29 ± 6 years). During scanning, subjects viewed happy, sad, and neutral facial expressions under three conditions: emotion recognition, empathizing with the emotion, and a control condition of simple object detection. Functionally relevant principal components (PCs) were identified by planned comparisons at an alpha level of p < 0.001.
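The core decomposition step described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the data matrix, its dimensions, and all variable names are assumptions, and the PCA is computed here via a centered singular value decomposition with simulated data standing in for preprocessed BOLD time series.

```python
import numpy as np

# Simulated stand-in for a preprocessed fMRI data matrix
# (time points x voxels); dimensions are illustrative only.
rng = np.random.default_rng(0)
X = rng.standard_normal((240, 5000))

# Center each voxel's time course, then decompose via SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                     # component time courses (time x components)
loadings = Vt                      # spatial maps (components x voxels)
explained = s**2 / np.sum(s**2)    # proportion of variance per PC

# Candidate PCs would then be tested with planned comparisons
# across experimental conditions on their score time courses.
print(explained[:4])
```

In this framing, each retained PC pairs a spatial loading map (a candidate network) with a score time course whose condition-dependent variance can be tested statistically.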
Four PCs showed significant differences in variance patterns across the conditions, each corresponding to a distinct neural network: facial identification (PC 1), identification of an expressed emotion (PC 2), attention to an expressed emotion (PC 12), and the sense of an emotional state (PC 27).
Our findings support the notion that the appraisal of human facial expressions involves multiple neural circuits, each processing highly differentiated cognitive aspects of emotion.