Most of our social interactions involve perception of emotional information from the faces of other people. Furthermore, such emotional processes are thought to be aberrant in a range of clinical disorders, including psychosis and depression. However, the exact neurofunctional maps underlying emotional facial processing are not well defined.
Two independent researchers conducted separate comprehensive PubMed (1990 to May 2008) searches to find all functional magnetic resonance imaging (fMRI) studies using a variant of the emotional faces paradigm in healthy participants. The search terms were: “fMRI AND happy faces,” “fMRI AND sad faces,” “fMRI AND fearful faces,” “fMRI AND angry faces,” “fMRI AND disgusted faces” and “fMRI AND neutral faces.” We extracted spatial coordinates and inserted them in an electronic database. We performed activation likelihood estimation analysis for voxel-based meta-analyses.
Of the originally identified studies, 105 met our inclusion criteria. The overall database consisted of 1785 brain coordinates that yielded an overall sample of 1600 healthy participants. Quantitative voxel-based meta-analysis of brain activation provided neurofunctional maps for 1) main effect of human faces; 2) main effect of emotional valence; and 3) modulatory effect of age, sex, explicit versus implicit processing and magnetic field strength. Processing of emotional faces was associated with increased activation in a number of visual, limbic, temporoparietal and prefrontal areas; the putamen; and the cerebellum. Happy, fearful and sad faces specifically activated the amygdala, whereas angry or disgusted faces had no effect on this brain region. Furthermore, amygdala sensitivity was greater for fearful than for happy or sad faces. Insular activation was selectively reported during processing of disgusted and angry faces. However, insular sensitivity was greater for disgusted than for angry faces. Conversely, neural response in the visual cortex and cerebellum was observable across all emotional conditions.
Although the activation likelihood estimation approach is currently one of the most powerful and reliable meta-analytical methods in neuroimaging research, it is insensitive to effect sizes.
Our study has detailed neurofunctional maps to use as normative references in future fMRI studies of emotional facial processing in psychiatric populations. We found selective differences between neural networks underlying the basic emotions in limbic and insular brain regions.
Humans, like other primates,1 are intensely social creatures and their lives are intertwined with those of other people. Most of our social interactions involve recognizing other people’s identities, actions, emotions and intentions. Much of this information is available from their facial expressions. Facial expressions are powerful nonverbal displays of emotion that signal valence information to others and contain information that is vital in the complex social world.2 Recognizing facial expressions permits us to detect another person’s emotional state and provides cues on how to respond in these social interactions.3,4 Some basic emotions (i.e., fear, disgust, anger, happiness, sadness) can be reliably recognized from facial expressions and have been shown to be universal in their performance and perception.5 Facial perception is defined as “any higher-level visual processing of faces,”6 which involves both perceptual processing — identifying the geometric configuration of facial features to discriminate among different stimuli on the basis of their appearance — and recognition of the emotional meaning of a stimulus.5 Thus, facial emotion perception combines current visual sensory input with retrievable memory and is an important inherited ability evident from the neonatal stage.4
Given the crucial role played by the processing of human emotional faces in social function, over the past 2 decades affective neuroscience has shown an intense interest in understanding the neural mechanisms that support face perception.7 In particular, functional brain imaging techniques such as functional magnetic resonance imaging (fMRI), which allows the in vivo investigation of the human brain, have been employed to address the neurophysiological substrates of emotional processing. Despite the growing number of fMRI studies in the field, these individual imaging studies, taken separately, report contrasting findings8 and are unable to definitively characterize what brain regions are associated with each specific emotional condition. Although methodological factors such as different task designs, imaging methods and analysis may be a source of heterogeneity across studies, the major limitations of the current literature are the small sample sizes and the associated low statistical power of most fMRI studies. Furthermore, the modulatory effect of other confounding factors such as age,9–13 sex14–17 and type of emotional processing (i.e., explicit or implicit)18–22 is not completely clear.
The above variations have made it particularly difficult to interpret the differences in activation patterns found in clinical populations such as people with depression or psychosis. Thus, the analysis of the consistency and convergence of results across experiments is a crucial prerequisite for correct generalizations about human brain functions.8 Without normative maps, it is difficult to definitively ascertain which of the fMRI alterations observed in depressed or psychotic patients are due to neurobiological disruptions and which are due to other external methodological inconsistencies. Although previous reviews23–30 of facial emotional processing and some meta-analyses31,32 of the functional neuroanatomy of emotions have been published, to our knowledge no activation likelihood estimation meta-analysis of fMRI studies during facial emotional processing is currently available. In coordinate-based meta-analyses such as activation likelihood estimation, activation coordinates reported from independently performed imaging experiments are analyzed in search of functional cortical areas that are relevant for the investigated function.8 The foci reported by each study are therefore modelled not as single points but as localization probability distributions centered on the reported coordinates. For these reasons, the activation likelihood estimation approach is currently one of the most powerful and reliable meta-analytical methods in imaging research and offers consistent advantages over other approaches.
The aim of our study was to clarify the participation of the different neural regions and networks in the processing of human emotional faces. In line with this premise, the first purpose of this meta-analysis was to provide a normative fMRI atlas of human face processing in a standard stereotactic space.33 The second was to clarify what brain structures participate in the processing of different types of basic emotional faces (i.e., sadness, fear, happiness, disgust). Specifically, we wanted to clarify whether basic emotional faces were 1) encoded in specific brain regions, 2) a property of the whole-brain neural network or 3) processed by overlapping neural networks. Finally, we aimed to address the modulatory effect of age, sex and implicit or explicit processing on the neurophysiological response to emotional faces.
Emotion research uses different types of stimuli (e.g., Ekman faces and Gur faces) to probe affective processing. Facial expressions portraying specific emotions (e.g., happiness, sadness, anger, fear) are universally recognized. Even though facial expressions are used frequently as probes of emotion recognition, some studies have shown that faces can also be inducers of emotion.34 The Ekman 60 Faces Test uses a range of photographs from the Ekman and Friesen series of Pictures of Facial Affect.35 The Ekman faces have been the most widely used series of photographs. However, the facial stimuli typically used are of a restricted race and age range, and they are mostly 2-dimensional (2-D) photographs. Since 2-D stimuli are not amenable to manipulations of angle and orientation and raise methodological concerns when applied to the examination of facial asymmetries that could be related to hemispheric specialization, Gur and colleagues36 developed a set of 3-dimensional (3-D) images to overcome these limitations.
Two independent researchers (P.F.-P., P.L.) conducted a 2-step literature search. First, they conducted 2 separate comprehensive MEDLINE (January 1990 to May 2008) searches in the English-language literature to identify putative functional neuroimaging (fMRI) studies employing a variant of the emotional faces paradigm that had reported data on healthy participants. The search terms entered in MEDLINE were “fMRI AND happy faces,” “fMRI AND sad faces,” “fMRI AND fearful faces,” “fMRI AND angry faces,” “fMRI AND disgusted faces” and “fMRI AND neutral faces.” To qualify for inclusion in the review, studies must have 1) been an original paper published in a peer-reviewed journal, 2) used a variant of the emotional faces paradigm in healthy participants, 3) studied participants using fMRI, 4) used the image subtraction methodology to determine activation/deactivation foci (e.g., happy v. fixation or happy v. neutral) and 5) reported data in standard stereotactic coordinates (either Talairach or Montreal Neurological Institute [MNI] space).37,38 The 2 researchers assessed the inclusion criteria by consensus. In a second step, we checked the reference lists of the articles included in the review for relevant studies not identified by computerized literature searching. Finally, we contacted authors of studies where Talairach or MNI coordinates were not reported to reduce the possibility of a biased sample set.
We included studies reporting single group data (i.e., healthy volunteers only). Thus, we did not consider spatial coordinates reporting a main effect of emotional faces processing across the control and a clinical group, nor did we consider coordinates relative to functional, psychophysiological or psychopathological correlations. We did not include fMRI studies investigating processes other than emotional processing (i.e., working memory,39 attention) by using similar emotional faces stimuli. To avoid an unwanted systematic confounding effect, we excluded other types of emotional faces such as schematic faces. Finally, we excluded positron emission tomography studies to prevent the methodological heterogeneity underlying these different functional imaging techniques.
To assist the reader in forming an independent view of the results, we collected meta-analytical data such as mean age, sex, intensity of magnet, type of analysis (whole brain or region of interest [ROI]) and type of emotional facial paradigms employed across all studies (Table 1). Specifically, when a study included young/old subgroups, different ages were listed for each specific contrast. The emotional faces category included specific paradigms other than the Ekman/Gur faces or a mixture of the original Ekman/Gur stimuli and unstandardized sets of emotional human faces. We coded the procedure as ROI only when specific ROI procedures (i.e., MarsBar) were applied to extract the blood oxygenation–level dependent response in selected brain areas.
Because the Talairach system is defined such that left is negative, we transformed coordinates based on the radiological convention. In addition, we determined the spatial normalization template for each study and converted all foci reported in MNI space to Talairach space with the foci conversion option available in the software used for the meta-analytical procedure. In the past, the Brett transform was used to convert MNI coordinates to Talairach space;136 however, recent findings show that MNI/Talairach coordinate bias associated with reference frame (position and orientation) and scale (brain size) can be substantially reduced using the best-fit MNI-to-Talairach transform.137 This transform has been validated and shown to provide improved fit over the Brett transform.136 We used the Talairach atlas to identify the anatomic landmarks of the activation results.
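To illustrate the older conversion mentioned above, the Brett transform can be sketched as follows. This is a minimal, illustrative implementation using the widely cited piecewise-linear coefficients; it is not the best-fit MNI-to-Talairach transform actually applied in the present meta-analysis, which was performed with the foci conversion option of the meta-analysis software.

```python
def mni2tal_brett(x, y, z):
    """Approximate conversion of an MNI coordinate (mm) to Talairach space.

    Illustrative Brett transform: a uniform x-scaling plus a rotation/scaling
    in the y-z plane that differs above and below the AC-PC line.
    """
    tx = 0.99 * x
    if z >= 0:  # above the AC-PC line
        ty = 0.9688 * y + 0.0460 * z
        tz = -0.0485 * y + 0.9189 * z
    else:       # below the AC-PC line
        ty = 0.9688 * y + 0.0420 * z
        tz = -0.0485 * y + 0.8390 * z
    return tx, ty, tz

# Example: convert a hypothetical MNI focus near the amygdala
print(mni2tal_brett(22.0, -4.0, -18.0))
```

In practice a dedicated tool (e.g., the conversion built into the meta-analysis software) should be used, since the best-fit transform reduces the reference-frame and brain-size bias noted above.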
The main meta-analysis (all faces v. baseline) comprised all coordinates of emotional faces task-related activation reported in the primary literature, regardless of task variation or stimuli type. To distinguish the neural networks underlying specific emotional condition, we subdivided coordinates for the meta-analysis into different sub-groups: 1) neutral > baseline, 2) happy > baseline, 3) sad > baseline, 4) angry > baseline, 5) fearful > baseline and 6) disgusted > baseline. We defined baseline condition as fixation of a crosshair on the screen. Furthermore, to clarify what areas were active in response to each emotion in particular, we computed the following contrasts: 1) happy versus neutral, 2) sad versus neutral, 3) angry versus neutral, 4) disgusted versus neutral and 5) fearful versus neutral.
We used the equally weighted coordinates to form estimates of the activation likelihood for each voxel in the brain, as described by Turkeltaub and colleagues.38 In brief, to allow for error in spatial localization related to intersubject variation in functional anatomy and interstudy differences in data smoothing and registration, we modelled the reported loci of maximal activation as the peaks of 3-D Gaussian probability density functions with a full-width at half-maximum of 10 mm. Then we combined the probabilities of each voxel in standard space representing each primary locus of activation to form a map of the activation likelihood estimation score at each voxel. We assessed statistical significance using a permutation test with 5000 permutations, corrected for multiple comparisons (the false discovery rate [FDR] was set at p = 0.01). We defined clusters of suprathreshold voxels exceeding 200 mm3 in volume as loci of brain activation in common across all studies included in the meta-analysis;37 the resultant activation likelihood estimation maps were thresholded at p = 0.05, in line with previous studies.138 To investigate the effect of explicit/implicit emotional processing, sex and magnet field, we carried out meta-analytic comparisons between subgroups of studies (explicit v. implicit, male v. female participants, magnet ≥ 3 T v. magnet ≤ 1.5 T) using the permutation test described in more detail by Laird and colleagues37 after ensuring that there were no significant differences in the number of coordinates found in each group. Similarly, we selected coordinates relative to studies investigating young participants (< 20 yr) and contrasted them with those of studies involving older participants (> 40 yr) to address the influence of age on emotional processing. Finally, to better clarify the main effect of valence, we contrasted specific emotions: 1) happy versus sad, 2) angry versus disgusted, 3) sad versus fearful and 4) happy versus fearful.
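The core of the procedure described above — modelling each reported focus as a 3-D Gaussian and combining the per-focus probabilities at each voxel — can be sketched in a few lines. This is a simplified, Turkeltaub-style illustration: the grid, voxel size and foci are arbitrary choices for the example, and the permutation test and FDR correction used in the actual analysis are omitted.

```python
import numpy as np

FWHM = 10.0  # mm, full-width at half-maximum, as in the meta-analysis
SIGMA = FWHM / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # Gaussian SD from FWHM
VOXEL = 2.0  # mm isotropic voxel size (assumed for this sketch)

def ale_map(foci, shape=(40, 40, 40)):
    """Compute a toy ALE map on a small grid.

    Each focus contributes a 3-D Gaussian probability (scaled by voxel
    volume, approximating the probability that the true focus lies in the
    voxel); per-focus probabilities are combined as a union:
    ALE = 1 - prod(1 - p_i).
    """
    grid = np.indices(shape).astype(float) * VOXEL  # voxel-centre coords, mm
    norm = (2.0 * np.pi * SIGMA ** 2) ** -1.5 * VOXEL ** 3
    not_p = np.ones(shape)
    for fx, fy, fz in foci:
        d2 = (grid[0] - fx) ** 2 + (grid[1] - fy) ** 2 + (grid[2] - fz) ** 2
        p = norm * np.exp(-d2 / (2.0 * SIGMA ** 2))
        not_p *= 1.0 - p  # accumulate probability of NO focus at this voxel
    return 1.0 - not_p

# Two nearby hypothetical foci (mm) reinforce each other in the ALE map
ale = ale_map([(40.0, 40.0, 40.0), (42.0, 40.0, 40.0)])
print(ale.max())
```

In the real analysis, significance of the resulting scores is then assessed against a null distribution obtained by permuting focus locations, with FDR correction as described above.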
The overall fMRI meta-analytical approach33 has been widely used in a number of studies.138–143 We imported whole-brain maps of the activation likelihood estimation values into the MRIcron software program (www.sph.sc.edu/comd/rorden/mricron) and overlaid them onto the brain template for presentation purposes.
Our combined 1990–May 2008 literature search uncovered 551 potential articles. The number of published articles relating to each emotional condition of interest was as follows: “fMRI sad faces” returned 42 published papers, “fMRI angry faces” returned 65, “fMRI disgust faces” returned 22, “fMRI fearful faces” returned 142, “fMRI happy faces” returned 106 and “fMRI neutral faces” returned 174. Most of the excluded studies (94%) did not meet inclusion criteria 2 and 3, whereas 2% did not meet inclusion criterion 4 and 4% did not meet criterion 5.
Overall, 105 articles satisfied the inclusion criteria and we considered them for the meta-analysis. The final database of 105 studies corresponded to a whole sample of 1600 healthy participants (of whom 639 were female, 40%). The mean age of the sample was 27 (standard deviation [SD] 6.8) years. Forty-seven of 105 studies (45%) reported coordinates in MNI space. Fifty-five of 105 studies (52%) adopted an ROI approach in the analysis of functional activation. Sixty percent of studies employed a classical Ekman faces paradigm, 10% of them used a Gur faces paradigm and the final 30% employed other emotional faces paradigms. We used a total of 1785 brain coordinates for the meta-analytical procedure. Table 1 lists all of the studies included in the meta-analysis by first author and year of publication.
The processing of faces (emotional and neutral) compared with baseline (1785 coordinates) was associated with increased activation in a number of visual areas (fusiform gyrus, inferior and middle occipital gyri, lingual gyrus), limbic areas (amygdala and parahippocampal gyrus, posterior cingulate cortex), temporoparietal areas (parietal lobule, middle temporal gyrus, insula), prefrontal areas (medial frontal gyrus), subcortical areas (putamen) and the cerebellum (declive; Fig. 1, Table 2).
The processing of neutral faces compared with baseline (165 foci) was associated with an increased activation of visual areas (bilateral fusiform gyrus, left lingual gyrus, inferior occipital gyrus), the cerebellum (bilateral declive), limbic areas (left amygdala and left cingulate gyrus), subcortical areas (right lentiform nucleus), prefrontal regions (left medial frontal gyrus, right middle frontal gyrus, precentral gyrus) and the left insula (cluster p < 0.001, false discovery rate [FDR] = 0.01; Table 2).
The processing of happy faces compared with baseline (112 foci) was associated with an increased activation in the right middle occipital gyrus, left precuneus, left amygdala, left insula, left medial frontal gyrus, left putamen, left cerebellum, bilateral supramarginal gyrus and left middle temporal gyrus (cluster p < 0.001, FDR = 0.01; Table 2).
The processing of sad faces compared with baseline (37 foci) showed an increased activation in the right superior occipital gyrus, left insula and left thalamus (cluster p < 0.001, FDR = 0.01; Table 2).
The processing of angry faces compared with baseline (77 foci) activated the right cingulate and anterior cingulate gyri, the right parahippocampal gyrus, left cerebellum, subcortical regions such as the left globus pallidus and the right claustrum and prefrontal regions such as the bilateral inferior frontal gyrus and the right middle frontal gyrus (cluster p < 0.001, FDR = 0.01; Table 2).
During the processing of fearful faces compared with baseline (338 foci), there was a significant increase of neural activation in the bilateral amygdala and fusiform gyrus, right cerebellum, left inferior parietal lobule, left inferior frontal and right medial frontal gyrus (cluster p < 0.001, FDR = 0.01; Table 2).
When processing disgusted faces compared with baseline (48 foci), there was an increased neural response in the left amygdala, fusiform gyrus, bilateral middle temporal gyrus, left middle frontal and right inferior frontal gyri, right insula, left precentral gyrus, left inferior parietal lobule and left thalamus (cluster p < 0.001, FDR = 0.01; Table 2).
Processing happy faces (65 foci) was associated with neural activation in the bilateral amygdala, left fusiform gyrus and right anterior cingulate cortex. Sad faces (39 foci) activated the right amygdala and the left lingual gyrus. Fear (257 foci) was associated with activation in the bilateral amygdala and the fusiform and medial frontal gyri. Processing angry faces (66 foci) was associated with increased neural response in the left insula and right inferior occipital gyrus. Disgust (150 foci) was associated with neural activation in the insula bilaterally, right thalamus and left fusiform gyrus (Table 3 and Fig. 2).
When healthy participants were explicitly processing the emotional faces, there was an increased activation in the temporal part of the right fusiform gyrus, bilateral amygdala, left cerebellum (declive), right claustrum and caudate, and the right inferior occipital, left inferior frontal and left medial frontal gyri, as compared with the implicit emotional processing (734 foci). Conversely, during the implicit processing of the emotional faces, healthy participants showed more activation in the occipital parts of the left lingual and right fusiform gyri, left post-central gyrus and right insula, as compared with the explicit emotional processing of faces (cluster p < 0.001, FDR = 0.01; Table 4, Fig. 3).
When processing human emotional faces, older participants showed greater neural response in the temporal part (Brodmann area [BA] 37) of the bilateral fusiform gyrus, left cerebellum and left hippocampus as compared with young healthy participants (196 foci). Conversely, younger participants showed greater activation in a more occipital part (BA19) of the bilateral fusiform gyrus than older participants (cluster p < 0.0001, FDR = 0.01; Table 5, Fig. 4).
When processing human emotional faces (613 foci), male participants showed greater neural response than female participants in a cluster spanning the right amygdala and parahippocampal gyrus (x = 25, y = − 12, z = − 10, activation likelihood estimation = 0.01681, voxels = 1024), in the right medial frontal gyrus (BA6, x = 1, y = 2, z = 52, activation likelihood estimation = 0.02018, voxels = 760) and in the left fusiform gyrus (BA19, x = − 38, y = − 76, z = − 13, activation likelihood estimation = 0.01836, voxels = 360). Conversely, female participants showed greater activation than male participants in the right subcallosal gyrus (BA34, x = 19, y = 3, z = − 13, activation likelihood estimation = − 0.02225, voxels = 696; cluster p < 0.001, FDR = 0.01).
We detected a significant effect for the magnetic field strength on the detection of neural correlates of emotional processing. When using a magnet of 3 T or greater, as compared with magnets of 1.5 T or less (1675 foci), increased activation was observable in the bilateral fusiform gyri (BA37, right: x = 36, y = − 54, z = − 13, activation likelihood estimation = 0.06837, voxels = 6656; left: x = − 38, y = − 55, z = − 13, activation likelihood estimation = 0.06011), bilateral amygdala (left: x = − 27, y = − 9, z = − 11, activation likelihood estimation = 0.06529, voxels = 2024; right: x = 22, y = − 13, z = − 10, activation likelihood estimation = 0.03178, voxels = 1408) and left medial frontal gyrus (BA6, x = − 2, y = − 2, z = 54, activation likelihood estimation = 0.04081, voxels = 1104; cluster p < 0.001, FDR = 0.01). We detected no clusters for the opposite contrast.
When processing happy faces compared with sad faces (149 foci), participants showed greater activation in the left amygdala (x = − 22, y = − 8, z = − 11, activation likelihood estimation = 0.02463, voxels = 1472) and right middle occipital gyrus (BA18, x = 40, y = − 72, z = − 9, activation likelihood estimation = 0.01953, voxels = 1336; cluster p < 0.001, FDR = 0.01). We detected no clusters during the processing of sad compared with happy faces (Fig. 5).
When processing angry faces compared with disgusted faces (38 foci) there was increased activation in the right amygdala (x = 17, y = − 5, z = − 18) and right fusiform gyrus and deactivation in the right insula (x = 40, y = − 5, z = 2; Fig. 5).
When processing fearful faces compared with happy faces (59 foci) there was increased activation in the bilateral amygdala (left x = − 19, y = − 2, z = − 13; right x = 16, y = 4, z = − 18) and decreased activation in the anterior cingulate gyrus (x = 2, y = 39, z = 11; Fig. 5).
When processing fearful faces compared with sad faces (68 foci) there was increased activation in the bilateral amygdala (x = − 20, y = − 6, z = − 14) and in the right fusiform gyrus (x = 45, y = − 46, z = − 18). No brain regions were more activated by the sad faces compared with the fearful faces.
The aim of the present meta-analysis was to provide a neurofunctional map of human emotional face processing in healthy individuals.
We adopted a multistep approach with the encompassing objective of reducing heterogeneity across fMRI findings. In a first step (study selection), we chose to include fMRI studies only and avoid positron emission tomography studies given the profound methodological differences of these methods. In addition, to control for the considerable variation observed in the results of imaging experiments addressing even closely related experimental paradigms,32 our meta-analysis focused on a single paradigm (i.e., the presentation of emotional human faces) during emotional processing (attentive or mnestic studies were not considered). To overcome the lack of statistical power associated with single fMRI studies, we selected a large sample of 105 studies relating to 1600 healthy volunteers and 1785 foci. Although there are no community-accepted criteria for interpreting the activation likelihood estimation results, for a study of this size a cluster to which 6 or more foci contribute is considered very robust, and a cluster with 3–5 contributing foci is acceptable.144 A cluster with only 1 or 2 contributing foci is not convincing. As recent activation likelihood estimation meta-analyses have been published with samples of 12 studies and fewer than 10 foci,144 our results are particularly robust. We chose a function–location statistical approach (voxel-based, activation likelihood estimation) in place of the standard effect-size meta-analysis because it is the location, rather than the magnitude, of the effect that is of interest in the present study.32 Specifically, activation likelihood estimation assumes that although each study reports the specific coordinates of activations, issues such as intersubject variability in brain anatomy and differences in investigators’ labels for anatomic regions may lead to some uncertainty as to the actual locations of these peaks.
In fact, one of the difficulties when comparing imaging studies is that there is considerable variability when labelling neuroanatomical regions, and differences in nomenclature could obscure findings. An advantage of the activation likelihood estimation technique is that because it uses the coordinates of reported foci (rather than anatomic labels) for meta-analysis, it avoids the problem of any mislabelling of regions in the primary literature.145 A further benefit is that the exclusion of negative data has very little effect on the results.32
The present study has produced a reliable normative map of brain response to the processing of human emotional faces. We found that processing of emotional faces was associated with increased activation in a number of visual areas (fusiform gyrus, inferior and middle occipital gyri, lingual gyrus), limbic areas (amygdala and parahippocampal gyrus, posterior cingulate), temporal areas (middle/superior temporal gyrus), temporoparietal areas (parietal lobule, middle temporal gyrus, insula), prefrontal areas (medial frontal gyrus), subcortical areas (putamen) and the cerebellum (declive). This neural network is the most likely to be activated during the processing of a human emotional face. These findings confirm that processing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures.26 Although the exact functional interplay between these areas is not clear, some authors suggest that early perceptual processing of faces draws on cortices in the occipital and temporal lobes that construct detailed representations from the configuration of facial features.5 Subsequent recognition requires a set of structures, including the amygdala and orbitofrontal cortex, that links perceptual representations of the face to the generation of knowledge about the emotion signaled.5
Our second aim was to identify discrete neural subsystems underlying processing of “basic” emotional expressions (fear, disgust, anger, happiness and sadness).35 As human signals of basic emotion are universal, fMRI studies that explore the neural substrates of emotion recognition are free of the reliance on vague subjective measures that plagued earlier emotion research.146 When we compared processing of emotional faces with baseline, we uncovered wide and overlapping neural networks. Thus, to better disentangle the specific neural substrates of emotions, we contrasted emotional faces against neutral faces.
With respect to the limbic system, we found a differential sensitivity to emotional conditions. Whereas happy and fearful faces activated the amygdala bilaterally, sad faces showed a laterality effect, and angry or disgusted faces had no effect on this region. In a second step, we tested the differential amygdala sensitivity to happy, fearful and sad faces by computing direct contrasts between these emotional conditions. The intensity of the activation likelihood estimation scores confirmed that amygdala sensitivity was greater during the fearful stimulus than in the other 2 conditions. In other words, it is possible to conclude that amygdala engagement during processing of fearful faces is a strong and consistent finding in the available fMRI literature. Such a result is in line with evidence suggesting that the amygdala is specifically sensitive to fearful emotional processing.147 However, we found the amygdala was also activated while processing neutral faces; as there are face-responsive neurons in the amygdala,68 this brain region may have an additional role for vigilance or for processing salience.148
The second interesting finding of the present study is insular activation during processing of disgusted and angry faces. Although the insula was activated not only by disgusted but also by angry faces, the intensity of activation likelihood estimation scores and the direct contrast between angry and disgusted faces clearly indicated that its sensitivity is greater to disgusted than to angry faces. A number of behavioural and neurobiological accounts have suggested that this brain region is relevant to the neurobiological models of disgust,149 as the insula in primates contains neurons that respond to pleasant and unpleasant tastes.146 Some authors have speculated that whereas limbic (amygdala–hippocampus) regions are particularly involved in the emotional response to exteroceptive sensory stimuli, the insular cortex is preferentially involved in the emotional response to potentially distressing stimuli, interoceptive sensory stimuli and body sensations.146 In fact, the insula is part of the gustatory cortex and is connected to the ventro–posterior–medial thalamic nucleus.146 Interestingly, we have uncovered additional thalamic activation in response to disgusted faces, indicating that the insular–thalamic pathway may represent a core relay of this neural network.
We also uncovered neural activation in different areas of the visual cortex (fusiform gyrus, lingual gyrus, inferior occipital gyrus). This response was observable across all emotional conditions. As discussed in the previous paragraph, it is possible to speculate that these areas are engaged in early perceptual processing of facial stimuli,5 which may be independent from emotional valence. The medial frontal cortex, on the other hand, participates in the conscious experience of emotion, inhibition of excessive emotion, or monitoring one’s own emotional state to make relevant decisions. This region was activated by fearful faces, whereas happy faces activated the anterior cingulate cortex, a region that is specifically involved in the arousal to an emotive visual stimulus.146
Finally, we found no differential brain activation across emotions in the cerebellum. This area is of particular interest in the field of affective neuroscience as it is strongly connected with the reticular system (arousal), cortical association areas (cognitive processing of emotions) and limbic structures (emotional experience and expression) such as the amygdala, the hippocampus and the septal nuclei.150 Previous studies have indicated that cerebellar lesions result in flattening or blunting of emotions,151 and cerebellar activation has been observed in response to different emotions.152,153 The lack of differential activation in this area is in line with evidence that the cerebellum plays a general role in emotional processing.154
Taken together, these findings suggest that emotional conditions are implemented by neural systems that are at least partially separable, although they are not represented by entirely distinct neural circuits. Certainly, at least at the level of resolution afforded by fMRI, there appear to be interesting limbic and insular differences between the basic emotions.
The present meta-analysis has also identified the neurofunctional maps of implicit and explicit emotional faces processing. Of particular interest, we found that the explicit processing of stimuli was associated with activation of the prefrontal cortex and amygdala. In particular, the medial prefrontal cortex is thought to be involved not only in general emotional processing31 but also in emotional self-awareness.155 Specifically, this region plays a key role in complex aspects of emotional processing, such as social interaction, by virtue of its connections with discrete parts of the temporal lobe and with subcortical structures that control autonomic responses.156 Given the anatomic connections between the medial prefrontal cortex and the amygdala,157 the coactivation of these 2 structures may reflect the influence of cortical control on explicit processing. The amygdala–medial prefrontal circuitry has been termed the emotion generation–regulation circuit158 and is implicated in attention to threat and the interpretation of emotional stimuli.159 Previous lesion studies have confirmed that insults to these areas result in emotion dysregulation.160
Conversely, implicit processing was associated with activation in more ventral regions spanning the inferior prefrontal cortex and insula. The insular cortex plays a key role in emotional processing owing to its abundant connections with other association and primary sensory areas and its involvement in language, vestibular and pain perception (for a review see Nagai et al.161). In particular, insular projections to the inferior prefrontal cortex and amygdala may convey social information from emotional expressions.22 The anterior part of the insular cortex is considered to be a limbic-related cortex and part of the interoceptive system, which associates (via a mirror neuron link) external and internal experience.25 This insular subregion provides information about aversive body states associated with conditional stimuli, signalling this information to prefrontal brain areas that are critical for the allocation of attention and the execution of actions.162 The cognitive control of appraisal and emotional relevance is not mediated by the orbitofrontal cortex alone, but arises from large-scale systems that include the amygdala and the insular cortices.163 In line with such evidence, our findings suggest that explicit and implicit emotional processing are regulated by the functional interaction of a network comprising these brain regions.
With respect to age, we demonstrated that the neural response to emotional faces in the fusiform gyrus, the cerebellum and the hippocampus is modulated by this factor. The hippocampus has extensive connections with extrastriate visual areas, including the fusiform gyrus, and previous fMRI studies have confirmed that brain activity in this area is affected by age.164 The observed age-related differences could reflect functional compensation within the neural system involved in the perception of facial affect, or the fact that older adults process emotional information differently than young adults. Finally, we found that male participants activated the limbic (amygdala/parahippocampal gyrus) and prefrontal cortices more than female participants during emotional processing; conversely, female participants showed greater activation in the right subcallosal gyrus. Previous studies found that male and female participants engage the amygdala and the prefrontal cortex in different ways while passively viewing emotional faces.15
Over the past decade, a number of behavioural findings have shown that alterations of emotional faces processing play a role in a range of psychiatric disorders, spanning affective diseases (e.g., major depression, bipolar disorders, anxiety-related disorders)24 and the psychotic spectrum (e.g., schizophrenia,165–167 autism168). For instance, depressed participants show a state-related bias toward negative emotional cues (e.g., sad faces) and away from positive emotional cues (e.g., happy faces).24 These findings are in line with cognitive theories of depression, which emphasize the role of negative biases in information processing in the etiology and maintenance of depression.169 Consistent with these observations, there are data to suggest that there may be an analogous, state-related negative recognition bias for negative emotions in mania.91 On the other hand, patients with schizophrenia have difficulty recognizing the emotion that corresponds with a given facial expression.2,170–172 An early deficit in the visual processing underlying emotion recognition170,173 is a hallmark of schizophrenia, with consequences for cognitive function,174 social function2 and subjective well-being.171 Specific alterations of emotional faces processing in patients with schizophrenia include a bias toward threat-related emotional material, which may be regarded with increased significance by delusion-prone individuals; it is possible that this bias is involved in the formation of delusional beliefs.2,175,176
Facial emotional stimuli may serve as valid tools for tapping into the neural networks implicated in emotional processing.30 Thus, fMRI has been increasingly employed to address the neurophysiological substrates of impairments in emotional faces processing under different psychiatric conditions.177 Although many of the brain regions activated during facial emotional processing in healthy participants are also implicated in the pathophysiology of psychiatric disorders,24 the current imaging literature presents contrasting findings and a variable picture8 and is unable to definitively characterize which brain regions are etiologically associated with each psychiatric disorder. For this reason, the use of the physiologic maps provided herein will help to strengthen future study hypotheses and designs by robustly identifying regions of normal activation.
There are several limitations to our study. The first is the heterogeneity of the studies included. Factors such as behavioural performance, demographic information, substance abuse, cognitive function and personality traits, which could potentially influence the results, vary across the different samples. Although we have attempted to address the effects of age, sex, implicit/explicit emotional processing and magnetic field strength, at present it is not possible to directly evaluate the influence of all these factors on the results. Second, our methods did not allow for weighting of the results based on the level of statistical significance reported in each study. This means that we cannot exactly determine the relative strengths of activation differences. Third, although the quantitative meta-analytic method used herein represents a substantial advance for integrating functional neuroimaging data, the method remains subject to the basic limitation of literature reviews, in particular the “file drawer” problem. However, a benefit of activation likelihood estimation meta-analyses is that the exclusion of negative data has very little effect on the results.32 Conversely, it is possible that studies employing ROI analysis of brain activation may have influenced the resulting activation patterns and potentially biased the results. Previous activation likelihood estimation studies144 did not acknowledge this methodological limitation and included whole-brain and ROI studies in the same database. To our knowledge, the present study is the first activation likelihood estimation meta-analysis to address this point by providing details of the methodological approach adopted in each study.
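The core of the activation likelihood estimation approach discussed above can be illustrated with a minimal sketch: each reported peak coordinate is modelled as a 3D Gaussian probability distribution, the per-study "modelled activation" values are combined voxel-wise as a probabilistic union, and the per-study maps are then unioned across studies, so that convergent foci from independent studies score higher than isolated ones. The grid, coordinates, FWHM value and function names below are hypothetical toy choices for illustration, not the parameters of the present meta-analysis.

```python
import numpy as np

def gaussian_ma_map(foci, grid, fwhm=12.0):
    """Modelled activation (MA) map for one study: each reported focus
    (x, y, z) is blurred with a normalized 3D Gaussian, and per-voxel
    values are combined as the probability that at least one focus of
    the study lies at that voxel."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    norm = (2.0 * np.pi * sigma ** 2) ** 1.5
    ma = np.zeros(len(grid))
    for focus in foci:
        d2 = np.sum((grid - focus) ** 2, axis=1)
        p = np.exp(-d2 / (2.0 * sigma ** 2)) / norm  # Gaussian probability
        ma = 1.0 - (1.0 - ma) * (1.0 - p)            # probabilistic union
    return ma

def ale_map(studies, grid, fwhm=12.0):
    """ALE map: probabilistic union of the per-study MA maps."""
    ale = np.zeros(len(grid))
    for foci in studies:
        ma = gaussian_ma_map(np.asarray(foci, float), grid, fwhm)
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)
    return ale

# Toy 1D "slice" of coordinate space: two studies report nearby foci
# around x = -20 mm, a third reports an isolated focus at x = 30 mm.
grid = np.array([[x, 0.0, 0.0] for x in range(-40, 41, 4)])
studies = [[(-20.0, 0.0, 0.0)], [(-22.0, 0.0, 0.0)], [(30.0, 0.0, 0.0)]]
ale = ale_map(studies, grid)
# The peak of the ALE map lies where evidence converges across studies.
print(grid[int(np.argmax(ale))])
```

This sketch also makes the "file drawer" point concrete: a study reporting no foci contributes an all-zero MA map, which leaves the union unchanged, so excluded negative results have little effect on the ALE scores.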
Functional neuroimaging is currently advancing from the simple detection and localization of cortical activation to the investigation of complex cortical processes and the associated functional relations between cortical areas. Such research questions can no longer be addressed by the isolated analysis of single experiments alone, but necessitate the consolidation of results across different cognitive tasks and experimental paradigms. This again makes meta-analysis an increasingly important part of the evaluation of functional imaging results.8 A combination of functional imaging and electrophysiological techniques (i.e., fMRI and electroencephalography) may also represent a future research instrument, combining the high spatial resolution of the former with the high temporal resolution of the latter. Finally, although the evidence suggests that defined areas of the brain are responsive to specific facial expressions, recent evidence indicates neuronal subspecialization in face-specific brain regions for different components of facial perception, such as facial expression, viewing of familiar and unfamiliar faces, and discrimination of spatial relations of facial features (i.e., eyes, lips or nose).28 Investigations into the neural systems underlying processing of these cues for each of the basic emotions may be helpful to further elucidate their neural representation.
The wide neurofunctional network underlying human faces processing includes a number of visual, limbic, temporoparietal, prefrontal and subcortical areas as well as the cerebellum. Whereas occipital areas and the cerebellum were commonly activated across different emotions, a discrete response to valence was observed in the limbic system and insular cortex. Although the basic emotions are not represented by entirely distinct neural circuits, they are at least partially separable. Sex, age and explicit versus implicit processing modulate the neurophysiological response to human emotional faces.
We thank Dr. Angela Laird for her valuable advice with the meta-analytical package.
Competing interests: None declared.
Contributors: Drs. Fusar-Poli, Placentino, Allen, Barale, Perez, McGuire and Politi designed the study. Drs. Fusar-Poli, Carletti, Landi, Abbamonte and Gasparotti acquired the data, which Drs. Fusar-Poli, Surguladze, Benedetti and McGuire analyzed. Drs. Fusar-Poli, Landi and Perez wrote the article, which all other authors reviewed. All authors approved the final version for publication.