In the studies reviewed in this paper, ERPs were measured in response to emotional and neutral faces. We investigated the onset, time course, and topographic distribution of brain responses to emotional faces in order to learn more about functional properties of emotional face processing. These studies have revealed robust and replicable differential ERP responses to emotional versus neutral faces. In all experiments reviewed in this paper, emotional faces triggered an increased positivity relative to neutral faces. The onset of this emotional expression effect was remarkably early, ranging from 120 ms post-stimulus when faces were presented at fixation and no competing non-face stimuli were simultaneously present (Eimer & Holmes, 2002
) to 180 ms post-stimulus when emotional faces were presented at peripheral locations together with other stimuli close to fixation (Eimer et al., 2003
). While we assume that factors such as the retinal position of faces and the presence versus absence of potentially competing non-face stimuli in the display are largely responsible for these onset latency differences, other cognitive factors might also determine the onset of these effects. The presence of an explicit emotion task is unlikely to play an important role, given that very early emotional expression effects were observed under conditions where participants had to detect immediate repetitions of both face and non-face stimuli, and emotional expression was entirely task-irrelevant (Eimer & Holmes, 2002
). The intriguing possibility that the onset of these effects might also vary as a function of participants’ trait or state anxiety (see Bishop, Duncan, & Lawrence, 2004
; Fox, 2002
, for recent behavioural and neuroimaging evidence for an impact of anxiety on emotional processing) will be investigated in future studies.
The initial phase of the emotional expression effects observed in the studies reviewed here showed a frontocentral scalp distribution, while their later phase beyond 250 ms post-stimulus was more broadly distributed, strongly suggesting that different neural generators are activated during the early and later stages of this emotional positivity. While similar broadly distributed longer-latency positivities have been reported in previous ERP studies for emotionally salient non-face stimuli (Cuthbert et al., 2000; Diedrich et al., 1997; Keil et al., 2002
), differential ERP modulations at latencies below 200 ms have not been found with other types of emotional material, suggesting that such rapid effects might be specific to emotional faces (see Ashley et al., 2003
, for similar early ERP effects of emotional facial expression).
The short onset latency of the emotional expression effects observed in the ERP studies reviewed in this paper might suggest that their early phase reflects the rapid, pre-attentive, automatic assessment of the emotional content of faces, as implemented by structures such as the amygdala. One obvious objection to this hypothesis is that the ERP effects of emotional expression reviewed here are highly unlikely to be generated in the amygdala itself, given its deep location and its electrically closed nuclear structure of clustered neurons (unlike the regular laminar alignment of neurons in the neocortex). Of course, even though amygdala activations themselves are unlikely to produce measurable ERP responses at the scalp surface, rapid automatic amygdala responses triggered by emotional faces might still be relayed directly to neocortical areas (Morris et al., 1998
), where they could be picked up via ERPs. It is conceivable that the early emotional expression effects observed in our ERP studies reflect activity within a neural network involved in the rapid automatic classification of emotional faces, which includes limbic structures as well as interconnected neocortical regions. However, several other findings from our ERP studies provide strong evidence against such a hypothesis. First, we found that ERP emotional expression effects were triggered at comparable latencies, and showed similar amplitudes and scalp topographies for all six basic emotions (Eimer et al., 2003
). Although clearly preliminary (see the cautionary comments in Section 3
), this finding contrasts markedly with the emotion-specificity of the neural structures assumed to be involved in the rapid automatic evaluation of emotional content. Several studies have found a disproportionate activation of the amygdala in response to facial expressions of fear (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1998; Whalen et al., 2001
). Insula and basal ganglia seem to be particularly involved in processing facial expressions of disgust (Adolphs et al., 2003; Calder et al., 2000, 2001; Phillips et al., 1997, 1998; Sprengelmeyer et al., 1998
), and prefrontal areas appear to be specifically implicated in the recognition of angry faces (Blair, Morris, Frith, Perrett, & Dolan, 1999
; Harmer, Thilo, Rothwell, & Goodwin, 2001
). Second, we found that ERP responses to emotional faces are not selectively driven by low spatial frequency (LSF) information (Holmes et al., 2005b
). Again, this finding contrasts with previous observations for structures subserving the automatic classification of emotional input, such as the amygdala, which is preferentially activated by LSF signals (Vuilleumier et al., 2003; Winston et al., 2004
). Finally, fear-specific amygdala activation appears to start only at latencies of around 200 ms (Krolak-Salmon et al., 2004
), and thus considerably later than the early emotional expression effects observed in our studies. This observation casts further doubt on the hypothesis that rapid amygdala signals are responsible for these ERP effects.
The most important evidence against the hypothesis that ERP modulations sensitive to emotional facial expression are generated during the initial automatic classification of emotional content comes from our finding that spatial attention strongly modulated these effects. When faces were presented foveally, early emotional expression effects (but not longer latency effects) were triggered independently of the current focus of attention. Most importantly, however, even these early effects were completely eliminated when attention was directed away from the location of peripheral emotional faces towards another perceptual task (Eimer et al., 2003; Holmes et al., 2003
). This observation is clearly at odds with the idea that early emotional expression effects reflect the pre-attentive registration of facial expression. Recent fMRI evidence suggests that amygdala responses to fearful faces are unaffected by attention (Lane et al., 1999; Vuilleumier et al., 2001
), indicating that the amygdala is part of a network involved in the pre-attentive automatic detection of emotional content. The strong dependence of ERP emotional expression effects on spatial attention demonstrated in our studies implies that the processes responsible for the generation of these effects are functionally distinct from this pre-attentive detection network. One could speculate that this remarkable effect of attention on emotional expression processing, as reflected by the absence of any differential ERP effects for unattended emotional faces, might be mediated by control structures in orbitofrontal cortex. Orbitofrontal regulatory processes can suppress responses to emotional stimuli in the amygdala as well as in higher-order emotion areas (Davidson, 2002
; Freedman, Black, Ebert, & Binns, 1998
; Rolls, 1996
). Orbitofrontal cortex has also been implicated in monitoring and restricting affective impulses through feedback mechanisms (Rolls, 1999
; see also Rolls, 2006
, this issue), and has been posited as a key structure in the reallocation of attention when emotional faces are task irrelevant (Vuilleumier et al., 2001).
It is conceivable that the absence of early emotional expression effects in response to unattended peripheral emotional faces found in our previous studies (Eimer et al., 2003; Holmes et al., 2003
) may reflect an inability of observers to identify facial expressions when attention is directed elsewhere. Although face identification performance was excellent when peripheral faces were attended, the possibility that observers remain unaware of emotional expressions when attention is diverted has not yet been explicitly tested. Links between attention, early emotional expression effects, and the presence versus absence of conscious awareness of facial expressions will need to be systematically investigated in future studies.
It should also be stressed that amygdala activations in response to emotional stimuli do not always or exclusively reflect a rapid and automatic classification of these stimuli. Attentional modulations of amygdala responses to fearful or happy facial expressions have in fact been observed (Pessoa et al., 2002a, 2002b
), and modulations of amygdala activation were also found as a function of top–down control processes involved in the intentional regulation or reinterpretation of affective information (Ochsner, Bunge, Gross, & Gabrieli, 2002
; Schaefer et al., 2002
). These findings suggest that the amygdala also plays an important role during later stages, in which emotional information is processed under attentional control and conscious representations of emotional content are generated.
Overall, our findings that ERP modulations in response to emotional faces were triggered in a very similar fashion for all basic facial expressions, were not selectively driven by low spatial frequency information, and were strongly modulated by attention all suggest that these effects are elicited during stages of emotional processing located beyond the initial rapid and automatic classification of emotional content. The outcome of this initial rapid and automatic appraisal of emotional stimuli is fed into higher neocortical stages of emotional processing, where the evaluation of emotional material is likely to continue in parallel with ongoing emotion evaluation in the amygdala and related brain circuits. We suggest that the emotional expression effects observed in ERP waveforms reflect processes at this later neocortical stage in the processing of emotional information.
At present, it would be premature to speculate in more detail about specific cortical regions where the ERP effects reviewed in this paper might be generated, although regions such as anterior cingulate, somatosensory cortex, or medial prefrontal cortical areas seem plausible candidates. For example, Vuilleumier, Armony, Driver, and Dolan (2001)
have identified a dorsal region of the anterior cingulate that showed greater activation when emotional faces were attended, analogous to our finding that ERP effects of emotional expression are strongly affected by attention. The dorsal subdivision of the anterior cingulate has been implicated in functions such as attentional control and error and response conflict monitoring (Bush, Luu, & Posner, 2000
; Paus, Koski, Caramanos, & Westbury, 1998
; Posner & Rothbart, 1998
), but also as a potential locus of emotional awareness (Lane, Reiman, Axelrod, Holmes, & Schwartz, 1998
), consistent with evidence that this region participates directly in the affective component of personally experienced (Rainville, Duncan, Price, Carrier, & Bushnell, 1997
) or observed (Hutchison, Davis, Lozano, Tasker, & Dostrovsky, 1999) pain.
The analysis of emotional facial expression relies on a complex neural network that includes both a rapid, obligatory, pre-attentive classification of emotional content (implemented within the amygdala, orbitofrontal cortex, and ventral striatum) and a subsequent in-depth analysis of emotional faces in higher order neocortical emotion areas (including somatosensory cortex, anterior cingulate, and medial prefrontal cortices). Although ERP effects of emotional facial expression are triggered at very short latencies, they are likely to reflect processes within the second, higher level and attention-dependent emotional processing system, where representations of emotional content are generated in a strategic and task-dependent fashion for the adaptive, intentional control of behaviour. Further studies will have to investigate to what extent the effects of emotional facial expression on ERP waveforms reviewed here represent the initial stages of higher level emotional processes, which may be involved not only in the analysis of emotionally relevant sensory stimuli, but perhaps also in the representation of subjective emotional states.