Intentionally adopting a discrete emotional facial expression can modulate the subjective feelings corresponding to that emotion; however, the underlying neural mechanism is poorly understood. We therefore used functional brain imaging (functional magnetic resonance imaging) to examine brain activity during intentional mimicry of emotional and non-emotional facial expressions and relate regional responses to the magnitude of expression-induced facial movement. Eighteen healthy subjects were scanned while imitating video clips depicting three emotional (sad, angry, happy) and two ‘ingestive’ (chewing and licking) facial expressions. Simultaneously, facial movement was monitored from displacement of fiducial markers (highly reflective dots) on each subject's face. Imitating emotional expressions enhanced activity within right inferior prefrontal cortex. This pattern was absent during passive viewing conditions. Moreover, the magnitude of facial movement during emotion-imitation predicted responses within right insula and motor/premotor cortices. Enhanced activity in ventromedial prefrontal cortex and frontal pole was observed during imitation of anger, in ventromedial prefrontal and rostral anterior cingulate cortices during imitation of sadness, and in striatal, amygdala and occipitotemporal regions during imitation of happiness. Our findings suggest a central role for right inferior frontal gyrus in the intentional imitation of emotional expressions. Further, by entering metrics for facial muscular change into the analysis of brain imaging data, we highlight shared and discrete neural substrates supporting the affective, action and social consequences of somatomotor emotional expression.
Conceptual accounts of emotion embody experiential, perceptual, expressive and physiological modules (Izard et al., 1984) that interact with each other, and influence other psychological processes, including memory and attention (Dolan, 2002). In dynamic social interactions, the perception of another's facial expression can induce a ‘contagious’ or complementary subjective experience and a corresponding facial musculature reaction, evident in facial electromyography (EMG) (Dimberg, 1990; Harrison et al., 2006). Further, the relationship between facial muscle activity and emotional processing is reciprocal: emotional imagery is accompanied by changes in facial EMG that reflect the valence of one's thoughts (Schwartz et al., 1976). Conversely, intentionally adopting a particular facial expression can influence and enhance subjective feelings corresponding to the expressed emotion (Ekman et al., 1983; review, Adelmann and Zajonc, 1989). To explain this phenomenon, Ekman (1992) proposed a ‘central, hard-wired connection between the motor cortex and other areas of the brain involved in directing the physiological changes that occur during emotion’.
Neuroimaging studies of emotion typically probe neural correlates of the perception of emotive stimuli or of induced subjective emotional experience. A complementary strategy is to use objective physiological or expressive measures to identify activity correlating with the magnitude of emotional response. Thus, activity in the amygdala predicts the magnitude of heart rate change (Critchley et al., 2005) and electrodermal response to emotive stimuli (Phelps et al., 2001; Williams et al., 2004).
Facial expressions are overtly more differentiable than internal autonomic response patterns. In the present study, we used the objective measurement of facial movement to index the expressive dimension of emotional processing. Our approach hypothesises that the magnitude of facial muscular change during emotional expression ‘resonates’ with activity related to emotion processing (Ekman et al., 1983; Ekman, 1992). Thus, we predicted that brain activity correlating with facial movement, when subjects adopt emotional facial expressions, will extend beyond classical motor regions (i.e. precentral gyrus, premotor region and supplementary motor area) to engage centres supporting emotional states. Recently, a ‘mirror neuron’ system (MNS; engaged both when observing and when performing the same action) has been proposed to play an important role in imitation, involving the inferior frontal gyrus, Brodmann area 44 (BA 44) (Rizzolatti and Craighero, 2004). While clinical studies suggest right hemisphere dominance in emotion expression, the neuroimaging evidence is equivocal (Borod, 1992; Carr et al., 2003; Leslie et al., 2004; Blonder et al., 2005). One focus of our analysis was to clarify evidence for right hemisphere specialisation in BA 44 for emotion expression.
We measured regional brain activity using functional magnetic resonance imaging (fMRI) while indexing the facial movement during imitation of emotional and non-emotional expressions (see ‘Materials and Methods’ section). Subjects were required to imitate dynamic video stimuli portraying angry, sad and happy emotional expressions and non-emotional (ingestive) expressions of chewing and licking. Evidence suggests that facial expressions may intensify subjective feelings arising from emotion-eliciting events (Dimberg, 1987; Adelmann and Zajonc, 1989). We therefore predicted that neural activity, besides motor regions and MNS, would correlate with the magnitude of facial movement during emotion mimicry. Moreover, we predicted that activity within regions implicated in representations of pleasant feeling states and reward (including ventral striatum) would be enhanced during imitation of happy expressions, activity within regions associated with sad feeling states (notably subcallosal cingulate cortex) would be enhanced during imitation of sad faces (Mayberg et al., 1999; Phan et al., 2002; Murphy et al., 2003), and regions associated with angry feeling/aggression modulation (putatively, ventromedial prefrontal region) would be enhanced while imitating angry faces (Damasio et al., 2000; Pietrini et al., 2000; Dougherty et al., 2004). Further, since facial movement communicates social motives, we also predicted the engagement of brain regions implicated in social cognition (including superior temporal sulcus) during emotion mimicry (Frith and Frith, 1999; Wicker et al., 2003; Parkinson, 2005).
We recruited 18 healthy right-handed volunteers (mean age, 26 years; 9M, 9F). Each gave informed written consent to participate in an fMRI study approved by the local Ethics Committee. Subjects were screened to exclude history or evidence of neurological, medical or psychological disorder including substance misuse. None of the subjects was taking medication.
Experimental stimuli consisted of short movies of five dynamic facial expressions (representing anger, sadness, happiness, chewing and licking) performed by four male and four female models. All of the models received training before videotaping and half of them had previous acting experience or a drama background. Subjects performed an incidental sex-judgement task, signalling the gender of the models via a two-button, hand-held response pad. To ensure subjects focused on the faces, the hair was removed in post-processing of the video stimuli, see Figure 1 (i).
The experiment was split into three sessions each consisting of eight interleaved blocks. A block was either imitation (IM), where the subjects imitated the movies, or passive viewing (PV), where the subjects just passively viewed the video stimuli. Within each block there were two trials for each facial expression, and the order of the trials was randomised. Thus, each subject viewed a total of 24 trials of IM or PV for each facial expression. In the IM blocks, the subjects were instructed to mimic, as accurately as possible, the movements depicted on video clips.
On each trial, the video (movie) clip lasted 0.7s. Four seconds after the movie onset, a white circle was presented on the screen for 0.5s to cue the response (gender judgment). On IM blocks, the subject imitated the facial expression during the interval between the movie offset and gender response cue. Trial onset asynchrony was jittered between 6.5 and 8.5s (average 7.5s) to reduce anticipatory effects. Each session lasted 12min and 30s. The whole experiment lasted ~40min and the trial structure is illustrated in Figure 1 (iii). The leading block of the three sessions was either IM–PV–IM or PV–IM–PV, counterbalanced across subjects.
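The jittered trial schedule described above can be sketched as follows. This is a minimal reconstruction, not the authors' script: the 80-trials-per-session count and the random seed are illustrative assumptions, and we only assume the stated uniform jitter between 6.5 and 8.5 s (mean 7.5 s).

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, for reproducibility of this sketch only

def jittered_onsets(n_trials, soa_min=6.5, soa_max=8.5):
    """Trial onset times (s) with onset asynchrony drawn uniformly
    from [soa_min, soa_max], giving a mean SOA of 7.5 s."""
    soas = rng.uniform(soa_min, soa_max, size=n_trials - 1)
    return np.concatenate(([0.0], np.cumsum(soas)))

onsets = jittered_onsets(80)  # assumed number of trials per session, for illustration
```

Drawing each onset asynchrony independently from a uniform interval keeps the average pacing constant while making the exact onset of any one trial unpredictable, which is the anticipation-reducing property the paragraph describes.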
In scanner, three facial markers (dots) were placed on the face according to the electrode sites suggested in the facial EMG guidelines of Fridlund and Cacioppo (1986). The first dot, D1, was affixed directly above the brow on an imaginary vertical line that traverses the endocanthion, while the second, D2, was positioned 1cm lateral to, and 1cm inferior to, the cheilion and the third, D3, was placed 1cm inferior and medial to the midpoint of an imaginary line joining the cheilion and the preauricular depression. Their movement conveyed, respectively, the activities of corrugator supercilii, depressor anguli oris and zygomaticus major. Activity of corrugator supercilii and depressor anguli oris is associated with negative emotions (including anger and sadness) and zygomaticus major with happiness (Schwartz et al., 1976; Dimberg, 1987; Hu and Wan, 2003). The facial markers were located on the left side of the face, consistent with studies reporting more extensive left-hemiface movement during emotional expression (Rinn, 1984). The dots were made from highly reflective material (3M™ Scotchlite™ Reflective Material), were 2mm in diameter and weighed 1mg. We adjusted the eye-tracker system to record dot position using infrared light luminance in darkness. The middle part of the subject's face was obscured by part of the head coil. Dot movement was recorded on video (frame width × height, 480 × 720 pixels; frame rate, 30 frames per second), see Figure 1 (ii). The analysis of facial movement used a brightness threshold to delineate the dot position from the central point of the marker. Dot movement was calculated as the maximal deviation from baseline within 4s after stimulus onset, where the baseline was defined as the average position of the dot in the preceding 10 video frames. During imitation of sadness and anger, the magnitude of facial change was taken from the summed movement of D1 and D2.
During imitation of happiness, facial change was measured from movement of D3 and, for chewing and licking, from D2. We adopted the simplest linear metric of movement in our analyses. After the experiment, movie segments of 5s were constructed for each imitation trial. Each segment was visually appraised by the experimenter to identify correct and incorrect responses and to exclude the presence of confounding ‘contaminating’ movements.
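The movement metric described above can be sketched as follows. This is a minimal reconstruction under stated assumptions: frames are taken to be 8-bit greyscale arrays, and the brightness threshold of 200 and the synthetic 2×2-pixel dot are illustrative, not the study's actual values.

```python
import numpy as np

FPS = 30              # video frame rate reported above (frames per second)
BASELINE_FRAMES = 10  # frames averaged to define the resting dot position
WINDOW_S = 4          # seconds after stimulus onset searched for peak movement

def dot_centroid(frame, threshold=200):
    """Locate the reflective dot as the centroid of above-threshold pixels."""
    ys, xs = np.nonzero(frame >= threshold)
    return np.array([ys.mean(), xs.mean()])

def movement_magnitude(frames, onset, threshold=200):
    """Maximal Euclidean deviation (pixels) from baseline within 4 s of
    stimulus onset, with baseline defined as the mean dot position over
    the 10 frames preceding the onset."""
    centroids = np.array([dot_centroid(f, threshold) for f in frames])
    baseline = centroids[onset - BASELINE_FRAMES:onset].mean(axis=0)
    window = centroids[onset:onset + WINDOW_S * FPS]
    return np.linalg.norm(window - baseline, axis=1).max()
```

For sadness and anger, the per-trial metric would then be the sum of this quantity computed separately for D1 and D2, as described above.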
We acquired sequential T2*-weighted echoplanar images (Siemens Sonata, 1.5-T, 44 slices, 2.0mm thick, TE 50ms, TR 3.96s, voxel size 3 × 3 × 3mm3) for blood oxygenation level dependent (BOLD) contrast. The slices covered the whole brain in an oblique orientation of 30° to the anterior–posterior commissural line to optimise sensitivity to orbitofrontal cortex and medial temporal lobes (Deichmann et al., 2003). Head movement was minimised during scanning by comfortable external head restraint. 196 whole-brain images were obtained over 13min for each session. The first five echoplanar volumes of each session were not analysed to allow for T1-equilibration effects. A T1-weighted structural image was obtained for each subject to facilitate anatomical description of individual functional activity after coregistration with fMRI data.
We used SPM2 software (http://www.fil.ion.ucl.ac.uk/spm/spm2.html/) on a Matlab platform (MathWorks) to analyse the fMRI data. Scans were realigned (motion-corrected), spatially transformed to standard stereotaxic space (with respect to the Montreal Neurological Institute (MNI) coordinate system) and smoothed (Gaussian kernel full-width half-maximum, 8mm) prior to analysis. Task-related brain activity was identified within the general linear model. Separate design matrices were constructed for each subject to model, firstly, the presentation of video face stimuli as event inputs (delta functions) and, secondly, the magnitudes of movement of the dots on the face as parametric inputs. For clarity, in what follows we refer to the resultant statistical parametric maps (SPMs) as the ‘categorical SPM’ and the ‘parametric SPM’, respectively. Data from 16 subjects were entered in the parametric SPM analyses; two subjects were excluded because of incomplete video recordings of facial movement.
In individual subject analyses, low-frequency drifts and serial correlations in the fMRI time series were respectively accounted for using a high-pass filter (constructed by discrete cosine basis functions) and non-sphericity correction, created by modelling a first degree autoregressive process (http://www.fil.ion.ucl.ac.uk/spm/; Friston et al., 2002). Error responses representing trials in which a subject incorrectly imitated the video clip were detected from recorded movies and modelled separately within the design matrix. Activity related to stimulus events was modelled separately for the five different categories of facial expressions using a canonical haemodynamic response function (HRF) with temporal and spatial dispersion derivatives (to compensate for discrepant characteristics of haemodynamic responses). In categorical SPM analyses, contrast images were computed for activity differences of imitation minus passive viewing for each stimulus category. These were entered into group level (second level) analyses employing an analysis of variance (ANOVA) model.
Second level random effect analyses were performed separately as F-tests of event-related activity (categorical SPM) and F-tests of the parametric association between facial movement and BOLD response (parametric SPM). The statistical threshold was set at 0.05, corrected, for the former, and at 0.0001, uncorrected, for the latter. We made the assumption that ingestive and emotional facial expressions are not comparable in terms of underlying mental processes, and consequently avoided the subtraction logic (e.g. smiling minus chewing) commonly employed in neuroimaging studies. To constrain our analysis to brain regions specific to imitation of emotion, we used an exclusive mask representing the conjunction of activity elicited by the two ingestive facial expressions (IGs) in both categorical and parametric SPMs. We examined parameter estimates at peak coordinates to distinguish activations from deactivations in the F-tests.
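The distinction between the categorical and parametric models can be sketched as follows. This is an illustrative reconstruction, not SPM2's implementation: the simple double-gamma haemodynamic response function and the toy onsets and magnitudes stand in for SPM's canonical HRF and the real event timings.

```python
import numpy as np
from scipy.stats import gamma

TR = 3.96  # repetition time (s) reported above

def canonical_hrf(tr=TR, duration=32.0):
    """A simple double-gamma approximation to the canonical HRF,
    sampled at the scan repetition time."""
    t = np.arange(0.0, duration, tr)
    return gamma(6).pdf(t) - gamma(16).pdf(t) / 6.0

def build_regressors(onsets_s, magnitudes, n_scans, tr=TR):
    """Return the categorical (delta-function) regressor and the parametric
    regressor (deltas scaled by mean-centred movement magnitudes), each
    convolved with the HRF and truncated to the scan series length."""
    sticks = np.zeros(n_scans)
    pmod = np.zeros(n_scans)
    idx = np.round(np.asarray(onsets_s) / tr).astype(int)
    sticks[idx] = 1.0
    pmod[idx] = np.asarray(magnitudes) - np.mean(magnitudes)
    hrf = canonical_hrf(tr)
    return (np.convolve(sticks, hrf)[:n_scans],
            np.convolve(pmod, hrf)[:n_scans])
```

Mean-centring the movement magnitudes is what separates the two maps: the categorical regressor captures the average event-related response, while the parametric regressor captures only trial-to-trial variation tied to how much the face moved.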
Subjects imitated emotional and ingestive facial expressions from the video clips with >90% accuracy (error rates: angry face 7.1%, sad face 3.8%, happy face 1.6%, chewing face 6.3% and licking face 1.9%). Movement of each of the three facial markers reflected the differential imitation of the facial expression conditions [D1, F = 5.66 (P = 0.016); D2, F = 5.507 (P = 0.007); and D3, F = 17.828 (P < 0.001), under sphericity correction]. Since the facial markers were very light, no subject reported having been aware of the three dots when asked after scanning.
To test for the possibility of confounding head movement during expression imitation trials, we assessed the displacement parameters (mm) used in the realignment calculations during pre-processing of the functional scan time series (entered for each subject within SPM). The mean displacements for IM and PV blocks were, respectively: −0.009 (s.d. 0.037) and 0.004 (s.d. 0.042) along the X-direction, 0.091 (s.d. 0.086) and 0.114 (s.d. 0.071) along the Y-direction, and 0.172 (s.d. 0.147) and 0.162 (s.d. 0.171) along the Z-direction. The mean rotation parameters (rad) for IM and PV blocks were 0.0002 (s.d. 0.0030) and −0.0011 (s.d. 0.0035) around pitch, 0.0002 (s.d. 0.0010) and 0.0000 (s.d. 0.0013) around roll, and 0.0001 (s.d. 0.0007) and −0.0003 (s.d. 0.0008) around yaw. For these six parameters, paired t-tests of IM against PV did not reach statistical significance (df = 17).
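A check of this kind can be sketched as follows; the per-subject values here are synthetic placeholders drawn from the group means and standard deviations reported above, not the study's actual data.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects = 18  # as in the study (df = 17)

# Hypothetical per-subject mean Z-displacements (mm) for IM and PV blocks;
# drawn independently here purely to illustrate the test call.
im = rng.normal(0.172, 0.147, size=n_subjects)
pv = rng.normal(0.162, 0.171, size=n_subjects)

# Paired (related-samples) t-test across subjects, df = n_subjects - 1.
t_stat, p_value = ttest_rel(im, pv)
```

The same call would be repeated for each of the six realignment parameters, with a non-significant p-value indicating no systematic head-motion difference between imitation and passive-viewing blocks.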
Bilateral somatomotor cortices (precentral gyrus, BA 4 and 6) were activated during imitation of all five emotional and ingestive facial expressions, compared with passive viewing. Imitation of emotions (IEs), compared with imitation of IGs, enhanced activity within the right inferior frontal gyrus (BA 44) (Figure 2, Table 1). A condition-by-hemisphere interaction (contrasting subject-specific contrast images with the equivalent midline-‘flipped’ images) did not reach statistical significance (at P = 0.001, with a region-of-interest analysis at BA 44), consistent with a relative, rather than absolute, lateralisation of the BA 44 emotion-related response. Bilateral BA 44 activity was observed in the categorical SPM at an uncorrected P-value of 0.0001.
In addition to BA 44, the three IE conditions all evoked activity within medial prefrontal gyrus (BA 6), anterior cingulate cortex (BA 24/32), left superior temporal gyrus (BA 38) and left inferior parietal lobule (BA 40). Emotion-specific patterns of activity change were also noted in these categorical analyses: imitation of angry facial expressions was associated with selective activation of the left lingual gyrus (BA 18). Similarly, imitation of happy facial expressions was associated with selective activation of the lentiform nucleus (globus pallidus) (P < 0.05, corrected; activity related to non-emotional IGs was used as an exclusive mask; Table 2).
Electrophysiological evidence suggests that passive viewing of emotional facial expressions can evoke facial EMG responses reflecting automatic motor mimicry of facial expressions (Dimberg, 1990; Rizzolatti and Craighero, 2004). We tested whether passive viewing of expressions (in contrast to viewing a static neutral face) evoked activity within the MNS. We failed to observe activation within MNS at the threshold significance of P < 0.05, corrected (or even at P < 0.001, uncorrected; Table 3). However, at this uncorrected threshold, enhanced activity was observed within precentral gyrus across angry, happy and chewing conditions.
During all five (emotional and ingestive) expression imitation conditions, facial movement correlated parametrically with activity in bilateral somatomotor cortices (precentral gyrus, BA 4/6). Moreover, when imitating the three emotional expressions (IE conditions), facial movement correlated with activity within the inferior frontal gyrus (BA 44), medial frontal gyrus (BA 6) and the inferior parietal lobule (BA 39/40), in a pattern resembling that observed in the categorical SPM analysis (Figure 3). After applying the conjunction of the ingestive-expression parametric SPMs as an exclusive mask (Table 4), we also observed right insula activation across all three IEs. Interestingly, the categorical activation within anterior cingulate cortex (BA 24/32) did not vary parametrically with movement during these IE conditions.
We were able to further dissect distinct activity patterns evoked during imitation of each emotional expression (IE trials) that correlated with the degree of facial movement (analyses were constrained by an exclusive mask of the non-emotional IG-related activity). Ventromedial prefrontal cortex (BA 11), bilateral superior prefrontal gyrus (BA 10) and bilateral lentiform nuclei reflected parametrically the degree of movement when imitating angry facial expressions (but were absent in the categorical SPM analysis of anger imitation, even when the statistical threshold was set at the same uncorrected 0.0001 level). Conversely, activity within the lingual gyrus was absent in the parametric SPM but present in the categorical SPM analysis.
Activity in ventromedial prefrontal cortex (BA 11) likewise covaried with facial movement during imitation of sad facial expressions, representing an additional activation compared with the categorical SPM. Since activation of BA 11 was present during imitation of sad and angry faces, but absent during imitation of happy, chewing and licking faces, it may reflect specific, perhaps empathetic, processing of negative emotions. Other areas activated in the parametric SPM during imitation of sad expressions included rostral anterior cingulate (BA 32) and right temporal pole (BA 38).
The degree of facial movement during imitation of happy facial expressions correlated parametrically with activity in bilateral lentiform nucleus, bilateral temporal pole (BA 38), bilateral fusiform gyri (BA 37), right posterior superior temporal sulcus (BA 22), right middle occipital gyrus (BA 18), right insula (BA 13) and, notably, left amygdala (Figure 4, Table 5).
Our study highlights the inter-relatedness of imitative and internal representations of emotion by demonstrating engagement of brain regions supporting affective behaviour during imitation of emotional, but not non-emotional, facial expressions. Moreover, our study applies novel methods to the interpretation of neuroimaging data in which metrics for facial movement delineate the direct coupling of regional brain activity to expressive behaviour.
Explicitly imitating the facial movements of another person non-specifically engaged somatomotor and premotor cortices. In addition, imitating both positive and negative emotional expressions activated the right inferior frontal gyrus, BA 44. The human BA 44 is proposed to be a critical component of an action-imitation MNS: mirror neurons were first described in non-human primates and are activated both when one observes another performing an action and when one executes the same action oneself. Mirror neurons, sensitive to hand and mouth actions, are reported in monkey premotor, inferior frontal (F5) and inferior parietal cortices (Buccino et al., 2001; Rizzolatti et al., 2001; Ferrari et al., 2003; Rizzolatti and Craighero, 2004). The human homologue of F5 covers part of the precentral gyrus and extends into the inferior frontal gyrus (BA 44 pars opercularis). In primates, including humans, the MNS is suggested as a neural basis for imitation and learning, permitting the direct, dynamic transformation of sensory representations of action into corresponding motor programmes. Thus explicit imitation, as in our study, maximises the likelihood of engaging the MNS. At an uncorrected statistical threshold (P = 0.0001), we observed activation of bilateral inferior frontal gyri and inferior parietal lobules for all five imitation conditions (Buccino et al., 2001; Carr et al., 2003; Leslie et al., 2004), concordant with current knowledge of the imitation network (Rizzolatti and Craighero, 2004).
Nevertheless, we had also predicted activation of the MNS, albeit at reduced magnitude, during passive viewing, but were unable to demonstrate this even at a generous statistical threshold (P = 0.001, uncorrected). Across other studies, evidence for passive engagement of BA 44 pars opercularis when watching facial movements is rather equivocal (Buccino et al., 2001; Carr et al., 2003; Leslie et al., 2004). One factor that may underlie these differences is attentional focus: in our study, the subjects performed an incidental gender discrimination task so that attention was diverted from the emotion. In fact, it is plausible that the human MNS is necessarily sensitive to intention and attention, to constrain adaptively any interference to goal-directed behaviours from involuntarily mirroring signals within a rich social environment.
The right, and to a lesser extent the left, inferior frontal gyrus was engaged during the imitation of emotional facial expressions. In fact, despite clinical anatomical evidence for the dependency of affective behaviours on the integrity of right hemisphere, including prosody and facial expression (Ross and Mesulam, 1979; Gorelick and Ross, 1987; Borod, 1992), we showed only a relative, not absolute, right lateralised predominance of BA 44 activation. Besides the MNS, there are other possible accounts for enhanced activation within inferior frontal gyri. It is possible, for example, that the imitation condition (relative to passive viewing) enhances the semantic processing of emotional/communicative information, thereby enhancing activity within inferior frontal gyri (George et al., 1993; Hornak et al., 1996; Nakamura et al., 1999; Kesler-West et al., 2001; Hennenlotter et al., 2005). Activation of BA 44 would thus reflect an interaction between facial imitative engagement and interpretative semantic retrieval.
We also observed emotion-specific engagement of a number of other brain regions, notably inferior parietal lobule (BA 40), medial frontal gyrus (BA 6), anterior cingulate cortex [BA 24/32, anterior cingulate cortex (ACC)] and insula. Each of these brain regions is implicated in components of imitative behaviours: the inferior parietal lobule supports ego-centric spatial representations and cross-modal transformation of visuospatial input to motor action (Buccino et al., 2001; Andersen and Buneo, 2002). Correspondingly, damage to this region may engender ideomotor apraxia (Rushworth et al., 1997; Grezes and Decety, 2001). Similarly, the medial frontal gyrus [BA 6, supplementary motor area (SMA)] is implicated in the preparation of self-generated sequential motor actions (Marsden et al., 1996) and dorsal ACC is associated with voluntary and involuntary motivational behaviour and control including affective expression (Devinsky et al., 1995; Critchley et al., 2003; Rushworth et al., 2004). In monkeys, SMA and ACC contain accessory cortical representations of the face and project bilaterally to brainstem nuclei controlling facial musculature (Morecraft et al., 2004). Positron emission tomography (PET) evidence suggests a homology between human and non-human primate anatomy in this respect (Picard and Strick, 1996). Lastly, insula cortex, where activity also correlated with magnitude of facial muscular movement during emotional expressions, is implicated in perceptual and expressive aspects of emotional behaviour (Phillips et al., 1997; Carr et al., 2003). Insula cortex is proposed to support subjective and empathetic feeling states yoked to autonomic bodily responses (Critchley et al., 2004; Singer et al., 2004b). 
It is striking that the activation of these brain regions [particularly BA 44 pars opercularis and insula, which contain primary taste cortices (Scott and Plata-Salaman, 1999; O'Doherty et al., 2002)] was not strongly coupled to the imitation of ingestive expressions (Tables 4 and 5). However, our observation of emotional engagement of a distributed matrix of brain regions during imitative behaviour highlights the primary salience of communicative affective signals (compared with non-communicative ingestive actions) in guiding social interactions. In this regard, we hypothesise that cinguloinsula coupling supports an affective set critical to this apparent selectivity of prefrontal and parietal cortices.
In addition to defining regional brain activity patterns mediating social affective interaction, a key motivation of our study was to dissociate, using emotional mimicry, neural substrates supporting specific emotions. These effects were most striking when the magnitude of facial movement was used to identify ‘resonant’ emotion-specific activity. Thus, across the imitation of three emotions, enhanced activity within the right insular region might reflect representation of feeling states that have their origin in interoception (Critchley et al., 2004). Correlated activity in bilateral lentiform nuclei during the imitation of angry faces might reflect goal-directed behaviour (Hollerman et al., 2000). Anger-imitation also engaged bilateral frontal polar cortices (BA 10). The frontal poles are implicated in a variety of cognitive functions including prospective memory and self-attribution (Okuda et al., 1998; Ochsner et al., 2004). Nevertheless, underlying these roles, BA 10 is suggested to support a common executive process, namely the ‘voluntary switching of attention from an internal representation to an external one…’ (Burgess et al., 2003). Within this framework, BA 10 activity may be evoked during anger imitation because subjects are required to suppress pre-potent reactive responses in order to effect a confrontational external expression. Recently, Hunter et al. (2004) reported bilateral frontal pole activation during action execution, which further suggests that, in our study, bilateral BA 10 activation during the imitation of anger might be related to the prominent behavioural dimension of anger expression.
Activity within ventromedial prefrontal cortex (VMPC) correlated significantly with the degree of facial muscle movement when mimicking both angry and sad expressions (Figure 4), suggesting a specific association between the activity of this region and expression of negative emotions (Damasio et al., 2000; Pietrini et al., 2000). A direct relationship was also observed between activity in the adjacent rostral ACC, very close to subgenual cingulate, and facial muscular movement during imitation of sadness. This region is implicated in subjective experience of sadness and with dysfunctional activity during depression (Mayberg et al., 1999; Liotti et al., 2000).
In contrast, the more the subjects smiled in imitation of happiness (degree of movement of zygomaticus major), the greater the enhancement of activity in cortical and subcortical brain regions including the globus pallidus, amygdala, right posterior superior temporal sulcus (STS) and fusiform cortex. This pattern of activity suggests recruitment, in the context of positive affect, of regions ascribed to the ‘social brain’ (Brothers, 1990). The globus pallidus is a ventral striatal region implicated in dopaminergic reward representations (Ashby et al., 1999; Elliott et al., 2000) and affective approach behaviours (Arkadir et al., 2004). It is interesting that basal ganglia activation was observed during imitation of angry and happy faces but not of sad faces, as both anger and happiness carry an approach-related action tendency. The right posterior STS is particularly implicated in processing social information from facial expression and movement (Perrett et al., 1982; Frith and Frith, 1999). The recruitment of this region during posed facial expression further endorses its contribution to emotional communication beyond a merely sensory representation of categorical visual information. The preferential recruitment of these visual cortical regions when imitating expressions of happiness emphasises the importance of reciprocated signalling of positive affect to social engagement and approach behaviour; signals of rejection may in effect turn off ‘social’ brain regions. This argument is particularly pertinent when considering the activation evoked in the left amygdala when smiling: although much literature is devoted to the role of the amygdala in processing threat and fear signals, the region is sensitive to affective salience and intensity of emotion, independent of emotion type (Buchel et al., 1998; Morris et al., 2001; Hamann and Mao, 2002; Morris et al., 2002; Winston et al., 2003).
Thus, reciprocation of a smile (a signal of acceptance and approach) permits privileged access to social brain centres. Smiling may thus represent a more salient and socially committing (or perhaps risky) behaviour than imitation of other expressions.
A specific consideration is that even though our parametric analysis explored neural activity correlating with facial movements, our findings do not constitute direct evidence for the causal generation of emotions by facial movements. Nevertheless, the context of our experiment (expression mimicry) embodies social affective interaction and is distinct from intentional ‘non-emotional’ muscle-by-muscle mobilisation of posed facial expression (Ekman et al., 1983). By highlighting the modulation of neural activity in brain regions implicated in emotional processing, our findings supplement and extend the data showing that ‘facial efference’, when congruent with emotional stimuli, can modulate subjective emotional state (review, Adelmann and Zajonc, 1989). In addition to experiential, reactive and social cognitive dimensions, emotions interact with psychological constructs and their underlying neural mechanisms (Ekman, 1997; Dolan, 2002). Consequently, interpretations of the results of our parametric analysis may extend beyond social affective inferences to include interactions with other cognitive functions, including concurrent mnemonic, anticipatory, psychophysiological processes and so on (Ekman, 1997); however, the evidence supporting their relationship with facial expression is either inconsistent or lacking (review, Barrett, 2006). Nevertheless, our results endorse the proposal that emotional facial mimicry is not purely a motoric behaviour, but engages distinctive neural substrates implicated in emotion processing.
To summarise, our findings define shared and dissociable substrates for affective facial mimicry. We highlight, first, the primacy of affective behaviours in engaging action-perception (mirror-neuron) systems and, second, a subsequent valence-specific segregation of emotional brain centres. At a methodological level, our study illustrates how the magnitude of facial muscular movements can enhance sensitivity in the identification of emotion-related neural activity. The face conveys abundant information communicating internal emotional state to hermeneutically inform social cognition and the dynamics of human interaction (Singer et al., 2004a).
T.-W.L. is supported by a scholarship from Ministry of Education, Republic of China, Taiwan. H.D.C., R.J.D. and O.J. are supported by the Wellcome Trust.