Neuroimage. Author manuscript; available in PMC Jan 15, 2008. PMCID: PMC1839041; NIHMSID: NIHMS15228.
Neural dynamics for facial threat processing as revealed by gamma band synchronization using MEG
Qian Luo,*1 Tom Holroyd,2 Matthew Jones,1 Talma Hendler,3,4 and James Blair1
1 Mood and Anxiety Disorders Program, National Institute of Mental Health, Bethesda, MD 20892
2 NIMH MEG Core Facility, National Institute of Mental Health, Bethesda, MD 20892
3 Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD 20892
4 Wohl Institute for Advanced Imaging, Tel-Aviv Sourasky Medical Center, Tel-Aviv 64239, Israel
Correspondence should be addressed to: Qian Luo, Ph.D., Mood and Anxiety Disorders Program, National Institute of Mental Health, National Institutes of Health, 15K North Drive, Room 300C, MSC 2670, Bethesda, Maryland 20892-2670, USA. E-mail: luoj@mail.nih.gov
Facial threat conveys important information about imminent environmental danger. The rapid detection of this information is critical for survival and social interaction. However, due to technical and methodological difficulties, the spatiotemporal profile of facial threat processing is unknown. By utilizing magnetoencephalography (MEG), a brain-imaging technique with superb temporal resolution and fairly good spatial resolution, Synthetic Aperture Magnetometry (SAM), a recently developed source analysis technique, and a sliding window analysis, we identified the spatiotemporal development of facial threat processing in the gamma frequency band. We also tested the dual-route hypothesis of LeDoux, who proposed, based on animal research, that there are two routes to the amygdala: a quick subcortical route and a slower, cortical route. Direct evidence supporting this model in humans has been lacking. Moreover, it has been unclear whether the subcortical route responds specifically to fearful expressions or to threatening expressions in general. We found early event-related synchronization (ERS) in response to fearful faces in the hypothalamus/thalamus area (10–20 ms) and then the amygdala (20–30 ms). This was even earlier than the ERS response to fearful faces in visual cortex (40–50 ms). These data support LeDoux's suggestion of a quick, subcortical thalamo-amygdala route. Moreover, this route was specific to fearful expressions; the ERS response in the amygdala to angry expressions had a late onset (150–160 ms). The ERS onset in prefrontal cortex followed that seen within the amygdala (around 160–210 ms), consistent with its role in higher-level emotional/cognitive processing.
Rapid detection of threat in the environment is critical for survival and social interaction. The amygdala is thought to be crucially involved in such detection. It is proposed that the amygdala receives information on potential threats by two parallel routes: a crude but fast subcortical route (thalamus-amygdala) and a slower route allowing cortical analysis (thalamus-sensory cortex-amygdala) (LeDoux, 1996). The purpose of this quick subcortical route is thought to prepare an organism very rapidly for the incoming danger even before the nature of the danger is known (LeDoux, 1996; also see Johnson, 2005; and Dolan, 2002, for a review).
While LeDoux's dual-route hypothesis was based on rodent data (LeDoux, 1996; Quirk et al., 1995), recent fMRI work in humans (Whalen et al., 1998; de Gelder et al., 1999; de Gelder et al., 2003; Morris et al., 1999; Liddell et al., 2005) has suggested that the subcortical route is also involved in processing facial expressions such as fear. However, due to the limited temporal resolution of fMRI, the exact time course of this route is unknown. Moreover, what happens after initial sensory and emotional encoding in the visual cortex and amygdala is unclear. It has been proposed that prefrontal cortex (PFC) is involved in later processing, particularly in higher-level emotional and cognitive evaluation (Ochsner & Gross, 2005). However, the dynamic profile of such processing is unknown.
A second issue concerns the specificity of the subcortical route. LeDoux’s hypothesis is based on fear conditioning in rodents and the subcortical route evidence in humans has largely been based on fearful expression processing. It remains unknown whether the subcortical route responds to threatening expressions in general or whether it is specific for fearful expressions.
Angry and fearful facial expressions appear to have different functional roles in human social interaction (e.g., Averill, 1982; Blair, 2003; Klinnert et al., 1987; van Honk et al., 2001; 2005). It is argued that angry facial expressions contribute to hierarchical relations (Blair, 2003; Knutson, 1996) and are met with either appeasement or retaliation, depending on the receiver's relative position in the dominance hierarchy (e.g., Blair, 2003; van Honk et al., 2001; 2005). In contrast, fearful expressions typically communicate an external threat to be avoided or learnt to avoid (e.g., Blair, 2003; Klinnert et al., 1987; Mineka & Cook, 1993). Previous neuroimaging studies have shown that while angry expressions may activate the amygdala, they frequently do so to a significantly lesser degree than fearful expressions (Blair et al., 1999; Fitzgerald et al., 2005; Whalen et al., 2001). It has been suggested that while fearful expressions initiate stimulus-reinforcement learning, a function that the amygdala is crucially involved in (Blair, 2003), angry expressions prompt the alteration of behavior by the observer (e.g., Blair, 2003; van Honk et al., 2001; 2005). We predicted that fearful expressions might activate the amygdala through the quick subcortical route because this expression effectively prepares the individual for incoming danger even before the nature of the danger is known. In contrast, we predicted that any amygdala response to angry expressions would implicate the cortical route, due to greater requirements for elaborative cortical processing regarding, for example, the identity of the displayer and their status in the hierarchy.
Magnetoencephalography (MEG) is ideal for examining these issues. In contrast to functional magnetic resonance imaging (fMRI), MEG provides excellent temporal resolution (on the order of milliseconds) and good spatial resolution with appropriate source modeling methods. Many previous MEG studies, however, were analyzed with relatively low spatial resolution (at the hemispheric or lobar level) due to a lack of source modeling. Some studies did use source modeling such as equivalent current dipole (ECD) fitting. However, ECD is limited by the requirement for a priori hypotheses regarding the number and location of active sources. Moreover, ECD requires averaged evoked responses that are time- and phase-locked; neuronal responses that are time-locked but not phase-locked cannot be assessed. Furthermore, frequency information is not available with ECD (see Hillebrand et al., 2005, for a discussion of ECD limitations). This is unfortunate as the functional significance of different frequency bands is becoming an important issue in neuroimaging.
A recently developed source analysis method, Synthetic Aperture Magnetometry (SAM), based on the beamformer approach (Vrba & Robinson, 2001), overcomes these limitations. SAM is a spatial filtering technique based on the nonlinear constrained minimum-variance beamformer and is capable of estimating source current power changes in an arbitrarily chosen voxel within the whole brain with high resolution. SAM requires no a priori estimates of the number or approximate locations of sources. While retaining the ability to analyze phase-locked data (averaged evoked responses), it can also reveal significant power changes of non-phase-locked data within selected frequency bands. Importantly, SAM retains the millisecond temporal resolution needed to unravel cortical dynamics.
Because of these advantages, SAM has become an increasingly popular analytic tool for MEG data (Vrba and Robinson, 2001; Hillebrand et al., 2005; Brookes et al., 2005; Hall et al., 2005; Fawcett et al., 2004; Furlong et al., 2004; Singh et al., 2003). Moreover, event-related oscillation as revealed by SAM has a demonstrable spatial coincidence with the BOLD (blood oxygenation level-dependent) fMRI response (Brookes et al., 2005; Foucher et al., 2003; Singh et al., 2002; Hall et al., 2005; Crone et al., 1998; see Hillebrand et al., 2005, for a review). However, most previous studies adopting the SAM technique have used just one fixed pair of active and control windows, which makes it difficult to obtain a dynamic spatiotemporal profile of the activity in a brain region. In the present study, a sliding window method was adopted, enabling us to capture very fine-scale spatiotemporal dynamics.
There have been previous demonstrations of MEG's ability to detect signals from deep structures such as the hippocampus (Ioannides et al., 1995; Rogers et al., 1990) and amygdala (Ioannides et al., 1995; Streit et al., 2003) using evoked field methods. The sensitivity of a source method to a deep brain structure such as the amygdala depends on both the signal-to-noise ratio and the spatial resolution it provides (Vrba & Robinson, 2001). SAM uses the second-order covariance between channels rather than single-channel averages, and thus is sensitive to spatially correlated activity. In addition, the use of the forward magnetic field solution for a source means that SAM detects dipole sources and is therefore less sensitive to artifacts that do not look like dipoles (Vrba & Robinson, 2001). In short, the detection of responses in the amygdala should be possible using SAM. Indeed, as an adaptive technique, SAM is probably better at localizing temporally uncorrelated sources than non-adaptive techniques (cf. Sekihara et al., 2005).
With respect to MEG and EEG (electroencephalogram) studies, there has been much recent interest in the frequency-specific oscillatory power changes that take place whenever a task is performed. These changes are termed event-related desynchronization (ERD) or event-related synchronization (ERS), defined as a localized decrease or increase in oscillatory power (Pfurtscheller and Lopes da Silva 1999). While there is debate regarding the functional implications of ERD/ERS, gamma band ERS is thought to reflect the cooperative behavior of a large number of neurons associated with a task and active information processing allowing rapid coupling between spatially separate cell assemblies (Pfurtscheller and Lopes da Silva 1999).
It has been suggested that gamma-band synchronization plays a crucial role in integrating distributed neural processes into highly ordered cognitive functions and is important in a wide range of cognitive, perceptual, attentional and emotional processes (Bichot et al., 2005; Fries et al., 2001; Tallon-Baudry et al., 2005; Müller et al., 1999; Taylor et al., 2000; Keil et al., 2001; Oya et al., 2002). In regard to emotional processing, while there has been some interest in theta-band oscillation in animal work (e.g., Seidenbecher et al., 2003), gamma-band oscillation has been considered of special interest (Oya et al., 2002; Müller et al., 1999; Taylor et al., 2000; Keil et al., 2001) and has been associated with emotional processing within the amygdala (Oya et al., 2002).
In the present study, the neural dynamics of threatening face processing were explored using faces with fearful, angry and neutral expressions. By adopting MEG and sliding window SAM, we focused on gamma band ERS changes in the brain. We investigated the following questions. 1) Would gamma band oscillation reflect brain activity for emotional processing and, if so, what areas are sensitive to such modulation? 2) Would there be indications of a quick subcortical route in processing threatening faces? We predicted that if the amygdala receives early subcortical information, there would be early activity in the thalamus and amygdala in response to expression information; this activity should be augmented following later cortical input. 3) If a subcortical route exists, would it be responsive to both fearful and angry expressions? 4) Would PFC show a response after emotional and sensory encoding in the amygdala and visual cortex and, if so, what is the exact spatiotemporal profile?
Subjects
Fifteen healthy volunteers, 7 males and 8 females, between the ages of 22 and 36 participated in the experiment. All gave written informed consent to participate in the study, which was approved by the NIMH Institutional Review Board.
Design
The stimuli were faces with fearful, angry and neutral expressions selected from the NimStim Face Stimulus Set (http://www.macbrain.org/resource.htm). There were 52 (26 male) exemplars of each expression. The face stimuli were presented without hair and had been converted to grayscale.
The stimuli were presented using Presentation software (www.neuro-bs.com). Each trial involved the presentation of a face for 300 ms followed by a 200 ms blank screen (see Figure 1). There was then a 1500 ms response window during which the participant chose one of two buttons according to the gender of the stimulus. The button associated with each gender was randomized across trials, and the subjects were told which button corresponded to which gender during the response window; i.e., “M F” indicated that the left button was the response for male, while “F M” indicated that the right button was the response for male. This was done to reduce the subjects' expectancy and preparatory responses. An emotion-irrelevant gender judgment task was adopted to avoid possible attentional biases associated with particular expression types. The response window was followed by a 600 ms blank screen.
Fig. 1. Stimulus presentation sequence.
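As a concrete illustration of this trial structure, the following Python sketch (hypothetical, not the authors' Presentation script; stimulus identifiers and field names are assumptions) lays out one trial's event sequence and the randomized button-to-gender mapping:

```python
import random

# Hypothetical sketch of the trial timing described above: 300 ms face,
# 200 ms blank, 1500 ms response window with a randomized button-to-gender
# mapping shown as "M F" or "F M", then a 600 ms blank.
def make_trial(face_id, gender):
    mapping = random.choice(["M F", "F M"])  # left/right labels shown during the response window
    return [
        {"event": "face", "stimulus": face_id, "duration_ms": 300},
        {"event": "blank", "duration_ms": 200},
        {"event": "response", "prompt": mapping, "correct_gender": gender, "duration_ms": 1500},
        {"event": "blank", "duration_ms": 600},
    ]

if __name__ == "__main__":
    for event in make_trial("fear_m01", "male"):  # "fear_m01" is an illustrative stimulus name
        print(event)
```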
Data Acquisition
The MEG data were recorded at 600 Hz using a 275-channel CTF whole-head MEG system in a shielded environment. The CTF MEG system is equipped with synthetic 3rd-order gradient balancing, an active noise cancellation technique that uses a set of reference channels to subtract background interference. The resulting noise floor is on the order of 5–7 fT above 1 Hz. At the beginning and end of each measurement, the participant's head position was registered with localization coils placed at the nasion and the bilateral preauricular points. Head movements were required not to exceed 0.5 cm. By registering the head position at these three points, the MEG data could be superimposed on the individual anatomical images with an accuracy of a few millimeters.
Anatomical images were also recorded. High-resolution anatomical images were acquired using a T1-weighted, three-dimensional spoiled GRASS (SPGR) sequence (1 × 1 × 1.5 mm³) on either a 1.5 Tesla or a 3 Tesla GE scanner (the resolution was nearly identical across scanners and adequate for creation of the multisphere forward model).
Data Processing
VSM/CTF software (http://www.ctf.com/products/meg/ctf/software.htm) and software developed at the NIMH MEG Core Facility, in combination with AFNI (http://www.afni.nimh.nih.gov), were used for MEG/MRI data processing.
The DC (direct current) offset was removed, and the data were high-pass filtered at 0.61 Hz and powerline-notch filtered at 60 Hz (width = 3.1 Hz). The data were then marked according to the three stimulus types.
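Purely for illustration, the same preprocessing steps could be expressed as follows in MNE-Python; this is a hypothetical modern equivalent, not the CTF/NIMH pipeline actually used, and the dataset path and event codes below are assumptions:

```python
import mne

# Illustrative sketch only: the study used VSM/CTF and in-house NIMH software.
raw = mne.io.read_raw_ctf("subject01.ds", preload=True)  # hypothetical CTF dataset path

raw.apply_gradient_compensation(3)             # synthetic 3rd-order gradient balancing
raw.filter(l_freq=0.61, h_freq=None)           # remove DC offset / high-pass at 0.61 Hz
raw.notch_filter(freqs=60, notch_widths=3.1)   # power-line notch at 60 Hz (width 3.1 Hz)

# Mark the data according to the three stimulus types (event codes are assumptions).
events = mne.find_events(raw)
event_id = {"fear": 1, "anger": 2, "neutral": 3}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.15, tmax=0.5,
                    baseline=None, preload=True)
```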
Before the SAM analysis, a multisphere head model was created for each subject from that subject's anatomical images using AFNI. The advantage of a multisphere over a single-sphere model is that in the former each sphere (one per MEG sensor) is fit to a small patch of the head model directly under the sensor, in order to better model the local return currents.
SAM was then used to analyze task-related activation differences in the gamma band (30–50 Hz). SAM estimates source power with high spatial resolution using an optimal linear combination of sensors that suppresses signals from environmental and other brain noise without attenuating power from the target voxel. SAM creates an optimum spatial filter from the covariance between the active state and the control state to calculate a 3-D source image comparing the source strength of the two states in specified time windows and a given frequency band. It is based on the beamformer technique, in which the source strength at a voxel is the weighted sum of the signal strength of all channels (Van Veen et al., 1997):
$$S_{\theta}(t) = \mathbf{W}_{\theta}^{T}\,\mathbf{B}(t)$$

where $\mathbf{W}_{\theta}$ is the vector of beamformer weights for the target voxel $\theta$ and $\mathbf{B}$ is the $N \times T$ matrix of the magnetic field measured at the $N$ sensor locations over $T$ time samples.
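A minimal NumPy sketch of this weighted-sum relation and of the dual-state (active versus control) power comparison is given below. It assumes a fixed source orientation and a simple regularization choice, whereas the actual SAM implementation additionally optimizes the source orientation nonlinearly; the function names are illustrative assumptions.

```python
import numpy as np

# Simplified, fixed-orientation minimum-variance beamformer sketch for one voxel.
def beamformer_weights(leadfield, cov, reg=0.01):
    """leadfield: (n_channels,) forward field of a unit source at the target voxel;
    cov: (n_channels, n_channels) sensor data covariance."""
    n = cov.shape[0]
    c = cov + reg * np.trace(cov) / n * np.eye(n)      # light regularization (assumption)
    c_inv_l = np.linalg.solve(c, leadfield)
    return c_inv_l / (leadfield @ c_inv_l)             # W = C^-1 L / (L^T C^-1 L)

def source_power(weights, data):
    """data: (n_channels, n_samples) sensor matrix B; returns mean source power."""
    s = weights @ data                                 # S(t) = W^T B(t)
    return np.mean(s ** 2)

def dual_state_contrast(leadfield, b_active, b_control):
    """Weights from the combined covariance; power contrasted between states."""
    cov = np.cov(np.hstack([b_active, b_control]))
    w = beamformer_weights(leadfield, cov)
    return source_power(w, b_active) - source_power(w, b_control)
```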
To obtain an image of the dynamic spatiotemporal development of the brain's activity, a sliding window analysis was used in combination with SAM. A short window length of 150 ms was adopted to ensure a fine spatiotemporal profile. A drawback of a short window is that sources are more likely to be correlated (Hillebrand et al., 2005), and correlated sources are not detected by SAM; it is therefore possible that some sources were not identified because they were correlated with one another. With a window length of 150 ms and a step of 10 ms, we estimated the signal power in each voxel using dual-state SAM imaging, in which the control state (baseline) was the 150 ms before stimulus onset (−150 to 0 ms) and the active state was a 150 ms window sliding in 10 ms steps: −150 to 0 ms, −140 to 10 ms, −130 to 20 ms, …, 340 to 490 ms, 350 to 500 ms. With sliding window SAM, we were able to tell quite accurately when a significant ERS emerged, peaked and offset. For example, if the ERS in a region was not significant in the window from −110 to 40 ms but was significant in the window from −100 to 50 ms, we could conclude that the ERS in that region became significant between 40 and 50 ms. Fifty dual-state SAM imaging analyses were performed with a spatial resolution of 7 mm. The output results were then concatenated, enabling us to obtain a time course, in combination with spatial activation maps, across all time points from 150 ms before stimulus onset to 500 ms after it. The high-performance computational capabilities of the NIH Biowulf Linux cluster (http://biowulf.nih.gov) were utilized to perform these computation-intensive tasks.
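The window bookkeeping itself can be sketched as follows (an illustrative helper, not the authors' code; note that the first generated window coincides with the control window and contributes no contrast, leaving the fifty informative active/control comparisons described above):

```python
# Sliding-window scheme: a fixed 150 ms pre-stimulus control window and a
# 150 ms active window advanced in 10 ms steps from -150 ms to +350 ms starts.
def sliding_windows(first_start_ms=-150, last_start_ms=350, length_ms=150, step_ms=10):
    return [(start, start + length_ms)
            for start in range(first_start_ms, last_start_ms + 1, step_ms)]

control_window = (-150, 0)
windows = sliding_windows()
print(len(windows), windows[:3], windows[-1])   # (-150, 0), (-140, 10), ... , (350, 500)
```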
For group analysis, individual anatomical images were first spatially normalized to the Talairach brain atlas. The SAM results of each subject were also normalized (transformed to z-scores) and then registered to that subject's Talairach-transformed anatomical images. The group analysis for each time window was then performed using a random-effects ANOVA model in AFNI, which generated the ERS/ERD results for the three conditions and the contrasts between conditions. Fifty ANOVAs were performed. Voxels with an uncorrected p < .05 were considered statistically significant.
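For illustration, a simplified per-voxel version of this group analysis, assuming z-scored SAM maps arranged as subjects × conditions × voxels and using a repeated-measures ANOVA in statsmodels rather than the AFNI implementation used in the study, might look like this (the normalization step is one plausible reading of "transformed to z-score"):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# sam: (n_subjects, n_conditions, n_voxels) SAM statistics for one time window,
# already warped to Talairach space. Function and variable names are illustrative.
def groupwise_significance(sam, conditions=("fear", "anger", "neutral")):
    n_sub, n_cond, n_vox = sam.shape
    # z-score each subject's data so subjects are on a common scale (assumption)
    z = (sam - sam.mean(axis=(1, 2), keepdims=True)) / sam.std(axis=(1, 2), keepdims=True)
    p_map = np.ones(n_vox)
    for v in range(n_vox):
        df = pd.DataFrame({
            "subject": np.repeat(np.arange(n_sub), n_cond),
            "condition": np.tile(conditions, n_sub),
            "value": z[:, :, v].ravel(),
        })
        result = AnovaRM(df, depvar="value", subject="subject", within=["condition"]).fit()
        p_map[v] = result.anova_table["Pr > F"].iloc[0]
    return p_map < 0.05   # uncorrected p < .05, as in the paper
```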
There were no significant differences in reaction times (RT) between the three conditions (RT_fear = 544.28 ± 102.48 ms, RT_anger = 543.58 ± 102.31 ms, RT_neutral = 535.60 ± 109.68 ms; F(2,28) = 1.115, p = .342). However, the difference in error rates (ER) was significant (ER_fear = 7.44% ± 6.12%, ER_anger = 9.49% ± 5.21%, ER_neutral = 6.03% ± 5.03%; F(2,28) = 3.438, p = .046). Follow-up tests showed that this was due to the subjects being significantly less accurate for angry relative to neutral faces (p < 0.01). No other pairwise comparisons differed significantly.
Our MEG-SAM results revealed significant ERS in the brain in the gamma frequency band in all three conditions. We describe the results in terms of the onset, peak and offset of the ERS. Onset and offset refer to the times at which the ERS became statistically significant, or ceased to be significant, relative to the control period; peak refers to the time at which the activity reached its highest level.
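The following sketch shows how onset, peak and offset can be read off the concatenated window-by-window results for a single region; the input arrays are hypothetical placeholders for the per-window significance flags and group ERS estimates.

```python
import numpy as np

# window_ends_ms: end time of each 150 ms sliding window;
# significant: per-window boolean significance flag; power: per-window ERS estimate.
def onset_peak_offset(window_ends_ms, significant, power):
    window_ends_ms = np.asarray(window_ends_ms)
    power = np.asarray(power)
    sig_idx = np.flatnonzero(significant)
    if sig_idx.size == 0:
        return None                                    # no significant ERS in this region
    onset = window_ends_ms[sig_idx[0]]                 # ERS first becomes significant by this time
    peak = window_ends_ms[sig_idx[np.argmax(power[sig_idx])]]
    after_last = sig_idx[-1] + 1                       # first window after the last significant one
    offset = window_ends_ms[after_last] if after_last < window_ends_ms.size else None
    return onset, peak, offset
```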
The amygdala and related limbic structures
With respect to fearful expressions, the ERS in the hypothalamus/thalamus area became significant shortly after stimulus onset (at around 10–20 ms). By 20–30 ms, a significant ERS in the amygdala was seen. The ERS in the right amygdala peaked at around 230–240 ms and offset by 300–310 ms. See Fig. 2 A and C for an illustration of response profiles in the right amygdala.
Fig. 2. Spatiotemporal profiles for the amygdalae.
With respect to angry expressions, we observed only a late ERS in the left amygdala, which became significant (onset) at around 150–160 ms, peaked at around 210–220 ms and offset at around 260–270 ms. See Fig. 2 B and D for an illustration.
No significant ERS activity in the amygdala was seen in the neutral condition.
The occipitotemporal cortex
Our results indicate that a large cluster of significant ERS in posterior cortex evolved dynamically, covering parts of occipital and temporal cortex (including the fusiform gyrus) as well as the cerebellum. Here we use the term occipitotemporal cortex to refer to this cluster, as occipital and temporal cortex constituted the bulk of it.
The occipitotemporal cortex showed significant ERS for the fearful, angry and neutral expressions, and the spatiotemporal profile was spatially homologous and qualitatively similar across conditions. In all three conditions, the earliest significant ERS started within the first 20–50 ms (fear: 40–50 ms; anger: 20–30 ms; neutral: 30–40 ms), peaked at around 140–150 ms and offset at around 300 ms (fear: 300–310 ms; anger: 290–300 ms; neutral: 270–280 ms).
See Fig. 3 for an illustration of response profiles in the occipitotemporal cortex.
Fig. 3. Spatiotemporal profiles for the occipitotemporal cortex.
The prefrontal cortex (PFC)
Our results indicated that fear stimuli elicited a dynamic development of the ERS from the right amygdala to the thalamus, basal ganglia and anterior insula, which became significant within the inferior frontal gyrus (IFG, BA 47) at around 200–210 ms. This ERS in the right IFG peaked at around 240–250 ms and offset at around 310–320 ms. See Fig. 4 A and D for an illustration of this dynamic evolution.
Fig. 4. Spatiotemporal profiles for prefrontal cortex.
Angry stimuli elicited a significant ERS within right anterior cingulate cortex (ACC, BA 32) that onset at around 170–180 ms, peaked at around 220–230 ms and offset at around 270–280 ms. See Fig. 4 B and E for an illustration. Angry stimuli also elicited a significant ERS in the left orbitofrontal cortex (OFC, BA 10) that onset at around 170–180 ms, peaked at around 200–210 ms and offset at around 230–240 ms.
General pattern of temporal profiles
In general, the times of ERS onset and peak in the occipitotemporal cortex and the amygdala were earlier than those in PFC. This suggests a temporal sequence of lower-level visual/emotional encoding followed by higher-level cognitive/emotional evaluation, consistent with the general assumption about information processing stages in the brain.
Interestingly, despite differences in onset/peak time in different areas, the ERS offset time was very consistent across regions. The ERS offset in the amygdala, occipitotemporal cortex and PFC was generally around 300 ms, coincident with stimulus offset. This suggests that ERS is closely stimulus-locked and task-related.
For a summary of spatiotemporal information of ERS of the above areas, see Table 1.
Table 1. Spatiotemporal information for brain regions showing significant ERS.
Our results suggest that emotional processing is associated with gamma band synchronization, consistent with some earlier scalp EEG and intracranial studies on emotional processing in humans (Müller et al., 1999; Taylor et al., 2000; Keil et al., 2001, Oya et al., 2002).
A dynamic spatiotemporal profile was observed for threatening face processing. In the fear condition, there was a very early ERS within the hypothalamus/thalamus (at around 10–20 ms) followed by an early ERS onset within the amygdala (at around 20–30 ms), earlier than that seen in the occipitotemporal cortex, suggesting that a fast subcortical route is involved in fearful face processing. In the anger condition, the ERS onset in the amygdala was much later (at around 150–160 ms), later than the ERS in the occipitotemporal cortex, suggesting that angry face processing goes through the cortical route. ERS onset in PFC was seen at around 160–210 ms, consistent with its role in integrating cognitive and social emotional information (Ochsner & Gross, 2005).
Visual area
Gamma band synchronization in the visual area has been consistently observed in recordings of single-neuron activity and local field potentials (Bichot et al., 2005; Fries et al., 2001; Logothetis et al., 2001), in EEG and MEG (Tallon-Baudry et al., 1997; 2005) and recently using MEG-SAM methods (Brookes et al., 2005). In line with previous findings, our MEG-SAM results showed significant gamma band synchronization in occipitotemporal cortex. We found that visual ERS occurred as early as around 20–50 ms after stimulus onset, a time range consistent with previous reports of short visual latencies using single-neuron recordings (Luck et al., 1997; Cottaris & De Valois, 1998; Tovee et al., 1993) and MEG/EEG methods (Braeutigam et al., 2001; Seeck et al., 1997). Our finding of significant ERS in the occipitotemporal cortex, including the fusiform face area, is also consistent with fMRI findings on face processing (Yovel and Kanwisher, 2004; Haxby et al., 2001).
Subcortical and cortical pathways to the amygdala
Our results provide direct support for suggestions of a fast subcortical route for emotional expression processing. We observed a very early ERS within the hypothalamus/thalamus (with an onset at around 10–20 ms) in response to fearful expressions. This was followed by an early ERS onset within the amygdala that became significant at around 20–30 ms (earlier than the significant ERS seen to this expression in occipitotemporal cortex), suggesting that for fast fearful face processing the subcortical thalamus-amygdala route was involved. Such processing is likely to be crude and coarse (LeDoux, 1996). This ERS peaked at around 230–240 ms (later than that in the occipitotemporal cortex), presumably as a result of additional cortical input from the visual cortex for more elaborated processing. With respect to angry expressions, the ERS within the amygdala was considerably later, with an onset at around 150–160 ms and a peak, similar to that for fearful expressions, at 210–220 ms. In the anger condition, the much later ERS onset in the amygdala compared with that in the visual cortex suggests that in angry face processing the amygdala is activated following visual analysis in occipitotemporal cortex, through a cortical route.
Our finding of a very early ERS in the amygdala in response to fearful expressions is broadly in line with studies on auditory fear conditioning in rats, which indicate that auditory fearful signals can reach the amygdala through the thalamus with a latency of 12 ms (LeDoux, 1996; Quirk et al., 1995). The current data suggest that humans can also process fearful signals in the visual channel through the subcortical route at a very short latency. Our results on expression processing through the subcortical route are also consistent with previous fMRI reports (Whalen et al., 1998; de Gelder et al., 1999; Morris et al., 1999; Liddell et al., 2005). In addition, our finding on increased gamma band activity in the amygdala in response to emotional expressions is consistent with a previous report using intracranial recordings (Oya et al., 2002).
We found no evidence of an early ERS to angry expressions, suggesting that angry expression processing does not use the subcortical thalamo-amygdala route. We believe that fearful, but not angry, expressions elicit quick, subcortical responding because of their different functional roles in human social interaction (e.g., Averill, 1982; Blair, 2003; Klinnert et al., 1987; van Honk et al., 2001; 2005). Fearful expressions are crucial for social aversive conditioning (Mineka & Zinbarg, 1995); they provide very rapid transmission of environmental threat information from caregivers to infants, informing the infant which objects should be avoided (e.g., Blair, 2003; Klinnert et al., 1987; Mineka & Cook, 1993). Fearful expressions may activate the amygdala through the quick subcortical route (thalamus-amygdala) because they prepare the individual for incoming danger/learning even before the nature of the danger is known. At a later time, the amygdala's response is reinforced by additional cortical input.
Angry expressions, in contrast, contribute to hierarchical relations (Blair, 2003; Knutson, 1996). The response to these expressions depends on the relative positions of the displayer and receiver in the dominance hierarchy (e.g., Blair, 2003, Van Honk et al., 2001; 2005). We believe that the amygdala’s response to angry expressions requires the cortical route (visual cortex-amygdala) because of the need for elaborative cortical processing regarding the identity of the displayer/ their hierarchy status.
One caveat should be mentioned with respect to the current results. It is possible that the subcortical route responds to specific physical characteristics of face stimuli that are independent of their emotional implications. Unfortunately, it is difficult to disentangle the effect of physical characteristics from emotional expressions given that these physical differences can be part of the emotional expression. For example, the eye whites may be larger in fearful expressions than in happy or angry expressions. Whalen et al. (2004) reported increased amygdala responses to masked larger eye whites relative to masked smaller eye whites. It is possible that the subcortical activation seen in the current study could be a response to the relatively large eye whites of the fearful expressions. Future work will be needed to determine what properties of the stimulus determine the recruitment of the subcortical route.
The present study showed ERS in the right amygdala in response to fearful faces and in the left amygdala in response to angry faces. Involvement of both the left (e.g., Whalen et al., 2001) and the right (Pegna et al., 2005) amygdala in fear and anger face processing has been reported using fMRI. Why individual studies report greater lateralization to one side or the other, and whether this reflects functional lateralization, task demands, strategy or subject variability, remains unclear.
From the amygdala to PFC: integrative emotional/cognitive processing in PFC
To evaluate the threat and decide what action to take, the amygdala needs to communicate the threat signal to PFC. Our finding that ERS onset in PFC was later than that in the amygdala strongly suggests such a cognitive sequence. This was most vividly seen in the fear condition, where there was a dynamic evolution of ERS from the right amygdala to the thalamus, basal ganglia, insula and the right IFG. Previous fMRI studies have also implicated IFG in responding to fearful expressions (Sprengelmeyer et al., 1998) and OFC and ACC in responding to angry expressions (Blair et al., 1999; Harmer et al., 2001).
In the present study, we investigated the neural dynamics for processing facial threat in the gamma frequency band using MEG and SAM. We found that emotional processing is associated with gamma band synchronization. We also demonstrated specific spatiotemporal profiles in different areas of the brain including amygdala, visual cortex and PFC in the gamma band. Our results provide support for the involvement of the quick subcortical route to the amygdala in facial threat processing and suggest that it selectively responds to fearful but not angry expressions.
Interestingly, while MEG signals mainly arise from synaptic currents, the cortical loci of ERS in the gamma band identified in our study are broadly similar to those described by BOLD fMRI. This supports a recently suggested link between synaptic and BOLD responses (Logothetis et al., 2001) and a relationship between gamma synchronization and BOLD responses (Niessing et al., 2005; Crone et al., 1998; Brookes et al., 2005; Foucher et al., 2003; Kilner et al., 2005).
Acknowledgments
This research was supported by the NIMH Intramural Research Program.
  • Averill JR. Anger and aggression: an essay on emotion. Springer-Verlag; New York: 1982.
  • Bichot N, Rossi A, Desimone R. Parallel and serial neural mechanisms for visual search in macaque area V4. Science. 2005;308:529–534. [PubMed]
  • Blair RJ. Facial expressions, their communicatory functions and neuro-cognitive substrates. Philos Trans R Soc Lond B Biol Sci. 2003;358:561–572. [PMC free article] [PubMed]
  • Blair RJR, Morris JS, Frith CD, Perrett DI, Dolan RJ. Dissociable neural responses to facial expressions of sadness and anger. Brain. 1999;122:883–893. [PubMed]
  • Braeutigam S, Bailey AJ, Swithenby SJ. Task-dependent early latency (30–60 ms) visual processing of human faces and other objects. Neuroreport. 2001;12:1531–1536. [PubMed]
  • Brookes MJ, Gibson AM, Hall SD, Furlong PL, Barnes GR, Hillebrand A, Singh KD, Holliday IE, Francis ST, Morris PG. GLM-Beamformer method demonstrates stationary field, alpha ERD and gamma ERS colocalisation with fMRI BOLD response in visual cortex. NeuroImage. 2005;26:302–8. [PubMed]
  • Cottaris NP, De Valois RL. Temporal dynamics of chromatic tuning in macaque primary visual cortex. Nature. 1998;395:896–900. [PubMed]
  • Crone NE, Miglioretti DL, Gordon B, Lesser RP. Functional mapping of human sensorimotor cortex with electrocorticographic spectral analysis. II. Event-related synchronization in the gamma band. Brain. 1998;121:2301–2315. [PubMed]
  • de Gelder B, Vroomen J, Pourtois G, Weiskrantz L. Non-conscious recognition of affect in the absence of striate cortex. Neuroreport. 1999;10:3759–63. [PubMed]
  • de Gelder B, Morris JS, Dolan RJ. Unconscious fear influences emotional awareness of faces and voices. Proc Natl Acad Sci USA. 2005;102:18682–7. [PubMed]
  • Dolan RJ. Emotion, cognition, and behavior. Science. 2002;298:1191–1194. [PubMed]
  • Fawcett IP, Barnes GR, Hillebrand A, Singh KD. The temporal frequency tuning of human visual cortex investigated using synthetic aperture magnetometry. Neuroimage. 2004;21:1542–1553. [PubMed]
  • Fitzgerald DA, Angstadt M, Jelsone LM, Nathan PJ, Phan KL. Beyond threat: Amygdala reactivity across multiple expressions of facial affect. 2006. In press. [PubMed]
  • Foucher JR, Otzenberger H, Gounot D. The BOLD response and the gamma oscillations respond differently than evoked potentials: an interleaved EEG-fMRI study. BMC Neurosci. 2003;4:22. [PMC free article] [PubMed]
  • Fries P, Reynolds JH, Rorie AE, Desimone R. Modulation of oscillatory neuronal synchronization by selective visual attention. Science. 2001;291:1560–3. [PubMed]
  • Furlong PL, Hobson AR, Aziz Q, Barnes GR, Singh KD, Hillebrand A, Thompson DG, Hamdy S. Dissociating the spatiotemporal characteristics of cortical neuronal activity associated with human volitional swallowing in the healthy adult brain. Neuroimage. 2004;22:1447–1455. [PubMed]
  • Hall SD, Holliday IE, Hillebrand A, Singh KD, Furlong PL, Hadjipapas A, Barnes GR. The missing link: concurrent human and primate cortical gamma oscillations. Neuroimage. 2005;15:13–7. [PubMed]
  • Harmer CJ, Thilo KV, Rothwell JC, Goodwin GM. Transcranial magnetic stimulation of medial-frontal cortex impairs the processing of angry facial expressions. Nat Neurosci. 2001;4:17–18. [PubMed]
  • Haxby JV, Gobbini MI, Furey ML, Ishai A, Schouten JL, Pietrini P. Distributed and Overlapping Representations of Faces and Objects in Ventral Temporal Cortex. Science. 2001;293:2425–2430. [PubMed]
  • Hillebrand A, Singh KD, Furlong PL, Holliday IE, Barnes GR. A new approach to neuroimaging with magnetoencephalography. Human Brain Mapping. 2005;25:199–211. [PubMed]
  • Ioannides AA, Liu MJ, Liu LC, Bamidis PD, Hellstrand E, Stephan KM. Magnetic field tomography of cortical and deep processes: examples of ‘real-time mapping’ of averaged and single trial MEG signals. Int J Psychophysiol. 1995;20:161–175. [PubMed]
  • Johnson MH. Subcortical face processing. Nature Rev Neurosci. 2005;6:766–774. [PubMed]
  • Keil A, Müller MM, Gruber T, Wienbruch C, Stolarova M, Elbert T. Effects of emotional arousal in the cerebral hemispheres: a study of oscillatory brain activity and event-related potentials. Clin Neurophysiol. 2001;112:2057–2068. [PubMed]
  • Kilner JM, Mattout J, Henson R, Friston J. Hemodynamic correlates of EEG: a heuristic. Neuroimage. 2005;28:280–6. [PubMed]
  • Klinnert MD, Emde RN, Butterfield P, Campos JJ. Social referencing: the infant's use of emotional signals from a friendly adult with mother present. Annu Prog Child Psychiatry Child Dev. 1987;22:427–32.
  • Knutson B. Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior. 1996;20:165–182.
  • LeDoux JE. The Emotional Brain. Simon and Schuster; New York: 1996.
  • Liddell BJ, Brown KJ, Kemp AH, Barton MJ, Das P, Peduto AS, Gordon E, Williams LM. A direct brainstem-amygdala-cortical ‘alarm’ system for subliminal signals of fear. Neuroimage. 2005;24:235–243. [PubMed]
  • Logothetis NK, Pauls J, Augath M, Trinath T, Oeltermann A. Neurophysiological investigation of the basis of the fMRI signal. Nature. 2001;412:150–157. [PubMed]
  • Luck SJ, Chelazzi L, Hillyard SA, Desimone R. Neural mechanisms of spatial selective attention in areas V1, V2, and V4 of macaque visual cortex. J Neurophysiol. 1997;77:24–42. [PubMed]
  • Mineka S, Cook M. Mechanisms involved in the observational conditioning of fear. J Exp Psychol Gen. 1993;122:23–38. [PubMed]
  • Mineka S, Zinbarg R. Conditioning and Ethological Models of Social Phobia. In: Heimberg RG, Liebowitz MR, Hope DA, Schneier FR, editors. Social Phobia: Diagnosis, Assessment and Treatment. Guilford; New York: 1995. pp. 134–62.
  • Morris JS, Ohman A, Dolan RJ. A subcortical pathway to the right amygdala mediating ‘unseen’ fear. Proc Natl Acad Sci USA. 1999;96:1680–85. [PubMed]
  • Müller MM, Keil A, Gruber T, Elbert T. Processing of affective pictures modulates right-hemispheric gamma band EEG activity. Clin Neurophysiol. 1999;110:1913–1920. [PubMed]
  • Niessing J, Ebisch B, Schmidt KE, Niessing M, Singer W, Galuske RAW. Hemodynamic signals correlate tightly with synchronized gamma oscillations. Science. 2005;309:948–951. [PubMed]
  • Ochsner K, Gross JJ. The Cognitive Control of Emotion. Trends In Cognitive Sciences. 2005;9:242–249. [PubMed]
  • Oya H, Kawasaki H, Howard MA, III, Adolphs R. Electrophysiological responses in the human amygdala discriminate emotion categories of complex visual stimuli. J Neurosci. 2002;22:9502–9512. [PubMed]
  • Pegna AJ, Khateb A, Lazeyras F, Seghier ML. Discriminating emotional faces without primary visual cortices involves the right amygdala. Nature Neurosci. 2005;8:24–25. [PubMed]
  • Pfurtscheller G, Lopes da Silva FH. Event-related EEG/MEG synchronization and desynchronization: basic principles. Clin Neurophysiol. 1999;110:1842–1857. [PubMed]
  • Quirk GJ, Repa C, LeDoux JE. Fear conditioning enhances short-latency auditory responses of lateral amygdala neurons: parallel recordings in the freely behaving rat. Neuron. 1995;15:1029–1039. [PubMed]
  • Rogers RL, Papanicolaou AC, Baumann SB, Bourbon TW, Alagarsamy S, Eisenberg HM. Localization of P3 sources using magnetoencephalography and magnetic resonance imaging. Electroenceph Clin Neurophysiol. 1990;79:308–321. [PubMed]
  • Seeck M, Michel CM, Mainwaring N, Cosgrove R, Blume H, Ives J, Landis T, Schomer DL. Evidence for rapid face recognition from human scalp and intracranial electrodes. Neuroreport. 1997;8:2749–2754. [PubMed]
  • Seidenbecher T, Laxmi TR, Stork O, Pape HC. Amygdalar and hippocampal theta rhythm synchronization during fear memory retrieval. Science. 2003;301:846–50. [PubMed]
  • Sekihara K, Nagarajan SS, Sahani M. Location Bias and Spatial Resolution in Adaptive and Non-adaptive Spatial Filters for Magnetoencephalography. Neuroimage. 2005;25:1056–67. [PubMed]
  • Singh KD, Barnes GR, Hillebrand A, Forde EM, Williams AL. Task-related changes in cortical synchronization are spatially coincident with the hemodynamic response. NeuroImage. 2002;16:103–114. [PubMed]
  • Sprengelmeyer R, Rausch M, Eysel UT, Przuntek H. Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc Lond B Biol Sci. 1998;265:1927–1931. [PMC free article] [PubMed]
  • Streit M, Dammers J, Simsek-Kraues S, Brinkmeyer J, Wolwer W, Ioannides AA. Time course of regional brain activations during facial emotion recognition in humans. Neurosci Lett. 2003;342:101–104. [PubMed]
  • Tallon-Baudry C, Bertrand O, Hénaff MA, Isnard J, Fischer C. Attention modulates gamma-band oscillations differently in the human lateral occipital cortex and fusiform gyrus. Cereb Cortex. 2005;15:654–662. [PubMed]
  • Taylor SF, Liberzon I, Koeppe RA. The effect of graded aversive stimuli on limbic and visual activation. Neuropsychologia. 2000;38:1415–1425. [PubMed]
  • Tovee MJ, Rolls ET, Treves A, Bellis RP. Information encoding and the responses of single neurons in the primate temporal visual cortex. J Neurophysiol. 1993;70:640–654. [PubMed]
  • van Honk J, Tuiten A, Hermans E, Putman P, Koppeschaar H, Thijssen J, et al. A single administration of testosterone induces cardiac accelerative responses to angry faces in healthy young women. Behav Neurosci. 2001;115:238–42. [PubMed]
  • van Honk J, Peper JS, Schutter DJ. Testosterone reduces unconscious fear but not consciously experienced anxiety. Biological Psychiatry. 2005;58:218–225. [PubMed]
  • Van Veen BD, van Drongelen W, Yuchtman M, Suzuki A. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering. IEEE Trans Biomed Eng. 1997;44:867–880. [PubMed]
  • Vrba J, Robinson SE. Signal processing in magnetoencephalography. Methods. 2001;25:249–271. [PubMed]
  • Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL. A functional MRI study of human amygdala responses to facial expressions of fear vs. anger. Emotion. 2001;1:70–83. [PubMed]
  • Whalen PJ, Rauch SL, Etcoff NL, McInerney SC, Lee MB, Jenike MB. Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J Neurosci. 1998;18:411–418. [PubMed]
  • Whalen PJ, Kagan J, Cook RG, et al. Human amygdala responsivity to masked fearful eye whites. Science. 2004;306:2061. [PubMed]
  • Yovel G, Kanwisher N. Face perception: domain specific, not process specific. Neuron. 2004;44:889–98. [PubMed]