Results 1-25 (1076930)
1.  Affective priming in major depressive disorder 
Research on cognitive biases in depression has provided considerable evidence for the impact of emotion on cognition. Individuals with depression tend to preferentially process mood-congruent material and to show deficits in the processing of positive material, leading to biases in attention, memory, and judgments. More research is needed, however, to fully understand which cognitive processes are affected. The current study further examines the impact of emotion on cognition using a priming design with facial expressions of emotion. Specifically, this study tested whether the presentation of facial expressions of emotion affects subsequent processing of affective material in participants with major depressive disorder (MDD) and healthy controls (CTL). Facial expressions displaying happy, sad, angry, disgusted, or neutral expressions were presented as primes for 500 ms, and participants' speed to identify a subsequent target's emotional expression was assessed. All participants displayed greater interference from emotional vs. neutral primes, marked by slower response times to judge the emotion of the target face when it was preceded by an emotional prime (see the sketch after this entry). Importantly, the CTL group showed the strongest interference when happy emotional expressions served as primes, whereas the MDD group failed to show this bias. These results add to a growing literature showing that depression is associated with difficulties in the processing of positive material.
doi:10.3389/fnint.2012.00076
PMCID: PMC3464437  PMID: 23060758
affective priming; cognitive biases; depression
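A minimal sketch (a common operationalization, not necessarily the authors' exact measure) of the priming interference described above: mean reaction time on emotionally primed trials minus mean reaction time on neutrally primed trials, so that positive values indicate slowing by the emotional prime. The reaction times below are hypothetical.

```python
# Priming interference sketch: hypothetical reaction times (ms).
from statistics import mean

def interference(rt_emotional_prime, rt_neutral_prime):
    # Positive result = responses slowed when the prime was emotional.
    return mean(rt_emotional_prime) - mean(rt_neutral_prime)

print(f"{interference([645, 660, 671, 650], [610, 625, 618, 607]):.1f} ms")
```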
2.  Dissociable patterns of medial prefrontal and amygdala activity to face identity versus emotion in bipolar disorder 
Psychological medicine  2012;42(9):1913-1924.
Background
Individuals with bipolar disorder demonstrate abnormal social function. Neuroimaging studies in bipolar disorder have shown functional abnormalities in neural circuitry supporting face emotion processing, but have not examined face identity processing, a key component of social function. We aimed to elucidate functional abnormalities in neural circuitry supporting face emotion and face identity processing in bipolar disorder.
Method
Twenty-seven individuals with bipolar disorder I currently euthymic and 27 healthy controls participated in an implicit face processing, block-design paradigm. Participants labeled color flashes that were superimposed on dynamically changing background faces comprising morphs either from neutral to prototypical emotion (happy, sad, angry and fearful) or from one identity to another identity depicting a neutral face. Whole-brain and amygdala region-of-interest (ROI) activities were compared between groups.
Results
There was no significant between-group difference when looking across both emerging face emotion and identity. During processing of all emerging emotions, euthymic individuals with bipolar disorder showed significantly greater amygdala activity than controls. During facial identity and happy face processing, euthymic individuals with bipolar disorder showed significantly greater amygdala and medial prefrontal cortical activity compared with controls.
Conclusions
This is the first study to examine neural circuitry supporting face identity and face emotion processing in bipolar disorder. Our findings of abnormally elevated activity in amygdala and medial prefrontal cortex (mPFC) during face identity and happy face emotion processing suggest functional abnormalities in key regions previously implicated in social processing. This may be of future importance toward examining the abnormal self-related processing, grandiosity and social dysfunction seen in bipolar disorder.
doi:10.1017/S0033291711002935
PMCID: PMC3685204  PMID: 22273442
Affective disorders; amygdala; bipolar disorder; face processing; functional magnetic resonance imaging; identity processing; medial prefrontal cortex; morph; self-processing; social processing
3.  High Hostility Among Smokers Predicts Slower Recognition of Positive Facial Emotion 
High levels of trait hostility are associated with wide-ranging interpersonal deficits and heightened physiological response to social stressors. These deficits may be attributable in part to individual differences in the perception of social cues. The present study evaluated the ability to recognize facial emotion among 48 high hostile (HH) and 48 low hostile (LH) smokers and whether experimentally manipulated acute nicotine deprivation moderated relations between hostility and facial emotion recognition. A computer program presented a series of pictures of faces that morphed from a neutral expression into increasing intensities of happiness, sadness, fear, or anger, and participants were asked to identify the emotion displayed as quickly as possible. Results indicated that HH smokers, relative to LH smokers, required a significantly greater intensity of emotion expression to recognize happiness. No differences were found for other emotions across HH and LH individuals, nor did nicotine deprivation moderate relations between hostility and emotion recognition. This is the first study to show that HH individuals are slower to recognize happy facial expressions and that this occurs regardless of recent tobacco abstinence. Difficulty recognizing happiness in others may limit the degree to which HH individuals are able to identify social approach signals and to receive social reinforcement.
doi:10.1016/j.paid.2011.11.009
PMCID: PMC3249417  PMID: 22223928
hostility; facial emotion recognition; smoking; nicotine
4.  Perceptual bias of patients with schizophrenia in morphed facial expression 
Psychiatry research  2010;185(0):60-65.
Limited research has specifically examined the nature of the dysfunction in emotion categorization in schizophrenia. The current study aimed to investigate the perceptual bias for morphed facial expressions in subjects with schizophrenia and healthy controls across emotion continua. Twenty-eight patients with schizophrenia and thirty-one healthy controls took part in this study. They were administered a standardized set of morphed photographs of facial expressions with emotional intensities varying between 0% and 100% of the emotion, in 10% increments, providing a range of intensities from pleasant to unpleasant and from approach to withdrawal. Shift points, indicating the point in the emotion continuum at which subjects' emotion identification begins to change, and response slopes, indicating how rapidly identification changes at the shift point, were measured (see the sketch after this entry). Patients exhibited a significantly greater response slope (i.e., patients' perception changed more rapidly) and a greater shift point (i.e., patients still perceived mild expressions of anger as happy faces) with increasing emotion signal compared with healthy controls when the facial expression morphed from happy to angry. Furthermore, patients with schizophrenia still perceived mild expressions of fear as angry faces (a greater shift point) and discriminated less sharply between angry and fearful emotion (a flatter response slope). They were sensitive to sadness (a smaller shift point), and their perception changed rapidly (a sharper response slope), compared with healthy controls in the happy-to-sad continuum. In conclusion, patients with schizophrenia demonstrated impaired categorical perception of facial expressions, with generally 'rapid' but 'late' discrimination of social threat-related stimuli such as angry facial expressions. Compared with healthy controls, these patients showed a sharper discrimination pattern in emotion continua running from positive to negative valence.
doi:10.1016/j.psychres.2010.05.017
PMCID: PMC3805827  PMID: 20646764
Emotion perception; Schizophrenia; Morphed facial expression
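A minimal sketch (not the authors' analysis code) of how the shift point and response slope described above can be estimated: fit a logistic psychometric function to the proportion of target-emotion identifications across morph intensities; the function's inflection point gives the shift point and its steepness parameter gives the response slope. All values below are illustrative, not data from the study.

```python
# Fitting a logistic psychometric function to morphed-expression data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """x0 = shift point (inflection), k = response slope (steepness)."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Morph intensity of the target emotion (%) and the proportion of trials
# on which a subject identified that emotion (hypothetical values).
intensity = np.arange(0, 101, 10)
p_identified = np.array([0.02, 0.03, 0.05, 0.10, 0.25, 0.55,
                         0.80, 0.92, 0.97, 0.99, 1.00])

(x0, k), _ = curve_fit(logistic, intensity, p_identified, p0=[50.0, 0.1])
print(f"shift point = {x0:.1f}% morph intensity, response slope = {k:.3f}")
```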
5.  Recognition of emotion from moving facial and prosodic stimuli in depressed patients 
Background: It has been suggested that depressed patients have a "negative bias" in recognising other people's emotions; however, the detailed structure of this negative bias is not fully understood.
Objectives: To examine the ability of depressed patients to recognise emotion, using moving facial and prosodic expressions of emotion.
Methods: 16 depressed patients and 20 matched (non-depressed) controls selected one basic emotion (happiness, sadness, anger, fear, surprise, or disgust) that best described the emotional state represented by moving face and prosody.
Results: There was no significant difference between depressed patients and controls in their recognition of facial expressions of emotion. However, the depressed patients were impaired relative to controls in their recognition of surprise from emotional prosody, judging it to be more negative.
Conclusions: We suggest that depressed patients tend to interpret neutral emotions, such as surprise, as negative. Considering that the deficit was seen only for prosodic emotive stimuli, it would appear that stimulus clarity influences the recognition of emotion. These findings provide valuable information on how depressed patients behave in complicated emotional and social situations.
doi:10.1136/jnnp.2004.036079
PMCID: PMC1738863  PMID: 15548479
6.  Visual field bias in hearing and deaf adults during judgments of facial expression and identity 
The dominance of the right hemisphere during face perception is associated with more accurate judgments of faces presented in the left visual field (LVF) rather than the right visual field (RVF). Previous research suggests that the LVF bias typically observed during face perception tasks is reduced in deaf adults who use sign language, for whom facial expressions convey important linguistic information. The current study examined whether visual field biases were altered in deaf adults whenever they viewed expressive faces, or only when attention was explicitly directed to expression. Twelve hearing adults and 12 deaf signers were trained to recognize a set of novel faces posing various emotional expressions. They then judged the familiarity or emotion of faces presented in the LVF, the RVF, or both visual fields simultaneously. The same familiar and unfamiliar faces posing neutral and happy expressions were presented in the two tasks. Both groups were most accurate when faces were presented in both visual fields. Across tasks, the hearing group demonstrated a bias toward the LVF. In contrast, the deaf group showed a bias toward the LVF during identity judgments that shifted marginally toward the RVF during emotion judgments. Two secondary conditions tested whether these effects generalized to angry faces and famous faces, and similar effects were observed. These results suggest that attention to facial expression, not merely the presence of emotional expression, reduces the typical LVF bias for face processing in deaf signers.
doi:10.3389/fpsyg.2013.00319
PMCID: PMC3674475  PMID: 23761774
deafness; face perception; visual field bias; laterality; sign language; emotional expression
7.  Neural circuitry of emotional face processing in autism spectrum disorders 
Background
Autism spectrum disorders (ASD) are associated with severe impairments in social functioning. Because faces provide nonverbal cues that support social interactions, many studies of ASD have examined neural structures that process faces, including the amygdala, ventromedial prefrontal cortex and superior and middle temporal gyri. However, increases or decreases in activation are often contingent on the cognitive task. Specifically, the cognitive domain of attention influences group differences in brain activation. We investigated brain function abnormalities in participants with ASD using a task that monitored attention bias to emotional faces.
Methods
Twenty-four participants (12 with ASD, 12 controls) completed a functional magnetic resonance imaging study while performing an attention cuing task with emotional (happy, sad, angry) and neutral faces.
Results
In response to emotional faces, those in the ASD group showed greater right amygdala activation than those in the control group. A preliminary psychophysiological connectivity analysis showed that ASD participants had stronger positive right amygdala and ventromedial prefrontal cortex coupling and weaker positive right amygdala and temporal lobe coupling than controls. There were no group differences in the behavioural measure of attention bias to the emotional faces.
Limitations
The small sample size may have affected our ability to detect additional group differences.
Conclusion
When attention bias to emotional faces was equivalent between ASD and control groups, ASD was associated with greater amygdala activation. Preliminary analyses showed that ASD participants had stronger connectivity between the amygdala and ventromedial prefrontal cortex (a network implicated in emotional modulation) and weaker connectivity between the amygdala and temporal lobe (a pathway involved in the identification of facial expressions, although areas of group differences were generally in a more anterior region of the temporal lobe than what is typically reported for emotional face processing). These alterations in connectivity are consistent with emotion and face processing disturbances in ASD.
doi:10.1503/jpn.090085
PMCID: PMC2834792  PMID: 20184808
8.  Identification of emotionally ambiguous interpersonal stimuli among dysphoric and nondysphoric individuals 
Cognitive therapy and research  2009;33(3):283-290.
This study examined whether dysphoria influences the identification of non-ambiguous and ambiguous facial expressions of emotion. Dysphoric and non-dysphoric college students viewed a series of human faces expressing sadness, happiness, anger, and fear that were morphed with each other to varying degrees. Dysphoric and non-dysphoric individuals identified prototypical emotional expressions similarly. However, when viewing ambiguous faces, dysphoric individuals were more likely to identify sadness when mixed with happiness than non-dysphoric individuals. A similar but less robust pattern was observed for facial expressions that combined fear and happiness. No group differences in emotion identification were observed for faces that combined sadness and anger or fear and anger. Dysphoria appears to enhance the identification of negative emotion in others when positive emotion is also present. This tendency may contribute to some of the interpersonal difficulties often experienced by dysphoric individuals.
doi:10.1007/s10608-008-9198-6
PMCID: PMC2699207  PMID: 20046979
interpretation bias; depression; cognitive models; information processing
9.  The Extended Functional Neuroanatomy of Emotional Processing Biases for Masked Faces in Major Depressive Disorder 
PLoS ONE  2012;7(10):e46439.
Background
Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however.
Aims
To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants.
Method
Unmedicated-depressed participants with MDD (n = 22) and healthy controls (HC; n = 25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups.
Results
The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex.
Conclusions
Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.
doi:10.1371/journal.pone.0046439
PMCID: PMC3466291  PMID: 23056309
10.  On the Time Course of Vocal Emotion Recognition 
PLoS ONE  2011;6(11):e27256.
How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing.
doi:10.1371/journal.pone.0027256
PMCID: PMC3210149  PMID: 22087275
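A minimal sketch (not the authors' analysis code) of the identification-point logic in the gating paradigm described above: the identification point is the earliest gate from which a listener's categorization is correct and remains correct through the final gate. The gate responses below are hypothetical.

```python
# Identification point from successive gate responses.
def identification_point(responses, target):
    """responses: one emotion label per successive gate, in order."""
    point = None
    for gate, label in enumerate(responses):
        if label == target:
            if point is None:
                point = gate  # first gate of the current correct run
        else:
            point = None      # run broken; identification must restart later
    return point              # None if the emotion was never stably identified

# Example: a listener settles on "fear" from the third gate (index 2) onward.
gates = ["neutral", "sadness", "fear", "fear", "fear", "fear", "fear"]
print(identification_point(gates, "fear"))  # -> 2
```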
11.  Amygdala and whole brain activity to emotional faces distinguishes major depressive disorder and bipolar disorder 
Bipolar disorders  2013;15(7):741-752.
Objectives
It can be clinically difficult to distinguish depressed individuals with bipolar disorder (BD) and major depressive disorder (MDD). To examine potential biomarkers of difference between the two disorders, the current study examined differences in the functioning of emotion processing neural regions during a dynamic emotional faces task.
Methods
During functional magnetic resonance imaging, healthy control adults (HC) (n = 29) and depressed adults with MDD (n = 30) and BD (n = 22) performed an implicit emotional-faces task in which they identified a color label superimposed on neutral faces that dynamically morphed into one of four emotional faces (angry, fearful, sad, happy). We compared neural activation between the groups in an amygdala region-of-interest and at the whole brain level.
Results
Adults with MDD showed significantly greater activity than adults with BD in the left amygdala to the anger condition (p = 0.01). Results of whole brain analyses (at p < 0.005, k ≥ 20) revealed that adults with BD showed greater activity to sad faces in temporoparietal regions, primarily in the left hemisphere, whereas individuals with MDD demonstrated greater activity than those with BD to displays of anger, fear, and happiness. Many of the observed BD–MDD differences represented abnormalities in functioning compared to HC.
Conclusions
We observed a dissociation between depressed adults with BD and MDD in the processing of emerging emotional faces. Those with BD showed greater activity during mood-congruent (i.e., sad) faces, whereas those with MDD showed greater activity for mood-incongruent (i.e., fearful, angry, and happy) faces. Such findings may reflect markers of differences in the underlying pathophysiological processes of BD and MDD depression.
doi:10.1111/bdi.12106
PMCID: PMC3864629  PMID: 23911154
amygdala; bipolar disorder; brain imaging; emotion processing; major depressive disorder; whole brain
12.  Preferential Amygdala Reactivity to the Negative Assessment of Neutral Faces 
Biological psychiatry  2009;66(9):847-853.
Background
Prior studies suggest that the amygdala shapes complex behavioral responses to socially ambiguous cues. We explored human amygdala function during explicit behavioral decision making about discrete emotional facial expressions that can represent socially unambiguous and ambiguous cues.
Methods
During functional magnetic resonance imaging, 43 healthy adults were required to make complex social decisions (i.e., approach or avoid) about either relatively unambiguous (i.e., angry, fearful, happy) or ambiguous (i.e., neutral) facial expressions. Amygdala activation during this task was compared with that elicited by simple, perceptual decisions (sex discrimination) about the identical facial stimuli.
Results
Angry and fearful expressions were more frequently judged as avoidable and happy expressions most often as approachable. Neutral expressions were equally judged as avoidable and approachable. Reaction times to neutral expressions were longer than those to angry, fearful, and happy expressions during social judgment only. Imaging data on stimuli judged to be avoided revealed a significant task by emotion interaction in the amygdala. Here, only neutral facial expressions elicited greater activity during social judgment than during sex discrimination. Furthermore, during social judgment only, neutral faces judged to be avoided were associated with greater amygdala activity relative to neutral faces that were judged as approachable. Moreover, functional coupling between the amygdala and both dorsolateral prefrontal (social judgment > sex discrimination) and cingulate (sex discrimination > social judgment) cortices was differentially modulated by task during processing of neutral faces.
Conclusions
Our results suggest that increased amygdala reactivity and differential functional coupling with prefrontal circuitries may shape complex decisions and behavioral responses to socially ambiguous cues.
doi:10.1016/j.biopsych.2009.06.017
PMCID: PMC3013358  PMID: 19709644
Amygdala; cingulate; facial expressions; fMRI; prefrontal cortex; social decision making
13.  The Impact of a Single Administration of Intranasal Oxytocin on the Recognition of Basic Emotions in Humans: A Meta-Analysis 
Neuropsychopharmacology  2013;38(10):1929-1936.
Many studies have highlighted the potential of oxytocin (OT) to enhance facial affect recognition in healthy humans. However, inconsistencies have emerged with regard to the influence of OT on the recognition of specific emotional expressions (happiness, anger, fear, surprise, disgust, and sadness). In this study, we conducted a meta-analysis of seven studies comprising 381 research participants (71 females), examining responses to the basic emotion types to assess whether OT enhances the recognition of emotion from human faces and whether this is influenced by the emotion expressed and the exposure time of the face. Results showed that intranasal OT administration enhances emotion recognition of faces overall, with a Hedges g effect size of 0.29 (see the sketch after this entry). When the analysis was restricted to facial expression types, significant effects of OT on recognition accuracy were found specifically for happy and fearful faces. We also found that effect sizes increased to moderate when the exposure time of the photograph was restricted to early-phase recognition (<300 ms) for happy and angry faces, or to later-phase recognition (>300 ms) for fearful faces. The results of the meta-analysis further suggest that OT has potential as a treatment to improve the recognition of emotion in faces, allowing individuals to improve their insight into the intentions, desires, and mental states of others.
doi:10.1038/npp.2013.86
PMCID: PMC3746698  PMID: 23575742
behavioral science; emotion; neuroendocrinology; neuropeptides; psychopharmacology; oxytocin; perception; face; expression; peptide
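A minimal sketch (not the meta-analysis code) of the Hedges g effect size reported above: Cohen's d computed from two group means and a pooled standard deviation, scaled by the small-sample correction factor J = 1 - 3/(4·df - 1). The group statistics below are made up for illustration.

```python
# Hedges' g from summary statistics of two independent groups.
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / pooled_sd          # Cohen's d
    j = 1.0 - 3.0 / (4.0 * df - 1.0)   # small-sample correction
    return j * d

# e.g., OT group vs. placebo group on recognition accuracy (made-up numbers)
print(round(hedges_g(0.82, 0.78, 0.12, 0.13, 30, 30), 2))
```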
14.  Visual Scanning Patterns and Executive Function in Relation to Facial Emotion Recognition in Aging 
Objective
The ability to perceive facial emotion varies with age. Relative to younger adults (YA), older adults (OA) are less accurate at identifying fear, anger, and sadness, and more accurate at identifying disgust. Because different emotions are conveyed by different parts of the face, changes in visual scanning patterns may account for age-related variability. We investigated the relation between scanning patterns and recognition of facial emotions. Additionally, as frontal-lobe changes with age may affect scanning patterns and emotion recognition, we examined correlations between scanning parameters and performance on executive function tests.
Methods
We recorded eye movements from 16 OA (mean age 68.9) and 16 YA (mean age 19.2) while they categorized facial expressions and non-face control images (landscapes), and we administered standard tests of executive function.
Results
OA were less accurate than YA at identifying fear (p<.05, r=.44) and more accurate at identifying disgust (p<.05, r=.39). OA fixated less than YA on the top half of the face for disgust, fearful, happy, neutral, and sad faces (p’s<.05, r’s≥.38), whereas there was no group difference for landscapes. For OA, executive function was correlated with recognition of sad expressions and with scanning patterns for fearful, sad, and surprised expressions.
Conclusion
We report significant age-related differences in visual scanning that are specific to faces. The observed relation between scanning patterns and executive function supports the hypothesis that frontal-lobe changes with age may underlie some changes in emotion recognition.
doi:10.1080/13825585.2012.675427
PMCID: PMC3448814  PMID: 22616800
Aging; Emotion Recognition; Visual Scanning; Executive Function; Frontal Lobes
15.  What’s in a Smile? Maternal Brain Responses to Infant Facial Cues 
Pediatrics  2008;122(1):40-51.
Objectives
To determine how a mother’s brain responds to her own baby’s facial expressions, comparing happy, neutral and sad face affect.
Methods
In an event-related functional MRI study, 28 first-time mothers were shown novel face images of their own 5–10 month-old baby and a matched unknown baby. Sixty unique stimuli from 6 categories (own-happy, own-neutral, own-sad, unknown-happy, unknown-neutral and unknown-sad) were presented randomly for 2 seconds each, with a variable 2–6 second inter-stimulus interval.
Results
Key dopamine-associated reward processing regions of the brain were activated when mothers viewed their own baby’s face, compared to an unknown baby face. These included the ventral tegmental area / substantia nigra regions, the striatum, and frontal lobe regions involved in 1) emotion processing (medial prefrontal, anterior cingulate and insula cortex), 2) cognition (dorsolateral prefrontal cortex) and 3) motor/behavioral outputs (primary motor area) (P<0.001, false discovery rate corrected [FDR] q<0.05). Happy, but not neutral or sad own-infant faces, activated nigrostriatal brain regions interconnected by dopaminergic neurons (P<0.0005, FDR q<0.05), including the substantia nigra and dorsal putamen. A region-of-interest analysis revealed that activation in these regions was related to positive infant affect (happy>neutral>sad) for each own-unknown baby face contrast.
Conclusions
When first-time mothers see their own baby’s face, an extensive brain network appears to be activated, wherein affective and cognitive information may be integrated and directed toward motor/behavioral outputs. Dopaminergic reward-related brain regions are activated specifically in response to happy, but not sad, baby faces. Understanding how a mother responds uniquely to her own baby, when smiling or crying, may be the first step in understanding the neural basis of mother-infant attachment.
doi:10.1542/peds.2007-1566
PMCID: PMC2597649  PMID: 18595985
attachment; dopamine; maternal responsiveness; mother-child relations; neuroimaging
16.  ‘Can you look me in the face?' Short-term SSRI Administration Reverts Avoidant Ocular Face Exploration in Subjects at Risk for Psychopathology 
Neuropsychopharmacology  2014;39(13):3059-3066.
Anxiety and depression are associated with altered ocular exploration of facial stimuli, which could have a role in the misinterpretation of ambiguous emotional stimuli. However, it is unknown whether a similar pattern is seen in individuals at risk for psychopathology and whether this can be modified by pharmacological interventions used in these disorders. In Study 1, eye gaze movement during face discrimination was compared in volunteers with high vs low neuroticism scores on the Eysenck Personality Questionnaire. Facial stimuli either displayed a neutral, happy, or fearful expression. In Study 2, volunteers with high neuroticism were randomized in a double-blind design to receive the selective serotonin reuptake inhibitor citalopram (20 mg) or placebo for 7 days. On the last day of treatment, eye gaze movement during face presentation and the recognition of different emotional expressions was assessed. In Study 1, highly neurotic volunteers showed reduced eye gaze towards the eyes vs mouth region of the face compared with low neurotic volunteers. In Study 2, citalopram increased gaze maintenance over the face stimuli compared with placebo and enhanced recognition of positive vs negative facial expressions. Longer ocular exploration of happy faces correlated positively with recognition of positive emotions. Individuals at risk for psychopathology presented an avoidant pattern of ocular exploration of faces. Short-term SSRI administration reversed this bias before any mood or anxiety changes. This treatment effect may improve the capacity to scan social stimuli and contribute to the remediation of clinical symptoms related to interpersonal difficulties.
doi:10.1038/npp.2014.159
PMCID: PMC4229577  PMID: 25035080
17.  No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces 
Autism Research and Treatment  2014;2014:345878.
Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.
doi:10.1155/2014/345878
PMCID: PMC3909988  PMID: 24527213
18.  Electrophysiological Evidence of Attentional Biases in Social Anxiety Disorder 
Psychological medicine  2008;39(7):1141-1152.
Background
Previous studies investigating attentional biases in social anxiety disorder (SAD) have yielded mixed results. Recent event-related potential (ERP) studies employing the dot-probe paradigm in non-anxious participants have shown that the P1 component is sensitive to visuospatial attention toward emotional faces. We used a dot-probe task in conjunction with high-density ERPs and source localization to investigate attentional biases in SAD.
Method
Twelve SAD and 15 control participants performed a modified dot-probe task using angry-neutral and happy-neutral face-pairs. The P1 component elicited by face-pairs was analyzed to test the hypothesis that SAD participants would display early hypervigilance to threat-related cues. The P1 component to probes replacing angry, happy or neutral faces was used to evaluate whether SAD participants show sustained hypervigilance or rather decreased visual processing of threat-related cues at later processing stages.
Results
Compared to controls, SAD participants showed relatively (a) potentiated P1 amplitudes and fusiform gyrus activation to angry-neutral vs. happy-neutral face-pairs; (b) decreased P1 amplitudes to probes replacing emotional (angry and happy) vs. neutral faces; and (c) higher sensitivity (d′) to probes following angry-neutral vs. happy-neutral face-pairs (see the sketch after this entry). SAD participants also showed significantly shorter reaction times to probes replacing angry vs. happy faces, although no between-group differences in reaction time emerged.
Conclusions
The results provide electrophysiological support for early hypervigilance to angry faces in SAD with involvement of the fusiform gyrus, and reduced visual processing of emotionally salient locations at later stages of information processing, which might be a manifestation of attentional avoidance.
doi:10.1017/S0033291708004820
PMCID: PMC3204217  PMID: 19079826
Social anxiety disorder; social phobia; electrophysiology; hypervigilance-avoidance hypothesis; attention; faces; anger
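A minimal sketch of the signal-detection sensitivity index d′ reported in the Results above, computed in the standard way as z(hit rate) − z(false-alarm rate) via the inverse normal CDF. The rates below are illustrative, not data from the study.

```python
# d' (sensitivity) from hit and false-alarm rates.
from scipy.stats import norm

def d_prime(hit_rate, fa_rate):
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g., probe detection after angry-neutral vs. happy-neutral face-pairs
print(round(d_prime(0.90, 0.20), 2))  # -> 2.12
```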
19.  Invisible emotional expressions influence social judgments and pupillary responses of both depressed and non-depressed individuals 
We used filtered low spatial frequency images of facial emotional expressions (angry, fearful, happy, sad, or neutral faces) blended with a high-frequency image of the same face bearing a neutral expression, so as to obtain a "hybrid" face image that masked the subjective perception of its emotional expression. Participants fell into three groups: healthy controls (N = 49), recovered previously depressed individuals (N = 79), and currently depressed individuals (N = 36). All participants were asked to rate how friendly the person in the picture looked; simultaneously, we recorded their pupillary responses with an infrared eye-tracker. We expected that depressed individuals (whether currently or previously depressed) would show a negative bias and therefore rate the negative emotional faces, even though the emotions were invisible, as more negative (i.e., less friendly) than the healthy controls would. Similarly, we expected that depressed individuals would overreact to the negative emotions and that this would produce greater dilations of the pupil's diameter than those shown by controls for the same emotions. Although we observed the expected pattern of effects of the hidden emotions on both ratings and pupillary changes, neither response differed significantly among the three groups of participants. The implications of this finding are discussed.
doi:10.3389/fpsyg.2013.00291
PMCID: PMC3660658  PMID: 23734141
depression; pupillometry; subliminal perception; facial emotions; face hybrids
20.  Influence of Emotional Expression on Memory Recognition Bias in Schizophrenia as Revealed by fMRI 
Schizophrenia Bulletin  2009;36(4):800-810.
We recently showed that, in healthy individuals, emotional expression influences memory for faces both in terms of accuracy and, critically, in memory response bias (the tendency to classify stimuli as previously seen or not, regardless of whether this was the case; see the sketch after this entry). Although schizophrenia has been shown to be associated with deficits in episodic memory and emotional processing, the relation between these processes in this population remains unclear. Here, we used our previously validated paradigm to directly investigate the modulation of memory recognition by emotion. Twenty patients with schizophrenia and matched healthy controls completed a functional magnetic resonance imaging (fMRI) study of recognition memory for happy, sad, and neutral faces. Brain activity associated with the response bias was obtained by correlating this measure with the contrast subjective old (i.e., hits and false alarms) minus subjective new (i.e., misses and correct rejections) for sad and happy expressions. Although patients exhibited overall lower memory performance than controls, they showed the same effects of emotion on memory, both in terms of accuracy and bias. For sad faces, the similar behavioral pattern between groups was mirrored by a largely overlapping neural network, mostly involved in familiarity-based judgments (e.g., parahippocampal gyrus). In contrast, controls activated a much larger set of regions for happy faces, including areas thought to underlie recollection-based memory retrieval (e.g., superior frontal gyrus and hippocampus) and novelty detection (e.g., amygdala). This study demonstrates that, despite overall lower memory accuracy, emotional memory is intact in schizophrenia, although emotion-specific differences in brain activation exist, possibly reflecting different strategies.
doi:10.1093/schbul/sbn172
PMCID: PMC2894593  PMID: 19176471
amygdala; faces; facial expression; functional neuroimaging; sad; happy; symptomatology; recollection; familiarity
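A minimal sketch of one standard estimator (not necessarily the authors') for the recognition response bias defined above, i.e., the tendency to classify stimuli as previously seen: the signal-detection criterion c = −(z(hit rate) + z(false-alarm rate))/2, where negative values indicate a liberal bias toward responding "previously seen". The rates below are illustrative.

```python
# Response bias (criterion c) from hit and false-alarm rates.
from scipy.stats import norm

def criterion(hit_rate, fa_rate):
    return -(norm.ppf(hit_rate) + norm.ppf(fa_rate)) / 2.0

# e.g., happy faces: many hits but also many false alarms -> liberal bias
print(round(criterion(0.85, 0.40), 2))  # -> -0.39
```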
21.  Normative data on development of neural and behavioral mechanisms underlying attention orienting toward social-emotional stimuli: An exploratory study 
Brain research  2009;1292C:61-70.
The ability of positive and negative facial signals to influence attention orienting is crucial to social functioning. Given the dramatic developmental change in neural architecture supporting social function, positive and negative facial cues may influence attention orienting differently in relatively young or old individuals. However, virtually no research examines such age-related differences in the neural circuitry supporting attention orienting to emotional faces. We examined age-related correlations in attention-orienting biases to positive and negative face emotions in a healthy sample (N=37; 9-40 years old) using functional magnetic resonance imaging and a dot-probe task. The dot-probe task in an fMRI setting yields both behavioral and neural indices of attention biases towards or away from an emotional cue (happy or angry face). In the full sample, angry-face attention bias scores did not correlate with age, and age did not correlate with brain activation to angry faces. However, age did positively correlate with attention bias towards happy faces; age also negatively correlated with left cuneus and left caudate activation to a happy-bias fMRI contrast. Secondary analyses suggested age-related changes in attention bias to happy faces. The tendency in younger children to direct attention away from happy faces (relative to neutral faces) was diminished in the older age groups, in tandem with increasing neural deactivation. Implications for future work on developmental changes in attention-emotion processing are discussed.
doi:10.1016/j.brainres.2009.07.045
PMCID: PMC2739245  PMID: 19631626
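A minimal sketch of a common dot-probe attention bias score (not necessarily the exact formula used in the study above): mean reaction time when the probe replaces the neutral face minus mean reaction time when it replaces the emotional face, so that positive values indicate orienting toward the emotional face. The reaction times below are hypothetical.

```python
# Dot-probe attention bias score from trial reaction times (ms).
from statistics import mean

def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
    # Positive = attention oriented toward the emotional face.
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

happy_bias = bias_score([512, 498, 530, 505], [488, 495, 501, 479])
print(f"happy-face bias = {happy_bias:.1f} ms")
```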
22.  Age-related differences in affective responses to and memory for emotions conveyed by music: a cross-sectional study 
There is mounting evidence that aging is associated with the maintenance of positive affect and the decrease of negative affect in the service of emotion regulation goals. Previous empirical studies have primarily focused on visual or autobiographical forms of emotion communication. To date, little investigation has been done on musical emotions. The few studies that have addressed aging and emotions in music were mainly interested in emotion recognition, thus leaving unexplored the question of how aging may influence emotional responses to, and memory for, emotions conveyed by music. In the present study, eighteen older (60–84 years) and eighteen younger (19–24 years) listeners were asked to evaluate the strength of their experienced emotion for happy, peaceful, sad, and scary musical excerpts (Vieillard et al., 2008) while facial muscle activity was recorded. Participants then performed an incidental recognition task, followed by a task in which they judged to what extent they experienced happiness, peacefulness, sadness, and fear when listening to music. Compared to younger adults, older adults (a) reported stronger emotional reactivity for happiness than for other emotion categories, (b) showed increased zygomatic activity for scary stimuli, (c) were more likely to falsely recognize happy music, and (d) showed decreased responsiveness to sad and scary music. These results are in line with previous findings and extend them to emotion experience and memory recognition, corroborating the view that emotional responses to music change with age in a positive direction, away from negativity.
doi:10.3389/fpsyg.2013.00711
PMCID: PMC3797547  PMID: 24137141
aging; musical emotions; emotional responses; facial muscle activity; incidental recognition; positivity effect
23.  Differential Brain Activation to Angry Faces by Elite Warfighters: Neural Processing Evidence for Enhanced Threat Detection 
PLoS ONE  2010;5(4):e10096.
Background
Little is known about the neural basis of elite performers and their optimal performance in extreme environments. The purpose of this study was to examine brain processing differences between elite warfighters and comparison subjects in brain structures that are important for emotion processing and interoception.
Methodology/Principal Findings
Off-duty Navy Sea, Air, and Land Forces (SEALs; n = 11) were compared with healthy male volunteers (n = 23) while performing a simple emotion face-processing task during functional magnetic resonance imaging. Irrespective of the target emotion, elite warfighters relative to comparison subjects showed relatively greater right-sided insula, but attenuated left-sided insula, activation. Navy SEALs showed selectively greater bilateral insula activation to angry target faces relative to fearful or happy target faces. This was not accounted for by contrasting positive versus negative emotions. Finally, these individuals also showed slower response latencies to fearful and happy target faces than did comparison subjects.
Conclusions/Significance
These findings support the hypothesis that elite warfighters deploy greater processing resources toward potential threat-related facial expressions and reduced processing resources to non-threat-related facial expressions. Moreover, rather than expending more effort in general, elite warfighters show more focused neural and performance tuning. In other words, greater neural processing resources are directed toward threat stimuli and processing resources are conserved when facing a nonthreat stimulus situation.
doi:10.1371/journal.pone.0010096
PMCID: PMC2854680  PMID: 20418943
24.  Frontolimbic Responses to Emotional Face Memory: The Neural Correlates of First Impressions 
Human brain mapping  2009;30(11):3748-3758.
First impressions, especially of emotional faces, may critically impact later evaluation of social interactions. Activity in limbic regions, including the amygdala and ventral striatum, has previously been shown to correlate with identification of emotional content in faces; however, little work has been done describing how these signals may influence emotional face memory. We report an event-related fMRI study in 21 healthy adults where subjects attempted to recognize a neutral face that was previously viewed with a threatening (angry or fearful) or non-threatening (happy or sad) affect. In a hypothesis-driven region of interest analysis, we found that neutral faces previously presented with a threatening affect recruited the left amygdala. In contrast, faces previously presented with a non-threatening affect activated the left ventral striatum. A whole-brain analysis revealed increased response in the right orbitofrontal cortex to faces previously seen with threatening affect. These effects of prior emotion were independent of task performance, with differences being seen in the amygdala and ventral striatum even if only incorrect trials were considered. The results indicate that a network of frontolimbic regions may provide emotional bias signals during facial recognition.
doi:10.1002/hbm.20803
PMCID: PMC4244703  PMID: 19530218
amygdala; ventral striatum; fMRI; face; memory; emotion; orbitofrontal cortex
25.  Multisensory perception of the six basic emotions is modulated by attentional instruction and unattended modality 
Previous studies have shown that perceptions of facial and vocal affective expressions interact with each other. Facial expressions usually dominate vocal expressions when we perceive the emotions of face–voice stimuli. In most of these studies, participants were instructed to pay attention to the face or voice. Few studies have compared perceived emotions with and without specific instructions regarding the modality to which attention should be directed. Moreover, these studies used face–voice combinations expressing two opposing emotions, which limits the generalizability of the findings. The purpose of this study is to examine whether emotion perception is modulated by instructions to pay attention to the face or voice, using the six basic emotions. We also examine the modality dominance between face and voice for each emotion category. Before the experiment, we recorded faces and voices expressing the six basic emotions and combined these faces and voices orthogonally. Consequently, the emotional valence of the visual and auditory information was either congruent or incongruent. The experiment comprised unisensory and multisensory sessions. The multisensory session was divided into three blocks according to whether an instruction was given to pay attention to a given modality (face attention, voice attention, and no instruction). Participants judged whether the speaker expressed happiness, sadness, anger, fear, disgust, or surprise. Our results revealed that instructions to pay attention to one modality and the congruency of emotions between modalities modulated modality dominance, and that modality dominance differed across emotion categories. In particular, the modality dominance for anger changed according to each instruction. Analyses also revealed that the modality dominance suggested by the congruency effect can be explained in terms of facilitation and interference effects (see the sketch after this entry).
doi:10.3389/fnint.2015.00001
PMCID: PMC4313707
attentional instruction; audiovisual integration; unattended stimuli; modality dominance; congruency effect; emotion perception
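A minimal sketch of one common decomposition (not necessarily the authors' exact definitions) of the congruency effect mentioned above into facilitation and interference components, relative to a unisensory (face-only) baseline. The accuracies below are hypothetical proportions of face-consistent judgments.

```python
# Decomposing a congruency effect into facilitation and interference.
def decompose(acc_congruent, acc_unisensory, acc_incongruent):
    facilitation = acc_congruent - acc_unisensory    # gain from a congruent voice
    interference = acc_unisensory - acc_incongruent  # cost of an incongruent voice
    congruency_effect = acc_congruent - acc_incongruent  # = facilitation + interference
    return facilitation, interference, congruency_effect

f, i, c = decompose(0.95, 0.88, 0.70)
print(f"facilitation={f:.2f}, interference={i:.2f}, congruency effect={c:.2f}")
```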
