Schizophr Bull. Sep 2011; 37(5): 1001–1008.
Published online Feb 15, 2010. doi: 10.1093/schbul/sbq006
PMCID: PMC3160215
How Do Schizophrenia Patients Use Visual Information to Decode Facial Emotion?
Junghee Lee,*1,2 Frédéric Gosselin,3 Jonathan K. Wynn,1,2 and Michael F. Green1,2
1Department of Psychiatry & Biobehavioral Sciences, Semel Institute for Neuroscience and Human Behavior, UCLA, Los Angeles, CA 90095-6968
2VA Greater Los Angeles Healthcare System, Los Angeles, CA
3Department of Psychology, University of Montreal, Montréal, QC, Canada
*To whom correspondence should be addressed; tel: (310) 794-9010, fax: (310) 825-6626, e-mail: jungheelee@ucla.edu.
Impairment in recognizing facial emotions is a prominent feature of schizophrenia patients, but the underlying mechanism of this impairment remains unclear. This study investigated the specific aspects of visual information that are critical for schizophrenia patients to recognize emotional expression. Using the Bubbles technique, we probed the use of visual information during a facial emotion discrimination task (fear vs. happy) in 21 schizophrenia patients and 17 healthy controls. Visual information was sampled through randomly located Gaussian apertures (or “bubbles”) at 5 spatial frequency scales. Online calibration of the amount of face exposed through bubbles was used to ensure 75% overall accuracy for each subject. Least-square multiple linear regression analyses between sampled information and accuracy were performed to identify critical visual information that was used to identify emotional expression. To accurately identify emotional expression, schizophrenia patients required more exposure of facial areas (i.e., more bubbles) compared with healthy controls. To identify fearful faces, schizophrenia patients relied less on bilateral eye regions at high-spatial frequency compared with healthy controls. For identification of happy faces, schizophrenia patients relied on the mouth and eye regions; healthy controls did not utilize eyes and used the mouth much less than patients did. Schizophrenia patients needed more facial information to recognize emotional expression of faces. In addition, patients differed from controls in their use of high-spatial frequency information from eye regions to identify fearful faces. This study provides direct evidence that schizophrenia patients employ an atypical strategy of using visual information to recognize emotional faces.
Keywords: schizophrenia, emotional face recognition, spatial frequency, bubbles tasks, potent information, strategy
Emotional expression of faces is one of the most important sources of socially relevant information that is needed for adaptive behavior. Schizophrenia patients have difficulty in identifying and discriminating emotional expression of faces.1,2 These impairments are found consistently in first-episode and chronic phases of illness3,4 and are not strongly associated with the severity or chronicity of the illness.5,6 Impairments in emotional face recognition cannot be explained by antipsychotic medications7,8 and appear to be closely related to daily functioning of patients.9,10
A growing body of research has attempted to better understand the nature of impaired recognition of facial emotion in schizophrenia. Patients appear to have less clear categorical boundaries between emotions compared with controls.11 Using signal detection indices, some studies showed that schizophrenia patients have difficulty in differentiating one emotion from others and also tend to misattribute one emotion to others.12,13 In addition, several studies found that impaired emotional recognition of faces was associated with impaired visual processing in schizophrenia.14–18 However, the mechanisms through which visual processing is related to impaired emotional recognition in schizophrenia remain to be determined. To better characterize the relationship between impaired emotional recognition and visual processing, this study aimed to address an important question: how do schizophrenia patients utilize facial visual information to recognize facial emotion?
In studies with healthy individuals, it has been well established that distinct spatial frequency ranges play different roles in facial processing, including emotional face recognition. In general, low-spatial frequency is associated with global configural facial information and provides coarse emotional cues, whereas high-spatial frequency is related to featural facial processing and is important for detailed analysis of facial traits, such as the precise recognition of identity.19,20 Furthermore, healthy individuals show a particular strategy of collecting information of distinct spatial frequencies from certain facial regions when recognizing facial emotions.21 For example, healthy individuals rely more on high-frequency visual information from the eyes to recognize fear but on visual information from the mouth to identify happiness.
The goal of this study was to determine how schizophrenia patients utilize visual facial information to identify the emotional content of faces. To do so, we employed the Bubbles technique. Developed by Gosselin and Schyns,22 the Bubbles technique isolates the visual information that is used to recognize or categorize visual objects (e.g., facial affect). In this context, the visual information that is critical for making judgments, including identification of facial emotion, is referred to as “potent” information. In a typical experiment using the Bubbles technique, the objects are sampled through an opaque field with randomly located Gaussian apertures or “bubbles,” and participants are presented with partial information about an object that is revealed through the bubbles. Sampling can be done separately by spatial frequency bandwidths so that fine and coarse information can be presented simultaneously.23 Critical visual information underlying accurate recognition (i.e., potent information) can be identified by performing regression analyses on the location of the bubbles and the participants’ response (i.e., accuracy data). In the context of facial affect perception, this procedure makes it possible to determine what facial information is critical for correct identification of emotional expression and at which spatial frequency. We focused on fearful and happy emotions because they are highly distinctive21 and examined the following research questions: (1) do schizophrenia patients require more visual information to correctly identify emotional expression? and (2) what specific visual information (i.e., parts of the face and spatial frequency) do schizophrenia patients rely on when identifying emotional expression?
Methods

Participants
Twenty-one patients (8 female) with schizophrenia (n = 18) or schizoaffective disorder (n = 3) and 17 healthy controls (6 female) were recruited for this study. Participants were recruited from a larger National Institute of Mental Health study “Early Visual Processing in Schizophrenia” (PI: M.F.G.). Schizophrenia patients were recruited from outpatient treatment clinics at the Veterans Affairs (VA) Greater Los Angeles Healthcare System and from local board and care facilities. Schizophrenia patients met diagnostic criteria for schizophrenia or schizoaffective disorder using the Structured Clinical Interview for DSM-IV (SCID) Axis I Disorders.24 Exclusion criteria for patients included: (1) substance abuse or dependence in the last 6 months, (2) current major depressive episode, (3) mental retardation based on review of medical records, (4) history of loss of consciousness for more than 1 h, (5) an identifiable neurological disorder, or (6) insufficient fluency in English.
Normal control participants were recruited through flyers posted in the local community and website postings. Exclusion criteria for control participants included: (1) history of schizophrenia or other psychotic disorder, bipolar disorder, recurrent depression, substance dependence, or any substance abuse in the last 6 months based on the SCID,25 (2) current major depressive episode, (3) any of the following Axis II disorders: avoidant, paranoid, schizoid, or schizotypal, based on the SCID for Axis II disorders,26 (4) schizophrenia or other psychotic disorder in a first-degree relative, (5) any significant neurological disorder or head injury, or (6) insufficient fluency in English.
Schizophrenia patients and normal controls were comparable in terms of age and parental education but not personal education (for demographic information, see table 1). Clinical symptoms for patients were rated using the expanded 24-item version of the Brief Psychiatric Rating Scale (BPRS)27,28 and the Scale for Assessment of Negative Symptoms (SANS)29 (table 1). All the patients were taking antipsychotic medications at the time of testing (aripiprazole, n = 9; clozapine, n = 2; olanzapine, n = 2; prolixin decanoate, n = 1; quetiapine, n = 4; risperidone, n = 2). All participants had normal or corrected-to-normal vision of at least 20/30.
Table 1.
Demographics of Schizophrenia Patients and Healthy Controls
All interviewers were trained through the Treatment Unit of the Department of Veterans Affairs VISN 22 Mental Illness Research, Education, and Clinical Center. SCID interviewers were trained to a minimum kappa of 0.75 for key psychotic and mood items, and symptom raters were trained to a minimum intraclass correlation of 0.80. All participants were evaluated for the capacity to give informed consent and provided written informed consent after all procedures were fully explained, according to procedures approved by the Institutional Review Boards at UCLA and the VA Greater Los Angeles Healthcare System.
Bubbles Experiment
The Bubbles experiment was adapted from Gosselin and Schyns.22 The experiment was programmed with the Psychophysics Toolbox30 and the Pyramid Toolbox31 for Matlab. The original facial stimuli32 came from 5 Caucasian female and 5 Caucasian male posers. Each poser displayed both a fearful and a happy facial expression, yielding 20 different facial expressions. Faces were normalized for the location of the eyes and mouth and for lighting and were presented through an elliptical aperture to exclude features outside a face. Figure 1 illustrates the experimental stimulus-generation procedure for a given trial (for more details, see 22,33). First (top row), the original facial stimuli (upper left image) were decomposed into 5 bands of spatial frequency bandwidth of one octave each (85.3–42.7, 42.7–21.3, 21.3–10.6, 10.6–5.3, and 5.3–2.6 cycles per face width; scale 1–5 from the highest to lowest) using a Laplacian transform. The remaining bandwidth of <2.6 cycles per face served as constant background. Second (middle row), we created a number of randomly positioned Gaussian apertures (i.e., the bubbles). Each aperture had an SD of 3 cycles per image, such that the size of the bubbles at each spatial scale was adjusted accordingly. This ensured that bubbles revealed similar amounts of visual information across spatial frequencies. Third (bottom row), randomly sampled Gaussian apertures were independently applied to decomposed face images at each of the 5 spatial frequency bandwidths to reveal partial facial information at each spatial frequency. Finally, the information revealed at each spatial frequency was summed to generate an experimental stimulus for each trial (lower right image at the bottom row). This procedure ensured that for each trial, facial information revealed by the bubbles (i.e., facial features at each spatial frequency) was randomly sampled.
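The decompose-sample-recombine pipeline can be sketched in a few lines. The original experiment was implemented in Matlab with the Psychophysics and Pyramid Toolboxes; the following is only an illustrative NumPy/SciPy sketch. The random stand-in face, the pixel image size, and the per-scale bubble widths are all hypothetical (the paper specifies bubble SD in cycles per image, and used a Laplacian transform rather than this difference-of-Gaussians approximation).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def octave_bands(image, n_bands=5):
    """Split an image into one-octave band-pass layers plus a low-pass
    residual, Laplacian-pyramid style (differences of blurred copies)."""
    blurred = [image]
    for k in range(n_bands):
        blurred.append(gaussian_filter(blurred[-1], sigma=2.0 ** k))
    bands = [blurred[k] - blurred[k + 1] for k in range(n_bands)]
    return bands, blurred[-1]  # 5 spatial scales + constant background

def bubble_mask(shape, n_bubbles, sigma, rng):
    """Sum of randomly centred Gaussian apertures ("bubbles"), clipped to [0, 1]."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

rng = np.random.default_rng(0)
face = rng.standard_normal((256, 256))  # stand-in for a normalized face image
bands, background = octave_bands(face)
sigmas = [3, 6, 12, 24, 48]             # illustrative: wider bubbles at coarser scales
stimulus = background + sum(b * bubble_mask(b.shape, 10, s, rng)
                            for b, s in zip(bands, sigmas))
```

Because the bands plus the residual sum back to the original image, a stimulus whose masks were all ones would reproduce the full face; partial masks reveal an independent random sample of each scale, as in the paper.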
Fig. 1.
Illustration of How the Bubbles Stimuli Were Generated. First, each of the original faces was decomposed into 5 spatial frequency bandwidths of one octave each (the first row). Second, each spatial frequency bandwidth was independently sampled with randomly positioned Gaussian apertures.
At the beginning of each trial, one experimental stimulus appeared on the screen subtending 5.73 × 5.73 degrees of visual angle, and participants were asked to press a key on a computer keyboard to indicate whether the emotion was fear or happiness. Stimuli remained on the screen until participants made their response. Accuracy was recorded, and feedback was given on each trial. We adjusted the number of bubbles on a trial-by-trial basis equally at all spatial frequencies using the QUEST algorithm34 to maintain correct differentiation at a rate of 75%. No separate adjustment was made for each emotion. Participants finished 20 practice trials with the experimental stimuli generated by the original 20 emotional expression images. The experiment consisted of 15 blocks of 80 trials each (i.e., 20 emotional expressions, repeated 4 times) and lasted about an hour.
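The trial-by-trial calibration can be illustrated with a toy adaptive rule. The study used the Bayesian QUEST algorithm; the simple weighted up/down rule and the simulated logistic observer below are stand-ins chosen only to show how such a staircase settles at the 75% target. All numbers are hypothetical.

```python
import numpy as np

def simulate_adaptive_bubbles(n_trials=1200, target=0.75, step=4.0, seed=0):
    """Toy stand-in for QUEST: nudge the bubble count after every trial so
    that long-run accuracy settles near the target. On an error the count
    rises by step * target; on a correct response it falls by
    step * (1 - target), so the process is in equilibrium exactly when
    P(correct) = target."""
    rng = np.random.default_rng(seed)
    n_bubbles = 40.0  # starting value (illustrative)
    outcomes = []
    for _ in range(n_trials):
        # Hypothetical observer: accuracy grows with bubble count (logistic),
        # spanning 50% (chance for 2 alternatives) to 100%.
        p_correct = 0.5 + 0.5 / (1.0 + np.exp(-(n_bubbles - 40.0) / 10.0))
        correct = rng.random() < p_correct
        n_bubbles = max(1.0, n_bubbles + step * (target - correct))
        outcomes.append(correct)
    return n_bubbles, float(np.mean(outcomes[-600:]))

final_n, recent_accuracy = simulate_adaptive_bubbles()
```

Running this, the accuracy over the second half of the simulated session hovers near the 75% target, which is the matching-by-performance property the analysis relies on.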
Statistical Analysis
First, we examined whether groups differed in the amount of information needed for identifying emotional expressions. Because the number of bubbles was adjusted proportionally across all spatial frequencies based on the performance, we compared the average number of bubbles (i.e., the amount of face exposed) for correct trials. Second, to determine the facial features that participants used to discriminate fearful versus happy emotions, we performed least-square multiple linear regression on the location of the center of the bubbles and accuracy data. The plane of regression coefficients from this analysis is called a classification image. We computed a classification image per subject per emotion at each spatial frequency. To increase signal-to-noise ratio, we summed all classification images within each group per emotion and per band of spatial frequency. We then smoothed the classification images (full width at half maximum = 23.54 pixels) and performed a Z score transformation. To determine which parts of faces participants utilized significantly more (i.e., potent information), we applied a 1-tailed Pixel test to the group classification images (figure 2; Sr = 32.93 pixels; Zcrit = 3.67; P = .05). This statistical threshold was corrected for multiple comparisons while taking into account the spatial correlation inherent to classification images.35
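The classification-image step can be sketched as follows. This is a hedged simplification: with random, roughly orthogonal bubble masks, the least-squares regression plane is well approximated by a reverse-correlation sum of masks weighted by mean-centred accuracy. The image size, trial count, and simulated observer below are hypothetical; only the FWHM and the smooth-then-Z pipeline follow the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def classification_image(masks, accuracy, fwhm=23.54):
    """Relate revealed pixels to response accuracy. Masks are (trials, h, w);
    the weighted sum approximates the least-squares coefficient plane, which
    is then smoothed (FWHM in pixels) and Z-transformed, as in the paper."""
    weights = accuracy - accuracy.mean()
    ci = np.tensordot(weights, masks, axes=1)          # sum over trials
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> Gaussian SD
    ci = gaussian_filter(ci, sigma)
    return (ci - ci.mean()) / ci.std()                 # Z score transform

rng = np.random.default_rng(1)
masks = rng.random((500, 64, 64))  # 500 hypothetical trials, 64 x 64 masks
# Hypothetical observer whose accuracy depends on exposure of an "eye" pixel:
accuracy = (masks[:, 20, 20] + 0.3 * rng.standard_normal(500) > 0.5).astype(float)
z_map = classification_image(masks, accuracy)
```

Pixels whose exposure predicts correct responses stand out as high Z values; the Pixel test then supplies a threshold (Zcrit = 3.67 here) corrected for the map's spatial correlation.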
Fig. 2.
Classification Images that Reveal Visual Facial Information Significantly Correlated with Correct Identification of Fear (A) and Happiness (B). For illustrative purposes, the classification images were overlaid with faces decomposed at each spatial frequency.
Results

One patient (a male patient diagnosed with schizoaffective disorder) was excluded from the data analysis because of invalid data. To determine whether schizophrenia patients need more visual information to correctly identify facial emotion, we compared the number of bubbles between schizophrenia patients and controls. Schizophrenia patients required exposure of more facial area to correctly identify facial emotions compared with healthy controls (number of bubbles: 68.7 ± 34.9 vs 38.2 ± 10.0, t35 = 3.47, P < .001, for patients and controls, respectively). The number of bubbles in schizophrenia patients was not associated with clinical symptoms (BPRS total, r = .10, P = .67; SANS total, r = .04, P = .84).
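As a quick sanity check, the reported statistic can be recomputed from the summary values in the text, assuming a pooled-variance two-sample t-test with n = 20 patients (after the exclusion) and n = 17 controls, which matches the reported df of 35:

```python
import math

# Reported summary statistics: bubble counts of 68.7 +/- 34.9 for patients
# (n = 20) and 38.2 +/- 10.0 for controls (n = 17).
m1, s1, n1 = 68.7, 34.9, 20
m2, s2, n2 = 38.2, 10.0, 17

# Pooled-variance two-sample t-test with df = n1 + n2 - 2 = 35.
sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
t = (m1 - m2) / math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
# t comes out near 3.48, consistent with the reported t35 = 3.47.
```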
Figure 2 shows facial information that was significantly correlated with accurate identification of emotions in schizophrenia patients (areas in green) and healthy controls (areas in red); the areas in yellow indicate facial features that both schizophrenia patients and healthy controls used. Figure 3 displays crucial facial features for identification of emotional expression that were summed across spatial frequencies in schizophrenia patients and healthy controls.
Fig. 3.
Images that Summed Potent Visual Information across Spatial Frequencies for Correct Identification of Emotional Expressions: (A) fear in healthy controls, (B) fear in schizophrenia patients, (C) happy in healthy controls, and (D) happy in schizophrenia patients.
For fearful faces (figure 2, first row), facial features that were utilized most effectively in healthy controls were the bilateral eye regions at high-spatial frequency (scale 1) and the mouth regions at mid-range spatial frequencies (scales 2–4). Schizophrenia patients also utilized mouth regions at mid-range spatial frequencies and left eyebrow (scales 2–4), but patients did not use visual information from the eyes at high-spatial frequency (scale 1). For happy faces, healthy controls used the mouth at spatial scale 2 (areas in red and yellow on the second row). In contrast, schizophrenia patients utilized regions around the mouth at high-spatial frequency (scale 1) and the mouth and the eyes at mid-range spatial frequencies (scale 2 and 3; areas in green on the second row).
Figure 2 suggests that schizophrenia patients used an atypical strategy to collect visual information from the eyes and mouth regions, compared with healthy controls. We further examined the data by spatial frequency to determine whether patients used a distinctly different strategy to collect visual information from these facial regions of interest (ROIs; eyes and mouth). We extracted maximum z values from these ROIs for each emotional expression at each spatial scale (i.e., scales 1–4; see table 2) and performed a repeated-measures analysis of variance with emotion as a within-subject factor and group as a between-subject factor for each ROI for each spatial scale separately. For spatial scale 1 (highest spatial frequency), there was a significant emotion by group interaction for the eye region (F1,35 = 9.99, P < .01). Post hoc analyses revealed a significant group difference for fear identification (P < .05), with controls utilizing the eye region more than patients. There was a nonsignificant difference in the opposite direction (controls < patients) for happy faces (P = .14). For the mouth region at spatial scale 1, a main effect of emotion was significant (F1,35 = 8.787, P < .001), indicating that both groups showed higher utilization of the mouth to identify happiness. For spatial scale 2, neither ROI showed a significant effect. For spatial scale 3, there were no significant effects for the eyes. However, there was a main effect of emotion (F1,35 = 9.651, P < .01) and an emotion by group interaction (F1,35 = 4.365, P < .05) for the mouth. Controls depended heavily on the mouth for fear but relatively little for happiness; patients utilized the mouth about the same for both emotions. For spatial scale 4, there were no significant effects for the eyes. A main effect of emotion was significant (F1,35 = 4.62, P < .05) for the mouth, indicating that both groups utilized this region more for fear than happy expressions.
Table 2.
Maximum z Values from Eyes and Mouth Regions
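The 2 × 2 interaction tests reported above can be sketched with simulated data, using a convenient identity: in a two-group (between) by two-condition (within) mixed ANOVA, the interaction F equals the squared t from an independent-samples t-test on the within-subject difference scores. All group means and SDs below are made up for illustration and are not the paper's values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical maximum-z values (eye ROI, spatial scale 1): controls use the
# eyes much more for fear than for happiness; patients do not.
controls_fear = rng.normal(4.0, 1.0, 17)
controls_happy = rng.normal(2.0, 1.0, 17)
patients_fear = rng.normal(2.0, 1.0, 20)
patients_happy = rng.normal(2.5, 1.0, 20)

# The emotion-by-group interaction reduces to comparing the within-subject
# (fear - happy) difference scores across groups: F_interaction = t**2.
t, p = stats.ttest_ind(controls_fear - controls_happy,
                       patients_fear - patients_happy)
F_interaction = t ** 2
```

With these illustrative effect sizes the interaction is clearly significant, mirroring the pattern reported for the eye region at scale 1.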
Discussion

Determining how schizophrenia patients visually decode emotional content from faces is crucial to understanding the nature of impaired emotion perception in schizophrenia. With the Bubbles procedure, we examined how schizophrenia patients employ visual information to judge emotional expression of faces. The amount of visual information revealed by the bubbles across trials was adjusted to maintain 75% accuracy for each participant, so that schizophrenia patients and healthy controls were matched in behavioral performance.
To correctly identify emotional expressions of faces in general, schizophrenia patients required more visual information compared with controls. Moreover, schizophrenia patients used a different strategy of collecting visual information to identify emotional expression. For identifying fearful emotion, schizophrenia patients did not utilize information from the eyes but instead relied on areas around the mouth, whereas healthy controls relied on eyes at high-spatial frequency and areas around the mouth at mid-spatial frequencies. To identify happiness, schizophrenia patients relied on the area around the mouth at high- and mid-range spatial frequencies and the eyes at mid-range spatial frequency, whereas healthy controls only used mid-range spatial frequency information around the mouth. This study is the first, to our knowledge, to identify potent information that is used to make decisions about emotional content of facial stimuli in schizophrenia.
Using the Bubbles technique, this study recreated the information that patients used to discriminate emotional expressions. Not only did schizophrenia patients use different facial features, but they also utilized different spatial frequencies. Notably, this study presented spatial frequency information across all spatial frequency bands simultaneously instead of one spatial frequency bandwidth at a time. In this sense, the current study differs from previous studies that manipulated spatial frequency information in schizophrenia.36 By presenting all spatial frequency bandwidths simultaneously, this study was able to identify which spatial frequency information is most critical at which facial regions. Schizophrenia patients showed an atypical usage of visual information, and this atypical strategy was more prominent in the high-spatial frequency bandwidth. Previous studies on face processing suggested that schizophrenia patients have more difficulty processing configural facial information than featural face information.37,38 Considering the role of high-spatial frequency information in featural face processing,19,20 the current finding suggests that schizophrenia patients have abnormal featural processing when judging emotional expression of faces.
One major distinction between schizophrenia patients and controls is the usage of high-spatial frequency information from the eye regions. Schizophrenia patients did not use high-spatial frequency information from the eyes to identify fearful emotion, whereas this information was critical for controls. When processing fearful faces, healthy individuals tend to activate the amygdala, an effect that is associated with viewing the eye regions, as opposed to other parts of the face.39,40 A previous report that used the Bubbles technique showed that a patient with an amygdala lesion used less of the eye regions at high-spatial frequency when recognizing fear.41 Among schizophrenia patients, studies using functional magnetic resonance imaging found reduced activation associated with processing fearful faces in the amygdala.42,43 Our finding of reduced use of high-spatial frequency information around the eyes may be related to the reduced activation of the amygdala to fear in schizophrenia patients. Patients with autism also show an atypical strategy of using visual information to recognize fearful faces, which is similar to what is seen in schizophrenia patients. In those studies,44,45 autistic patients (as well as their unaffected parents) relied less on the eye areas and more on the mouth areas when judging fear. Furthermore, the parents who used this atypical search strategy to a greater extent were also more likely to be socially aloof, a personality characteristic related to autism.45 Future studies with larger samples will be able to determine whether this aberrant strategy is associated with other characteristics of schizophrenia patients.
Restricted visual scanning of emotional faces has been previously suggested as a possible mechanism of impaired emotional recognition in schizophrenia patients.6,46 In general, visual scanning measured through eye movements (i.e., visual scan paths) provides useful information about where on the face, and for how long, people look when judging emotions. However, these studies do not inform us how schizophrenia patients use visual information. Knowing where someone is looking does not tell us how he/she uses visual information. In contrast, with the Bubbles technique, this study showed which parts of facial features or what levels of spatial frequency were critical to make decisions about emotional expression of faces. Hence, this study provides information that is not available from studies using visual scan paths.
On the surface, the current study resembles previous studies on visual integration of schizophrenia patients.47 For example, both perceptual closure tests of visual integration and the Bubbles task include partially obscured stimuli. However, it is difficult to make any inference about visual integration from the current study because it is possible that participants were making decisions by using specific visual cues that do not require integration (e.g., making a decision based on the amount of white above the iris) instead of mentally filling in the rest of faces from bubbles (i.e., visual integration).
Fearful and happy faces were selected because they have been shown to be associated with the most distinct use of facial visual information.21 Because we used only 2 emotions, however, it is unclear whether patients made their decisions based on the presence of the critical features of one emotion or the absence of the critical features of the other emotion. It is possible that this finding characterizes how patients differentiate fear from happiness rather than how they recognize each emotion in isolation. Thus, it remains to be tested whether schizophrenia patients would show a similar use of visual information when asked to recognize fear and happiness among more alternatives. Reassuringly, healthy controls in this study showed the same search strategies for fear and happiness that were found when people were asked to identify the 6 basic emotions plus neutral faces.21 Another limitation of this study is that we did not record stimulus presentation times during the Bubbles task. It remains to be determined whether schizophrenia patients need longer stimulus presentation time to collect necessary visual information.
In summary, we found that, compared with controls, schizophrenia patients collected different spatial frequency information from different facial regions when judging emotional expression of faces. The atypical usage of visual information in schizophrenia patients suggests abnormal processing of featural facial information. This study helps us to better understand the underlying mechanism of impaired recognition of emotional expression in schizophrenia.
Funding
National Institute of Mental Health Grant MH 43929 (PI: M.F.G.).
Acknowledgments
The authors wish to thank Shelly Crosby for assistance in data collection.
References

1. Mandal MK, Pandey RP, Prasad AB. Facial expressions of emotions and schizophrenia: a review. Schizophr Bull. 1998;24:399–412. [PubMed]
2. Edwards J, Jackson HJ, Pattison PE. Emotion recognition via facial expression and affective prosody in schizophrenia: a methodological review. Clin Psychol Rev. 2002;22(6):789–832. [PubMed]
3. Kohler CG, Turner TH, Bilker WB, et al. Facial emotion recognition in schizophrenia: intensity effects and error pattern. Am J Psychiatry. 2003;160:1768–1774. [PubMed]
4. Edwards J, Pattison PE, Jackson HJ, Wales RJ. Facial affect and affective prosody recognition in first-episode schizophrenia patients. Schizophr Res. 2001;48:235–253. [PubMed]
5. Addington J, Addington D. Neurocognitive and social functioning in schizophrenia: a 2.5 year follow-up study. Schizophr Res. 2000;44(1):47–56. [PubMed]
6. Streit M, Wolwer W, Gaebel W. Facial affect recognition and visual scanning behavior in the course of schizophrenia. Schizophr Res. 1997;24(3):311–317. [PubMed]
7. Salem J-E, Kring A-M, Kerr S-L. More evidence for generalized poor performance in facial emotion perception in schizophrenia. J Abnorm Psychol. 1996;105(3):480–483. [PubMed]
8. Kerr SL, Neale JM. Emotion perception in schizophrenia: specific deficit or further evidence of generalized poor performance? J Abnorm Psychol. 1993;102(2):312–318. [PubMed]
9. Brekke J, Kay DD, Lee KS, Green MF. Biosocial pathways to functional outcome in schizophrenia. Schizophr Res. 2005;80(2–3):213–225. [PubMed]
10. Kee KS, Green MF, Mintz J, Brekke JS. Is emotion processing a predictor of functional outcome in schizophrenia? Schizophr Bull. 2003;29(3):487–497. [PubMed]
11. Kee KS, Horan WP, Wynn JK, Mintz J, Green MF. An analysis of categorical perception of facial emotion in schizophrenia. Schizophr Res. 2006;87:228–237. [PubMed]
12. Tsoi DT, Lee KH, Khokhar WA, et al. Is facial emotion recognition impairment in schizophrenia identical for different emotions? A signal detection analysis. Schizophr Res. 2008;99(1–3):263–269. [PubMed]
13. Schneider F, Gur RC, Koch K, et al. Impairment in the specificity of emotion processing in schizophrenia. Am J Psychiatry. 2006;163(3):442–447. [PubMed]
14. Turetsky BI, Kohler CG, Indersmitten T, Bhati MT, Charbonnier D, Gur RC. Facial emotion recognition in schizophrenia: when and why does it go awry? Schizophr Res. 2007;94(1–3):253–263. [PMC free article] [PubMed]
15. Caharel S, Bernard C, Thibaut F, et al. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia. Schizophr Res. 2007;95(1–3):186–196. [PubMed]
16. Wynn JK, Lee J, Horan WP, Green MF. Using event-related potentials to explore stages of facial affect recognition deficits in schizophrenia. Schizophr Bull. 2008;34(4):679–687. [PMC free article] [PubMed]
17. Kosmidis MH, Bozikas VP, Giannakou M, Anezoulaki D, Fantie BD, Karavatos A. Impaired emotion perception in schizophrenia: a differential deficit. Psychiatry Res. 2007;149(1–3):279–284. [PubMed]
18. Norton D, McBain R, Holt DJ, Ongur D, Chen Y. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia. Biol Psychiatry. 2009;65:1094–1098. [PubMed]
19. Calder AJ, Young AW, Keane J, Dean M. Configural information in facial expression perception. J Exp Psychol Hum Percept Perform. 2000;26:527–551. [PubMed]
20. Goffaux V, Hault B, Michel C, Vuong QC, Rossion B. The respective role of low and high spatial frequency in supporting configural and featural processing of faces. Perception. 2005;34(1):77–86. [PubMed]
21. Smith ML, Cottrell GW, Gosselin F, Schyns P-G. Transmitting and decoding facial expression. Psychol Sci. 2005;16(3):184–189. [PubMed]
22. Gosselin F, Schyns P-G. Bubbles: a technique to reveal the use of information in recognition tasks. Vision Res. 2001;41:2261–2271. [PubMed]
23. Oliva A, Schyns P-G. Coarse blobs, or fine scale edges? Evidence that information diagnosticity changes the perception of complex visual stimuli. Cogn Psychol. 1997;34:72–107. [PubMed]
24. First MB, Spitzer RL, Gibbon M, Williams JBW. The Structured Clinical Interview for DSM-IV Axis I Disorders—Patient Edition. New York, NY: Biometrics Research; 1997.
25. First MB, Spitzer RL, Gibbon M, Williams JBW. Structured Clinical Interview for DSM-IV Axis I Disorders—Patient Edition. New York, NY: Biometrics Research Department, New York State Psychiatric Institute; 1997.
26. First MB, Gibbon M, Spitzer RL, Williams JBW, Benjamin L. Structured Clinical Interview for DSM-IV Axis II Personality Disorders. New York, NY: Biometrics Research Department, New York State Psychiatric Institute; 1996.
27. Ventura J, Green MF, Shaner A, Liberman RP. Training and quality assurance with the Brief Psychiatric Rating Scale: “the drift busters.” Int J Methods Psychiatr Res. 1993;3:221–224.
28. Ventura J, Lukoff D, Nuechterlein KH, Liberman RP, Green MF, Shaner A. Brief Psychiatric Rating Scale (BPRS) expanded version: scales, anchor points, and administration manual. Int J Methods Psychiatr Res. 1993;3:227–243.
29. Andreasen NC. Negative symptoms in schizophrenia. Definition and reliability. Arch Gen Psychiatry. 1982;39(7):784–788. [PubMed]
30. Brainard DH. The Psychophysics Toolbox. Spat Vis. 1997;10:433–436. [PubMed]
31. Simoncelli EP. Image and Multi-Scale Pyramid Tools [computer program]. New York, NY: Author; 1999.
32. Simon D, Craig KD, Gosselin F, Belin P, Rainville P. Recognition and discrimination of prototypical dynamic expressions of pain and emotions. Pain. 2008;135(1–2):55–64. [PubMed]
33. Schyns P-G, Bonnar L, Gosselin F. Show me the features! Understanding recognition from the use of visual information. Psychol Sci. 2002;13:402–409. [PubMed]
34. Watson AB, Pelli DG. QUEST: a Bayesian adaptive psychometric method. Percept Psychophys. 1983;33:113–120. [PubMed]
35. Chauvin A, Worsley KJ, Schyns P-G, Arguin M, Gosselin F. Accurate statistical tests for smooth classification images. J Vis. 2005;5:659–667. [PubMed]
36. Silverstein SM, All SD, Kasi R, et al. Increased fusiform area activation in schizophrenia during processing of spatial frequency-degraded faces, as revealed by fMRI. Psychol Med. 2009:1–11. [PubMed]
37. Shin YW, Na MH, Ha TH, Kang DH, Yoo SY, Kwon JS. Dysfunction in configural face processing in patients with schizophrenia. Schizophr Bull. 2008;34(3):538–543. [PMC free article] [PubMed]
38. Joshua N, Rossell S. Configural face processing in schizophrenia. Schizophr Res. 2009;112(1–3):99–103. [PubMed]
39. Morris JS, deBonis M, Dolan RJ. Human amygdala responses to fearful eyes. Neuroimage. 2002;17:214–222. [PubMed]
40. Adams RB, Gordon HL, Baird AA, Ambady N, Kleck RE. Effects of gaze on amygdala sensitivity to anger and fear faces. Science. 2003;300:1536. [PubMed]
41. Adolphs R, Gosselin F, Buchanan TW, Tranel D, Schyns P-G, Damasio AR. A mechanism for impaired fear recognition after amygdala damage. Nature. 2005;433:68–72. [PubMed]
42. Gur RE, Longhead J, Kohler CG, et al. Limbic activation associated with misidentification of fearful faces and flat affect in schizophrenia. Arch Gen Psychiatry. 2007;64(12):1356–1366. [PubMed]
43. Rosetti R, Mattay VS, Wiedholz LM, et al. Evidence that altered amygdala activity in schizophrenia is related to clinical state and not genetic risk. Am J Psychiatry. 2009;166(2):216–225. [PMC free article] [PubMed]
44. Spezio ML, Adolphs R, Hurley RS, Piven J. Abnormal use of facial information in high-functioning autism. J Autism Dev Disord. 2007;37(5):929–939. [PubMed]
45. Adolphs R, Spezio ML, Parlier M, Piven J. Distinct face-processing strategies in parents of autistic children. Curr Biol. 2008;18(14):1090–1093. [PMC free article] [PubMed]
46. Loughland CM, Williams LM, Gordon E. Visual scanpaths to positive and negative facial emotions in an outpatient schizophrenia sample. Schizophr Res. 2002;55(1–2):159–170. [PubMed]
47. Doniger GM, Foxe JJ, Murray MM, Higgins BA, Javitt DC. Impaired visual object recognition and dorsal/ventral stream interaction in schizophrenia. Arch Gen Psychiatry. 2002;59(11):1011–1020. [PubMed]