Schizophr Bull. 2008 July; 34(4): 679–687.
Published online 2008 May 21. doi: 10.1093/schbul/sbn047
PMCID: PMC2632462

Using Event Related Potentials to Explore Stages of Facial Affect Recognition Deficits in Schizophrenia


Schizophrenia patients show impairments in identifying facial affect; however, it is not known at what stage facial affect processing is impaired. We evaluated 3 event-related potentials (ERPs) to explore stages of facial affect processing in schizophrenia patients. Twenty-six schizophrenia patients and 27 normal controls participated. In separate blocks, subjects identified the gender of a face, the emotion of a face, or if a building had 1 or 2 stories. Three ERPs were examined: (1) P100 to examine basic visual processing, (2) N170 to examine facial feature encoding, and (3) N250 to examine affect decoding. Behavioral performance on each task was also measured. Results showed that schizophrenia patients’ P100 was comparable to that of the controls during all 3 identification tasks. Both patients and controls exhibited a comparable N170 that was largest during processing of faces and smallest during processing of buildings. For both groups, the N250 was largest during the emotion identification task and smallest for the building identification task. However, the patients produced a smaller N250 compared with the controls across the 3 tasks. The groups did not differ in behavioral performance in any of the 3 identification tasks. The pattern of intact P100 and N170 suggests that patients maintain basic visual processing and facial feature encoding abilities. The abnormal N250 suggests that schizophrenia patients are less efficient at decoding facial affect features. Our results imply that abnormalities in the later stage of feature decoding could potentially underlie emotion identification deficits in schizophrenia.

Keywords: face processing, emotion identification, ERP


One of the defining characteristics of schizophrenia is poor social functioning.1 Considerable efforts have been made to further understand the determinants of social functioning in schizophrenia, with many studies finding that poor neurocognition in schizophrenia is related to poor functional outcome.2–4 More recent efforts have been made to understand the role social cognition plays in functional outcome in schizophrenia because recent findings have shown that deficits in social cognition play a mediating role between neurocognition and social functioning.5–8 Social cognition, broadly defined, is the ability to construct representations of oneself, others, and relationships between oneself and others.9 It is well known that schizophrenia patients exhibit deficits in various domains of social cognition, including social cue perception,10–12 theory of mind,13–15 and facial affect perception.16–18 However, studies are only starting to explore the neural bases of social cognition deficits in schizophrenia. The current study examines one particular facet of social cognition, face and facial affect processing, in schizophrenia patients using event-related potentials (ERPs).

Processing of faces and facial affect can be roughly broken into 3 stages: initial visual processing, encoding of facial features, and decoding of emotional content. Three ERP components, the P100, N170, and N250, are indications of these 3 processing stages. The P100 is a positive waveform peaking approximately 80–120 ms post-stimulus in occipital sites and is one of the earliest components identified with basic visual processing. P100 deficits in schizophrenia patients have been observed using a number of nonface stimuli.19–23 The literature is mixed with regards to P100 deficits during processing of faces or facial affect, with most studies not seeing a deficit24–26 but some noting a deficit.27,28 While the results of P100 during face processing in schizophrenia are not conclusive, the majority of these studies imply that face or facial affect processing deficits in schizophrenia occur after the earliest stages of visual information processing.

The N170 is a negative waveform peaking approximately 150–180 ms post-stimulus and is observed at occipito-temporal sites. The N170 component is commonly thought to represent the earliest stage of facial structure encoding.29,30 Source localization of the N170 has shown that its neural generator is located in the fusiform gyrus31,32 (but see Itier and Taylor33 and Joyce and Rossion34), a region thought to be selectively activated by viewing of faces.35 Recent studies have found that schizophrenia patients exhibit reduced N170 amplitude during face and facial affect processing24–26 (but see Streit et al36). These studies suggest that schizophrenia patients’ deficits in emotion decoding are due to deficits in structural encoding of facial features.

An affect-related negative ERP peaking at approximately 250 ms in fronto-central sites (termed the N250) has been identified as being sensitive to the emotional content of a face.36,37 The literature is mixed with regards to the N250 response in schizophrenia patients. One study found normal N170 but reduced N250 responses in schizophrenia patients,38 suggesting that deficits in facial affect recognition are not due to abnormalities in facial feature encoding, but rather, they are due to specific deficits in decoding of emotional information. Other studies of facial affect processing have found abnormal N170 responses but normal N250 responses in schizophrenia patients,25,26 implying that facial feature encoding is abnormal but affect decoding is unaffected.

While there is a growing interest in using ERPs to examine facial affect processing in schizophrenia, only one study26 has fully examined the temporal course of face or facial affect processing in the same sample. However, that study used a very brief presentation of stimuli (100 ms), a design choice that may have made face processing much more difficult for patients. The current study will examine 3 stages of face and facial affect processing: basic visual processing of faces (using the P100), facial feature encoding (using the N170), and decoding of facial features used to identify an emotion (using the N250). By exploring these 3 stages of face and facial affect processing, we expect to gain a better understanding of when, in the course of processing, facial affect identification deficits occur in schizophrenia patients. Based on the literature, we do not expect to see differences between patients and controls in the basic visual processing stage (P100). We predict that patients will exhibit deficits in the facial encoding stage (N170) that are associated with deficits in the affect decoding stage (N250).



Twenty-six (5 females) patients with schizophrenia and 27 (3 females) normal control subjects participated in the study. All subjects were participating in a larger study of early visual processing (Early Visual Processing in Schizophrenia; P. I.: Michael F. Green). Schizophrenia patients were recruited from outpatient treatment clinics at the Veterans Affairs (VA) Greater Los Angeles Healthcare System and through presentations at facilities in the community. Patients between the ages of 18 and 60 were recruited and needed to meet criteria for schizophrenia based on the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID39). Exclusion criteria for patients included substance abuse or dependence in the last 6 months, mental retardation, a history of loss of consciousness for more than 1 hour, an identifiable neurological disorder, or insufficient fluency in English. Twenty-two patients were receiving atypical antipsychotic medication, 2 patients were receiving typical antipsychotic medication, and 2 were receiving both types of medication.

Normal control participants between the ages of 25 and 55 were recruited through flyers posted in the local community and advertisements in local newspapers and on websites. An initial screening interview excluded potential normal controls who had any identifiable neurological disorder or head injury, had a first-degree relative with schizophrenia or other psychotic disorder, were not sufficiently fluent in English, had a history of schizophrenia or other psychotic disorder, bipolar disorder, or recurrent depression, or had a history of substance dependence or of any substance abuse in the last 6 months. Potential normal control participants were interviewed with the SCID and portions of the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II40). Potential normal controls were excluded if they had any of the following Axis II disorders: avoidant, borderline, paranoid, schizoid, or schizotypal.

All SCID interviewers were trained through the Treatment Unit of the Department of Veterans Affairs VISN 22 Mental Illness Research, Education, and Clinical Center (MIRECC) to a minimum kappa of 0.75 for key psychotic and mood items. All participants had the capacity to give informed consent and provided written informed consent after all procedures were fully explained in accordance with procedures approved by the Institutional Review Boards at UCLA and the VA Greater Los Angeles Healthcare System.

Table 1 lists the demographics of the patient and control groups as well as symptom ratings on the BPRS41 and SANS global scores for patients. Because most of our patient participants were recruited from VA clinics, the sample is predominantly male. Patients were clinically stable and exhibited mild clinical symptoms.

Table 1.
Group Demographics and Symptom Ratings for Schizophrenia Patients


Participants performed 3 different classification tasks while viewing black and white pictures of either faces or buildings. In separate blocks participants were asked to identify the gender of a face, the emotion of a face, or whether a building was 1 or 2 stories.

Pictures of faces were taken from Ekman and Friesen's42 Pictures of Facial Affect Set. There were 18 female and 18 male faces depicting 1 of 6 different emotions: afraid (7 pictures), angry (6 pictures), ashamed (4 pictures), happy (11 pictures), sad (4 pictures), and surprised (4 pictures). These stimuli were intact, full presentations of the face and hair and were not modified in any way. Eighteen photographs of 1-story and 18 pictures of 2-story houses were downloaded from an internet website. Face and building pictures were black and white photographs, and all were resized to the same dimension (20 × 25 cm). The face and building pictures, however, were not normalized on other visual properties such as contrast or brightness levels. All stimuli were presented on a cathode-ray tube monitor placed 1 m in front of the subject.

Subjects received 6 blocks of 36 pictures each, in a fixed block order (gender identification, emotion identification, building identification) with the order presented twice, for a total of 72 pictures presented for each identification task. Pictures within each block were presented in a random order for each subject. Each trial consisted of a fixation cross presented for 400 ms, a blank screen for 500 ms, a picture for 500 ms, and a wait period of 1000 ms. At the end of each trial, a screen appeared prompting the subject to make their choice. A list of choices for each respective task was presented (eg, male or female? 1 or 2 stories?). The subject said their response aloud, and it was entered by the tester, at which point the next trial began. These behavioral data were collected simultaneously with the electroencephalographic (EEG) recordings.

EEG Recording

Participants had their EEG activity recorded during viewing of all stimuli. Stimulus presentation and data synchronization with the EEG were accomplished using E-Prime (Psychology Software Tools, Inc., Pittsburgh, Pennsylvania). EEG activity was collected using a Neuroscan NuAmps amplifier (Compumedics USA, El Paso, Texas) continuously throughout the session. Data were sampled at 1000 Hz with filter settings of 0.5–100 Hz. Thirty-two cap-mounted, sintered Ag-AgCl electrodes (Falk Minow Services, Germany) were positioned using a modified international 10–20 system placement scheme. Additionally, 4 electrodes were used to measure horizontal electrooculogram (EOG; placed on the outer canthus of the left and right eye) and vertical EOG (placed above and below the left eye). All electrodes were referenced to the nose and a forehead ground was employed.

All data were processed offline using Neuroscan Scan 4.3 software. Eyeblinks were removed from the data using established mathematical procedures.43 Data were band-pass filtered at 1–30 Hz and then epoched to 100-ms pre-stimulus and 500-ms post-stimulus. Baseline correction on the 100 ms prior to stimulus presentation was applied. Artifact rejection was performed for any trial that exceeded ±50 μV at electrode sites F7, F8, FP1, FP2, F3, F4, and Fz. Out of 216 total trials, an average of 192 trials were accepted for schizophrenia patients and 200 for normal controls, a statistically nonsignificant difference between the groups.
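The offline processing steps described above (1–30 Hz band-pass filtering, epoching from 100 ms pre-stimulus to 500 ms post-stimulus, baseline correction, and ±50 μV artifact rejection) can be sketched in Python. This is an illustrative reconstruction with assumed function and parameter names, not the Neuroscan Scan 4.3 procedure actually used; note that the study applied the rejection criterion only at frontal electrode sites, so only those channels would be passed to the rejection step.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_epochs(eeg, events, fs=1000, lo=1.0, hi=30.0,
                      pre_ms=100, post_ms=500, reject_uv=50.0):
    """Band-pass filter, epoch around stimulus onsets, baseline-correct,
    and reject epochs exceeding the amplitude criterion.

    eeg: (n_channels, n_samples) array in microvolts.
    events: sample indices of stimulus onsets.
    Returns accepted epochs, shape (n_kept, n_channels, n_times).
    """
    # Zero-phase 1-30 Hz band-pass (4th-order Butterworth; filter order assumed)
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)

    pre = int(pre_ms * fs / 1000)
    post = int(post_ms * fs / 1000)
    kept = []
    for onset in events:
        ep = filtered[:, onset - pre: onset + post]
        # Baseline correction on the 100 ms preceding stimulus onset
        ep = ep - ep[:, :pre].mean(axis=1, keepdims=True)
        # Reject any trial exceeding +/-50 microvolts on the supplied channels
        if np.abs(ep).max() <= reject_uv:
            kept.append(ep)
    return np.array(kept)
```

A trial contaminated by a large-amplitude artifact is dropped, while clean trials survive with their pre-stimulus baseline centered on zero.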

Data Analysis

ERP waveforms were created by averaging all accepted trials separately for each condition (gender identification, emotion identification, and building identification). We examined 3 separate ERP components: P100, N170, and N250. Mean amplitudes for each ERP component were calculated: mean activity in the time range of 100–130 ms was calculated for the P100, mean activity in the time range of 140–220 ms was calculated for the N170, and mean activity in the time range of 215–300 ms was calculated for the N250. Electrodes were chosen based on maximal activity seen by inspection of the topographical maps. These electrodes closely matched those examined in other studies (eg, P100,24 N170,24,26 N25026). For the P100, activity from O1 (left hemisphere) and O2 (right hemisphere) was analyzed. For the N170, the mean of activity in electrodes P7 and PO9 (left hemisphere) and P8 and PO10 (right hemisphere) was analyzed. For the N250, the mean of activity in electrodes C3, F3, and FC1 (left hemisphere) and C4, F4, and FC2 (right hemisphere) was analyzed. Peak latency (the latency at the largest amplitude in the latency ranges for each component described above) for each component was analyzed separately. For each component, a 2 (group: patients vs controls) × 3 (task: emotion identification, gender identification, building identification) × 2 (hemisphere: left vs right) analysis of variance (ANOVA) was run. Additionally, performance (number correctly identified out of 72 trials) on the identification tasks was analyzed.
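The component measurements described above reduce to two operations per averaged waveform: a mean amplitude over a fixed window and a peak latency within that same window. The sketch below illustrates this under assumptions (function name, 1 kHz sampling, and the use of the largest absolute deflection as the peak for both positive and negative components); it is not the authors' analysis code.

```python
import numpy as np

# Component windows in ms, as specified in the Data Analysis section
WINDOWS = {"P100": (100, 130), "N170": (140, 220), "N250": (215, 300)}

def component_measures(erp, times_ms, component):
    """Mean amplitude and peak latency of one ERP component.

    erp: 1-D averaged waveform (one electrode, or a mean over the
         analysis electrodes), in microvolts.
    times_ms: matching time axis in ms (e.g., -100 .. 499 at 1 kHz).
    Returns (mean_amplitude, peak_latency_ms), where peak latency is the
    latency of the largest deflection within the component's window.
    """
    lo, hi = WINDOWS[component]
    mask = (times_ms >= lo) & (times_ms <= hi)
    segment = erp[mask]
    mean_amp = segment.mean()
    peak_idx = np.argmax(np.abs(segment))  # largest absolute deflection
    return mean_amp, int(times_ms[mask][peak_idx])
```

For the N170, for example, `erp` would be the average of P7 and PO9 (left) or P8 and PO10 (right) before calling `component_measures(erp, times_ms, "N170")`.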

All mean values are presented as mean (SD). All statistical analyses used an a priori 2-tailed significance level of 0.05 to determine significant results.



Table 1 shows the demographic data for both groups and symptom ratings for the schizophrenia patients. As can be seen, the groups were fairly evenly matched on gender distribution, age, and education, with no statistical difference between groups on any demographic measure. As a group, the schizophrenia patients had relatively mild symptom levels at the time of testing.

Performance Data

Performance data were analyzed with a 2 (group: patients vs controls) × 3 (task: emotion identification, gender identification, building identification) ANOVA. Results revealed a main effect of task, F(2, 102) = 235.7, P < .001, but no significant group main effect, F(1, 51) = 1.0, P < .40, or group × task interaction, F(2, 102) = 1.2, P < .40. For both groups, Bonferroni-corrected multiple comparisons showed that all the comparisons were significant: performance was highest on the gender identification task, lowest on the emotion identification task, and intermediate on the building identification task. Mean (SE) correct responses out of 72 trials for the building identification, gender identification, and emotion identification tasks can be seen in figure 1.
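Bonferroni-corrected follow-up comparisons like those reported throughout the Results can be illustrated as paired t-tests with the p-values multiplied by the number of comparisons. This is a generic sketch with assumed names, not the statistical software actually used in the study.

```python
from itertools import combinations
from scipy.stats import ttest_rel

def bonferroni_pairwise(scores):
    """Pairwise within-subject comparisons with Bonferroni correction.

    scores: dict mapping condition name -> per-subject score array
            (same subjects, same order, in every condition).
    Returns {(cond_a, cond_b): corrected_p}; each p-value is multiplied
    by the number of comparisons and capped at 1.0.
    """
    pairs = list(combinations(sorted(scores), 2))
    corrected = {}
    for a, b in pairs:
        t, p = ttest_rel(scores[a], scores[b])
        corrected[(a, b)] = min(p * len(pairs), 1.0)
    return corrected
```

With 3 tasks there are 3 comparisons, so each raw p-value is tripled before being checked against the .05 criterion.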

Fig. 1.
Behavioral Performance Data on the Emotion Identification, Gender Identification, and Building Identification Tasks.

Event-Related Potentials

Mean Amplitude Analysis.

Figure 2 shows example waveforms for each task for the patient and control groups. For display purposes, the waveforms shown are of linear derivations (ie, mean) of all the electrodes used in the analysis of each waveform. The P100, N170, and N250 waveforms are clearly visible in their expected regions (denoted with arrows). The means and SEs for each component and task are displayed in figure 3 (data are collapsed across hemisphere to simplify display).

Fig. 2.
Examples of the P100, N170, and N250 Waveforms for Each Group and Each Task. Green line = emotion identification task, blue line = gender identification task, red line = building identification task. Panel A: P100 waveform. Panel B: N170 waveform. Panel ...
Fig. 3.
Average Amplitude of Each Task for Each Group. Data for the P100 response are seen on the top, data for the N170 response are seen in the middle, and data for the N250 response are seen on the bottom.


Results from the ANOVA on the P100 data revealed a main effect of task, F(2, 102) = 4.7, P < .02. The main effects of group and hemisphere did not approach statistical significance (P’s < .32 and .40, respectively). None of the interactions approached statistical significance. Bonferroni-corrected multiple comparisons showed that P100 activity during the emotion identification task was significantly larger compared with activity during the building identification task, P < .01, but there were no other significant differences.


Results from the ANOVA on the N170 data revealed a significant main effect of task, F(2, 102) = 74.9, P < .01, and a significant main effect of hemisphere, F(1, 51) = 7.3, P < .01. The main effect of group did not approach statistical significance (P < 0.13). None of the interactions approached statistical significance. N170 activity was greater in the right hemisphere compared with the left hemisphere. Bonferroni-corrected multiple comparisons showed that N170 activity during the emotion and gender identification tasks was significantly larger compared with activity during the building identification task, P < .001.


Analysis of N250 activity showed significant main effects for group, F(1, 51) = 9.2, P < .01, and task, F(2, 102) = 9.1, but no other main effects or interactions approached significance. N250 was significantly greater for the controls compared with the patients across all 3 conditions. Bonferroni-corrected multiple comparisons showed that N250 activity during the emotion identification task was significantly larger compared with the gender and building identification tasks (P’s < .05), and activity during the gender identification task was significantly larger compared with the building identification task (P < .05).

Peak Latency Analysis.

The means and SDs for peak latency for each component and task are displayed in table 2 (data are collapsed across hemisphere to simplify display).

Table 2.
Mean (SD) Peak Latencies for Patients and Controls on the P100, N170, and N250 Waveforms


Analysis of P100 peak latency revealed a main effect of task, F(2, 102) = 18.6, P < .001, a main effect of group, F(1, 51) = 7.0, P < .02, and a task × hemisphere interaction, F(2, 102) = 3.5, P < .05. Bonferroni-corrected multiple comparisons showed that the emotion identification and gender identification tasks had longer latencies compared with the building identification task. The group main effect revealed that overall patients showed longer P100 latencies compared with the normal controls. The task × hemisphere interaction was due to shorter latencies in the left hemisphere during the emotion identification task but shorter latencies in the right hemisphere during the building identification task.


Analysis of N170 peak latency revealed a main effect of task, F(2, 102) = 17.7, P < .001. No other main effects or interactions approached significance. Bonferroni-corrected multiple comparisons showed that latencies in the emotion identification task were significantly faster compared with the gender and building identification tasks.


Analysis of N250 peak latency revealed no main effects or interactions.


Patients and controls both showed the expected waveforms for each of the ERPs during the 3 identification tasks. Furthermore, the controls’ waveforms were all significantly greater than zero, showing that valid waveforms were obtained. Both groups demonstrated initial P100 responses during all 3 identification tasks, with the largest P100 seen during the emotion identification task. The expected N170 response was also seen in both groups while viewing faces, with significantly larger N170s found during the emotion and gender identification tasks compared with the building identification task. Finally, both groups showed an N250 response that was largest during the emotion identification task, smallest during the building identification task, and intermediate during the gender identification task. Thus, the paradigms elicited valid data in both healthy controls and patients.

The results of the present study revealed that schizophrenia patients differed from the comparison group in the N250, an ERP component that is sensitive to the decoding of facial affect features, but exhibited normal P100 and N170 that reflect basic encoding and visual processing of facial features, respectively. These findings help clarify the temporal course of facial affect processing deficits in schizophrenia.

The finding of no difference between groups in the P100 component during face processing is consistent with the majority of studies of this waveform in schizophrenia.24–26 While previous studies of visual processing using nonface (see eg, Schechter et al,19 Haenschel et al,21 Yeap et al,22 and Butler et al23) as well as face (see eg, Caharel et al27 and Campanella et al28) stimuli have found P100 deficits in schizophrenia patients, the lack of a P100 deficit using facial stimuli in the current study implies that any deficits in facial affect recognition in schizophrenia patients occur at a later stage of visual processing. The finding of an increased P100 during emotion identification was not expected. However, Turetsky et al26 reported a similar effect of emotion identification on the P100. It is not entirely clear why the P100 increased during emotion identification. It is possible that attentional factors could have influenced the P100,44,45 in particular on the more difficult emotion identification task. It is also possible that differences in stimulus properties between the face and building stimuli may have affected the P100, which is very sensitive to properties such as contrast, spatial frequency, and luminance.46,47 While this may account for the differences in the P100 elicited by the faces and the buildings, it cannot account for the differences between the P100 elicited by the gender identification task and the emotion identification task as the stimuli were exactly the same. Further studies will be necessary to determine the effects of emotion identification on early visual processing ERP components.

N170 deficits in schizophrenia patients during processing of neutral or emotional faces have been previously reported in some studies,24–26 but we did not find this deficit in the current study, similar to Streit et al.38 The lack of an N170 deficit in our schizophrenia patients could be due to our use of different stimulus parameters or analysis methods compared with the studies that have found N170 deficits. For example, some studies have only examined the N170 to neutral faces (see eg, Herrmann et al24) while others have examined the N170 to specific emotions, such as happy or sad (see eg, Turetsky et al26). In addition, other studies finding an N170 deficit in schizophrenia patients used familiar vs unfamiliar faces (eg, Caharel et al27), or an oddball-type task, where subjects had to detect a deviant face (either based on identity or emotion) presented among a series of standard neutral faces (see eg, Campanella et al48). Different analysis methods have also been used. For example, Turetsky et al26 examined mean global field power whereas we examined mean amplitude. We also utilized different latency windows to examine our waveforms compared with studies that have found P100 or N170 deficits (see eg, Caharel et al27 and Campanella et al48). Despite these methodological and analytical differences, the facial affect processing disturbances that patients in the current study demonstrated are not easily attributable to deficits in the basic visual processing or facial feature encoding stages. Future studies will need to be conducted to determine which stimulus parameters (eg, stimulus length, emotional vs neutral faces, specific emotions used) may affect N170 responses in schizophrenia.

The main finding of the current study is that we found a deficit in the N250 waveform in schizophrenia patients during all 3 identification tasks. Moreover, we found that the N250 was larger in both groups during the emotion identification task compared with the other 2 identification tasks, though the patients’ amplitudes were still lower than the controls’. While the N250 deficit in schizophrenia patients is not specific to emotion identification, it is important to note that the N250 wave is sensitive to emotion identification in schizophrenia patients, implying that it might be a useful biomarker for treatment studies.

We also found that patients had a prolonged latency in the P100 ERP during all 3 identification tasks, but comparable latencies for the N170 and N250 ERPs. The longer latency in the patients on the P100 wave may be reflective of their general inefficiency in the earliest stages of visual information processing. It is possible that delays in the earliest stages of visual information processing influence later stages of face or facial affect processing, though this seems unlikely: the amplitude of the P100 and the latency and amplitude of the subsequent N170 were comparable between groups. While the patients might have been slower in the earliest stages of visual information processing, they produced neural activity comparable to the controls during early visual processing and during face and facial affect encoding.

Contrary to numerous other studies, we did not find a performance deficit in visual affect recognition in schizophrenia patients. We were surprised by the lack of group differences, although other null findings have been previously noted.49,50 The absence of performance differences may reflect some unique methodological features of the paradigm used in this study. Subjects were initially exposed to the face stimuli during the gender identification task before completing the emotion identification task. Furthermore, after all 3 conditions, the entire procedure was repeated using the same stimuli. Thus, unlike standard facial affect perception tasks, the paradigm provided repeated exposure to the face stimuli. This increased familiarity with the test stimuli could result in enhanced performance levels. Although unexpected, the absence of performance deficits in the patients on this task has the interpretive advantage that the group difference in the N250 cannot be attributed to poorer performance in the patient group.

This study has a few limitations. First, as mentioned earlier, we did not use emotionally neutral faces to assess ERP responses. Second, it is possible that our use of the same faces for both the gender identification task and emotion identification task may have affected our ability to detect N170 deficits in schizophrenia patients. Third, all patients were medicated at the time of assessment and had relatively chronic illness. While it is possible that a medication effect may have reduced patients’ N250 response, this explanation would be difficult to reconcile with the patients’ normal P100 and N170 waves. Future research on ERP responses to faces in unmedicated and recent-onset patients would help to address limitations associated with medications and chronicity. Finally, it will be necessary to determine if the ERP components are affected by changes in clinical state or if N170 and/or N250 deficits are stable trait markers.

In conclusion, the current study furthers our understanding of social cognition in general, and facial affect processing in particular, in schizophrenia. The results imply that schizophrenia patients’ deficits in processing facial affect occur at a later stage of processing beyond basic visual and facial feature processing. These results point to the importance of studying the time course of processing of facial affect in schizophrenia. They also imply that ERP components are possible biomarkers of underlying facial affect identification deficits and may be more sensitive to medication-induced changes than performance measures. Identifying neural substrates associated with the social cognitive deficits of schizophrenia can facilitate the development of new treatments that enhance social functioning.


Veterans Integrated Service Network 22 Mental Illness Research, Education and Clinical Center (to Wynn); National Alliance for Research on Schizophrenia and Depression Young Investigator Award (to Wynn); National Institute of Mental Health (MH-43292, MH-65707 to Green).


The authors would like to thank Shelly Crosby and Poorang Nori for their assistance in data collection and Catherine Sugar, PhD, for consultation on data analyses.


1. Bellack AS, Green MF, Cook JA, et al. Assessment of community functioning in people with schizophrenia and other severe mental illnesses: a white paper based on an NIMH-sponsored workshop. Schizophr Bull. 2007;33:805–822. [PMC free article] [PubMed]
2. Green MF. What are the functional consequences of neurocognitive deficits in schizophrenia? Am J Psychiatry. 1996;153:321–330. [PubMed]
3. Green MF, Kern RS, Braff DL, Mintz J. Neurocognitive deficits and functional outcome in schizophrenia: are we measuring the “right stuff”? Schizophr Bull. Special Issue: Psychosocial treatment for schizophrenia. 2000;26:119–136. [PubMed]
4. Green MF, Kern RS, Heaton RK. Longitudinal studies of cognition and functional outcome in schizophrenia: implications for MATRICS. Schizophr Res. 2004;72:41–51. [PubMed]
5. Addington J, Saeedi H, Addington D. Influence of social perception and social knowledge on cognitive and social functioning in early psychosis. Br J Psychiatry. 2006;189:373–378. [PubMed]
6. Brekke JS, Hoe M, Long J, Green MF. How neurocognition and social cognition influence functional change during community-based psychosocial rehabilitation for individuals with schizophrenia. Schizophr Bull. 2007;33:1247–1256. [PMC free article] [PubMed]
7. Sergi MJ, Rassovsky Y, Nuechterlein KH, Green MF. Social perception as a mediator of the influence of early visual processing on functional status in schizophrenia. Am J Psychiatry. 2006;163:356–358. [PubMed]
8. Vauth R, Rusch N, Wirtz M, Corrigan PW. Does social cognition influence the relation between neurocognitive deficits and vocational functioning in schizophrenia? Psychiatry Res. 2004;128:155–165. [PubMed]
9. Adolphs R. The neurobiology of social cognition. Curr Opin Neurobiol. Special Issue: Cognitive neuroscience 2001;11:231–239. [PubMed]
10. Addington J, Addington D. Facial affect recognition and information processing in schizophrenia and bipolar disorder. Schizophr Res. 1998;32:171–181. [PubMed]
11. Hellewell JSE, Connell J, Deakin JFW. Affect judgment and facial recognition memory in schizophrenia. Psychopathology. 1994;27:255–261. [PubMed]
12. Pollard VB, Hellewell JSE, Deakin JFW. Performance of schizophrenic subjects on tests of recognition memory, perception and face processing. Schizophr Res. 1995;15:122.
13. Frith CD, Corcoran R. Exploring “theory of mind” in people with schizophrenia. Psychol Med. 1996;26:521–530. [PubMed]
14. Sprong M, Schothorst P, Vos E, Hox J, van Engeland H. Theory of mind in schizophrenia: meta-analysis. Br J Psychiatry. 2007;191:5–13. [PubMed]
15. Uhlhaas PJ, Phillips WA, Schenkel LS, Silverstein SM. Theory of mind and perceptual context-processing in schizophrenia. Cogn Neuropsychiatry. 2006;11:416–436. [PubMed]
16. Addington J, Saeedi H, Addington D. Facial affect recognition: a mediator between cognitive and social functioning in psychosis? Schizophr Res. 2006;85:142–150. [PubMed]
17. Kohler CG, Bilker W, Hagendoorn M, Gur RE, Gur RC. Emotion recognition deficit in schizophrenia: association with symptomatology and cognition. Biol Psychiatry. 2000;48:127–136. [PubMed]
18. Mandal MK, Pandey R, Prasad AB. Facial expressions of emotions and schizophrenia: a review. Schizophr Bull. 1998;24:399–412. [PubMed]
19. Schechter I, Butler PD, Zemon V, et al. Impairments of early-stage transient visual evoked potentials to magno- and parvocellular-selective stimuli in schizophrenia. Clin Neurophysiol. 2005;116:2204–2215. [PMC free article] [PubMed]
20. Doniger GM, Foxe JJ, Murray MM, Higgins BA, Javitt DC. Impaired visual object recognition and dorsal/ventral stream interaction in schizophrenia. Arch Gen Psychiatry. 2002;59:1011–1020. [PubMed]
21. Haenschel C, Bittner RA, Haertling F, et al. Contribution of impaired early-stage visual processing to working memory dysfunction in adolescents with schizophrenia: a study with event-related potentials and functional magnetic resonance imaging. Arch Gen Psychiatry. 2007;64:1229–1240. [PubMed]
22. Yeap S, Kelly SP, Sehatpour P, et al. Early visual sensory deficits as endophenotypes for schizophrenia: high-density electrical mapping in clinically unaffected first-degree relatives. Arch Gen Psychiatry. 2006;63:1180–1188. [PubMed]
23. Butler PD, Martinez A, Foxe JJ, et al. Subcortical visual dysfunction in schizophrenia drives secondary cortical impairments. Brain. 2007;130:417–430. [PMC free article] [PubMed]
24. Herrmann MJ, Ellgring H, Fallgatter AJ. Early-stage face processing dysfunction in patients with schizophrenia. Am J Psychiatry. 2004;161:915–917. [PubMed]
25. Johnston PJ, Stojanov W, Devir H, Schall U. Functional MRI of facial emotion recognition deficits in schizophrenia and their electrophysiological correlates. Eur J Neurosci. 2005;22:1221–1232. [PubMed]
26. Turetsky BI, Kohler CG, Indersmitten T, et al. Facial emotion recognition in schizophrenia: when and why does it go awry? Schizophr Res. 2007;94:253–263. [PMC free article] [PubMed]
27. Caharel S, Bernard C, Thibaut F, et al. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia. Schizophr Res. 2007;95:186–196. [PubMed]
28. Campanella S, Montedoro C, Streel E, Verbanck P, Rosier V. Early visual components (P100, N170) are disrupted in chronic schizophrenic patients: an event-related potentials study. Neurophysiol Clin. 2006;36:71–78. [PubMed]
29. Bentin S, Allison T, Puce A, Perez E, McCarthy G. Electrophysiological studies of face perception in humans. J Cogn Neurosci. 1996;8:551–565. [PMC free article] [PubMed]
30. Eimer M. The face-specific N170 component reflects late stages in the structural encoding of faces. NeuroReport. 2000;11:2319–2324. [PubMed]
31. Shibata T, Nishijo H, Tamura R, et al. Generators of visual evoked potentials for faces and eyes in the human brain as determined by dipole localization. Brain Topogr. 2002;15:51–63. [PubMed]
32. Herrmann MJ, Ehlis AC, Muehlberger A, Fallgatter AJ. Source localization of early stages of face processing. Brain Topogr. 2005;18:77–85. [PubMed]
33. Itier RJ, Taylor MJ. Source analysis of the N170 to faces and objects. NeuroReport. 2004;15:1261–1265. [PubMed]
34. Joyce C, Rossion B. The face-sensitive N170 and VPP components manifest the same brain processes: the effect of reference electrode site. Clin Neurophysiol. 2005;116:2613–2631. [PubMed]
35. Kanwisher N, McDermott J, Chun MM. The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci. 1997;17:4302–4311. [PubMed]
36. Streit M, Ioannides AA, Sinnemann T, et al. Disturbed facial affect recognition in patients with schizophrenia associated with hypoactivity in distributed brain regions: a magnetoencephalographic study. Am J Psychiatry. 2001;158:1429–1436. [PubMed]
37. Streit M, Ioannides AA, Liu L, et al. Neurophysiological correlates of the recognition of facial expressions of emotion as revealed by magnetoencephalography. Brain Res Cogn Brain Res. 1999;7:481–491. [PubMed]
38. Streit M, Wölwer W, Brinkmeyer J, Ihl R, Gaebel W. EEG-correlates of facial affect recognition and categorisation of blurred faces in schizophrenia patients and healthy volunteers. Schizophr Res. 2001;49:145–155. [PubMed]
39. First MB, Gibbon M, Spitzer RL, Williams JBW. User's Guide for the Structured Clinical Interview for DSM-IV Axis I Disorders—Research Version (SCID-I, Version 2.0, February 1996 Final Version). New York, NY: Biometrics Research, New York State Psychiatric Institute; 1996.
40. First MB, Gibbon M, Spitzer RL, Williams JBW. Structured Clinical Interview for DSM-IV Axis II Personality Disorders. New York, NY: Biometrics Research, New York State Psychiatric Institute; 1996.
41. Overall JE, Gorham DR. The Brief Psychiatric Rating Scale (BPRS): recent developments in ascertainment and scaling. Psychopharmacol Bull. 1988;24:97–99.
42. Ekman P, Friesen WV. Pictures of Facial Affect. Palo Alto, CA: Consulting Psychologists Press; 1976.
43. Semlitsch HV, Anderer P, Schuster P, Presslich O. A solution for reliable and valid reduction of ocular artifacts, applied to the P300 ERP. Psychophysiology. 1986;23:695–703. [PubMed]
44. Di Russo F, Spinelli D. Electrophysiological evidence for an early attentional mechanism in visual processing in humans. Vision Res. 1999;39:2975–2985. [PubMed]
45. Hillyard SA, Teder-Salejarvi WA, Munte TF. Temporal dynamics of early perceptual processing. Curr Opin Neurobiol. 1998;8:202–210. [PubMed]
46. Rebai M, Bernard C, Lannou J, Jouen F. Spatial frequency and right hemisphere: an electrophysiological investigation. Brain Cogn. 1998:21–29. [PubMed]
47. Tobimatsu S, Kurita-Tashima S, Nakayama-Hiromatsu M, Akazawa K, Kato M. Age-related changes in pattern visual evoked potentials: differential effects of luminance, contrast and check size. Electroencephalogr Clin Neurophysiol. 1993;88:12–19. [PubMed]
48. Campanella S, Montedoro C, Streel E, Verbanck P, Rosier V. Early visual components (P100, N170) are disrupted in chronic schizophrenic patients: an event-related potentials study. Clin Neurophysiol. 2006;36:71–78. [PubMed]
49. Bellack AS, Blanchard JJ, Mueser KT. Cue availability and affect perception in schizophrenia. Schizophr Bull. 1996;22:535–544. [PubMed]
50. Vaskinn A, Sundet K, Friis S, et al. The effect of gender on emotion perception in schizophrenia and bipolar disorder. Acta Psychiatr Scand. 2007;116:263–270. [PubMed]

Articles from Schizophrenia Bulletin are provided here courtesy of Oxford University Press