Patients and controls both showed the expected waveforms for each of the ERPs during the 3 identification tasks. Furthermore, the controls’ waveforms were all significantly greater than zero, showing that valid waveforms were obtained. Both groups demonstrated initial P100 responses during all 3 identification tasks, with the largest P100 seen during the emotion identification task. The expected N170 response was also seen in both groups while viewing faces, with significantly larger N170s found during the emotion and gender identification tasks compared with the building identification task. Finally, both groups showed an N250 response that was largest during the emotion identification task, smallest during the building identification task, and intermediate during the gender identification task. Thus, the paradigms elicited valid data in both healthy controls and patients.
The results of the present study revealed that schizophrenia patients differed from the comparison group in the N250, an ERP component that is sensitive to the decoding of facial affect features, but exhibited normal P100 and N170 that reflect basic encoding and visual processing of facial features, respectively. These findings help clarify the temporal course of facial affect processing deficits in schizophrenia.
The finding of no difference between groups in the P100 component during face processing is consistent with the majority of studies of this waveform in schizophrenia.24–26
While previous studies of visual processing using nonface (see eg, Schechter et al,19 Haenschel et al,21 Yeap et al,22 and Butler et al23) as well as face (see eg, Caharel et al27 and Campanella et al28) stimuli have found P100 deficits in schizophrenia patients, the lack of a P100 deficit using facial stimuli in the current study implies that any deficits in facial affect recognition in schizophrenia patients occur at a later stage of visual processing. The finding of an increased P100 during emotion identification was not expected. However, Turetsky et al26 reported a similar effect of emotion identification on the P100. It is not entirely clear why the P100 increased during emotion identification. It is possible that attentional factors could have influenced the P100,44,45 in particular on the more difficult emotion identification task. It is also possible that differences in stimulus properties between the faces and the buildings may have affected the P100, which is very sensitive to properties such as contrast, spatial frequency, and luminance.46,47 While this may account for the differences in the P100 elicited by the faces and the buildings, it cannot account for the differences between the P100 elicited by the gender identification task and the emotion identification task because the stimuli were exactly the same. Further studies will be necessary to determine the effects of emotion identification on early visual processing ERP components.
N170 deficits in schizophrenia patients during processing of neutral or emotional faces have been previously reported in some studies,24–26 but we did not find this deficit in the current study, similar to Streit et al.38 The lack of an N170 deficit in our schizophrenia patients could be due to our use of different stimulus parameters or analysis methods compared with the studies that have found N170 deficits. For example, some studies have examined the N170 only to neutral faces (see eg, Herrmann et al24), while others have examined the N170 to specific emotions, such as happy or sad (see eg, Turetsky et al26). In addition, other studies finding an N170 deficit in schizophrenia patients used familiar vs unfamiliar faces (eg, Caharel et al27) or an oddball-type task, in which subjects had to detect a deviant face (based on either identity or emotion) presented among a series of standard neutral faces (see eg, Campanella et al48). Different analysis methods have also been used. For example, Turetsky et al26 examined mean global field power, whereas we examined mean amplitude. We also utilized different latency windows to examine our waveforms compared with studies that have found P100 or N170 deficits (see eg, Caharel et al27 and Campanella et al48). Despite these methodological and analytical differences, the facial affect processing disturbances that patients in the current study demonstrated are not easily attributable to deficits in the basic visual processing or facial feature encoding stages. Future studies will need to determine which stimulus parameters (eg, stimulus length, emotional vs neutral faces, specific emotions used) affect N170 responses in schizophrenia.
The main finding of the current study is a deficit in the N250 waveform in schizophrenia patients during all 3 identification tasks. Moreover, the N250 was larger in both groups during the emotion identification task than during the other 2 identification tasks, though the patients' amplitudes remained lower than the controls'. While the N250 deficit in schizophrenia patients is not specific to emotion identification, it is important to note that the N250 wave is sensitive to emotion identification in schizophrenia patients, implying that it might be a useful biomarker for treatment studies.
We also found that patients had a prolonged P100 latency during all 3 identification tasks, but comparable latencies for the N170 and N250 ERPs. The longer P100 latency in the patients may reflect a general inefficiency in the earliest stages of visual information processing. Delays at these earliest stages could, in principle, influence later stages of face or facial affect processing, but this seems unlikely here: P100 amplitudes and the latency and amplitude of the subsequent N170 were comparable between groups. Thus, although the patients may have been slower in the earliest stages of visual information processing, they produced neural activity comparable to that of the controls during early visual processing and during face and facial affect encoding.
Contrary to numerous other studies, we did not find a performance deficit in visual affect recognition in schizophrenia patients. We were surprised by the lack of group differences, although other null findings have been previously noted.49,50
The absence of performance differences may reflect some unique methodological features of the paradigm used in this study. Subjects were initially exposed to the face stimuli during the gender identification task before completing the emotion identification task. Furthermore, after all 3 conditions, the entire procedure was repeated using the same stimuli. Thus, unlike standard facial affect perception tasks, the paradigm provided repeated exposure to the face stimuli. This increased familiarity with the test stimuli could result in enhanced performance levels. Although unexpected, the absence of performance deficits in the patients on this task has the interpretive advantage that the group difference in the N250 cannot be attributed to poorer performance in the patient group.
This study has a few limitations. First, as mentioned earlier, we did not use emotionally neutral faces to assess ERP responses. Second, it is possible that our use of the same faces for both the gender identification task and the emotion identification task may have affected our ability to detect N170 deficits in schizophrenia patients. Third, all patients were medicated at the time of assessment and had relatively chronic illness. While it is possible that a medication effect may have reduced patients' N250 response, this explanation would be difficult to reconcile with the patients' normal P100 and N170 waves. Future research on ERP responses to faces in unmedicated and recent-onset patients would help to address limitations associated with medication and chronicity. Finally, it will be necessary to determine whether the ERP components are affected by changes in clinical state or whether N170 and/or N250 deficits are stable trait markers.
In conclusion, the current study furthers our understanding of social cognition in general, and facial affect processing in particular, in schizophrenia. The results imply that schizophrenia patients’ deficits in processing facial affect occur at a later stage of processing beyond basic visual and facial feature processing. These results point to the importance of studying the time course of processing of facial affect in schizophrenia. They also imply that ERP components are possible biomarkers of underlying facial affect identification deficits and may be more sensitive to medication-induced changes than performance measures. Identifying neural substrates associated with the social cognitive deficits of schizophrenia can facilitate the development of new treatments that enhance social functioning.