Face perception researchers have long assumed that race and expression information are processed by independent, parallel streams, but this question has rarely been empirically addressed (Bruce & Young, 1986; Haxby et al., 2000). The research that has directly examined the issue demonstrates that race and emotion cues can affect each other (Hugenberg & Bodenhausen, 2003), but these effects have primarily been obtained when one of the cues was ambiguous and therefore difficult to process. Moreover, only explicit judgments have been assessed. To more fully inform our understanding of face perception, we were primarily interested in how these cues are processed when both sources of information are relatively unambiguous. We were also interested in combining ERP and behavioral measures to track the time course of different aspects of social perception.
The pattern of ERP results we obtained is consistent with past research on the perception of race and emotion. At the earliest stages of processing that we assessed, the N100 and P200 effects clearly replicate past research suggesting greater attention to the racial group that is most distinctive and/or negative. In the present sample, which was non-Black and largely White, this resulted in larger N100s and P200s to Blacks than Whites. The larger P200s to angry than neutral expressions, and the tendency for larger P200s to angry than happy expressions, are also consistent with an attentional distinctiveness/threat effect.
At the next stage of processing, reflected in the N200, results were consistent with past research in showing larger N200s to Whites than Blacks and to happy than angry expressions. These effects are consistent with past researchers' suggestions that the N200 reflects individuation and deeper processing when viewing faces, which we would expect to be greater for faces of Whites (because they are members of a culturally powerful group and/or the ingroup) and for faces displaying happy, approachable expressions (Holmes et al., 2003; Ito & Urland, 2003; James et al., 2001; Tanaka et al., 2004; Walker & Tanaka, 2003).
As perception continued, P300 results also replicated past research showing larger P300s to the most motivationally relevant and arousing stimuli (angry expressions). P300 effects were limited to the emotion task, when task demands served to focus attention on this cue.
Finally, in the later explicit categorization stage, latencies to make explicit race and expression judgments both replicated and extended past effects. As in previous research, reaction times were faster to Blacks and to happy expressions when that dimension was task relevant. That is, participants were faster to respond to Black faces in the race task and to happy expressions in the emotion task. We extend these past findings by showing that when explicitly attending to facial expressions, participants were faster to categorize White than Black targets. This shows that participants were sensitive to race information in both tasks but that different processes may govern response latency in the two tasks. Lower contact with Blacks and the viewing of Whites as a cultural racial default (Levin, 2000; Stroessner, 1996) likely make Black faces more discriminable than White ones. The faster responses to Blacks in the race task may indicate that discriminability plays a large role in this task. On the other hand, our participants may have had more experience decoding the emotional expressions of Whites than Blacks. The faster responses to Whites in the emotion task may indicate that familiarity plays a larger role in this task.
It is interesting to note that task effects were more evident later in processing, emerging in the P300 and explicit judgments. In the N100, P200, and N200, race and emotion effects were generally identical in both task conditions. By contrast, in response latencies, the race effects went in opposite directions as a function of task. Task differences were also seen for emotional expression. In both the P300 and response latencies, expression had an effect only when emotion was explicitly task relevant. These results suggest that the N100, P200, and N200 reflect more obligatory aspects of processing, leading to sensitivity to race and emotion information even when not explicitly task relevant. Processes that occur later in processing and are reflected in the P300 and in explicit judgments are likely more thoughtful, and are more influenced by task goals. Thus, these results show that race and emotion processing is both automatic and affected by explicit processes, depending on the point in processing.
Despite clear evidence that race and emotion affected both early perceptual and later explicit processes, there was little evidence that the two types of cues interact to affect perception. Of importance, there were no interactions between race and emotion in the P200, N200, P300, or response latencies for explicit judgments. This supports models of face perception suggesting that these two types of information are typically processed independently and in parallel. As Haxby et al. (2000) noted, face perception involves encoding both relatively invariant, unchangeable aspects of a face, such as eye shape and lip width, and changeable aspects, such as whether the eyes are open wide and the mouth is turned up or down (see also Bruce & Young, 1986). Some aspects of face perception rely more heavily on one type of cue. Recognition of a unique individual, for instance, depends more heavily on the invariant aspects of a face, and needs to be achieved under different configurations of the changeable aspects of the face (e.g., both when the person is smiling and when frowning). Likewise, perception of the changeable aspects of the face, as during expression or speech recognition, needs to be achieved across unique individuals. For these reasons, social category judgments and expression perception have typically been associated with functionally distinct aspects of face perception, capable of being processed in parallel. Our results suggest that under relatively common viewing conditions – when both the race and expression information are easy to perceive – the cues have little effect on each other. This is true both of relatively early aspects of perception, as reflected in the ERP responses, and of more explicit judgments.
This is not to say that the two types of cues cannot interact. Situations of ambiguity in one or both sources of information may be resolved by integrating the two types of information. As Hugenberg and Bodenhausen (2003) have shown, expression information can influence race categorization when there is difficulty in determining an individual's race, and vice versa. Moreover, to the degree that impression formation integrates across different types of information, both from the face and from other sources (e.g., behavioral information), interactions between race and expression seem likely. Consistent with this, Chiu, Ambady, and Deldin (2004) showed that both race and emotion affected whether participants reported wanting to work with different targets. Measures of anticipated cognitive effort were also affected by the interaction between race and expression information.
Somewhat surprisingly, the only evidence of a race by emotion interaction in the present study was the marginal interaction obtained in the N100. Follow-up contrasts showed that N100s were larger to Blacks than Whites for angry and neutral expressions, but did not differ as a function of race for happy expressions. This pattern could reflect a vigilance effect, in which the increased attention generally shown to Black faces is attenuated when the face displays a clearly positive and therefore non-threatening expression. On the other hand, no similar effect was seen in any other ERP component or in response latencies. Moreover, it is surprising that this effect was seen in our earliest measure, and not in any of the subsequent measures. We are therefore cautious in interpreting its implications. Although this may reflect a moderation of vigilance by emotion, it could also be due to physical properties of the stimuli. For instance, angry and neutral expressions both tended to differ from happy expressions because the latter involved an open mouth that revealed teeth. It is possible that the N100 was reflecting physical properties such as these, and not race or emotion perception per se. Considering all the measures together, then, our results support the general conclusion that race and emotion cues are processed independently beginning at least as early as 170 ms (the mean latency of the P200) and continuing through to explicit race and emotion judgments. Although there may be situations where perceivers are forced to draw upon other cues to inform their decision, when cues are clearly interpretable, processing appears to occur quickly and in parallel.
A second goal of this study was to examine how attention relates to later categorization. The regression analyses allow us to track how these early perceptual processes relate to explicit categorization decisions. These analyses revealed that sensitivity to race at earlier points in perception affects response latencies in making race judgments, and that sensitivity to expression at earlier points in perception affects response latencies in making emotion judgments. The relation between attentional differences as a function of race and response latencies in the race task is consistent with suggestions that outgroup faces are processed in terms of race-specifying features at the expense of more individuated processing (Levin, 2000). We found that the degree to which N200s were larger to Whites than Blacks predicted faster responses to Black than White faces in the race categorization task. N200 amplitude has been associated with depth of processing in face perception, so this effect suggests that differences in an ERP component associated with variations in depth of processing are associated with variations in an explicit measure sensitive to the salience of race-specifying information. Note that these effects were limited to the race categorization task. No comparable relation was found when participants were explicitly attending to expression.
Considering the emotion effects obtained in both tasks, larger amplitude differences in favor of angry expressions were associated with slower responses to happy relative to angry expressions. As noted previously, this may reflect a trade-off between initial attentional engagement and response latency. That is, initially directing more attention to one class of stimuli (i.e., angry expressions) may slow subsequent responses to other stimuli (i.e., happy expressions) due to difficulties in attentional disengagement from threatening faces (Koster et al., 2004; Fox et al., 2001).
Looking across the regression analyses, we see that effects were obtained with three different ERP components (N100, P200, N200). The N100 and P200 often show similar effects of race and emotion, and both components have been associated with vigilance and sensitivity to novelty. The N200, as noted, has been associated with depth of processing/individuation. The fact that N100 and P200 effects predicted differences in response latency as a function of emotional expression whereas the N200 predicted differences in response latency as a function of target race may reflect the nature of processing most important for each stimulus dimension. That is, differences in processes related to vigilance and novelty may be more integral in differentiating responses to faces showing different expressions, whereas differences in processes related to individuation may contribute more to variance in responses to faces of different races. As for why the emotion effects were seen in the N100 in one analysis and in the P200 in the other, this could simply reflect conceptually similar results that manifested at slightly different points in time. This could relate to something systematic (i.e., the effects occur earlier in one task than another), or simply to some feature of this particular sample. Differentiating between the two explanations would require examining the same issue in a different sample.
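To illustrate the logic of these regression analyses, the following is a minimal sketch in Python, using entirely synthetic data; the variable names, sample size, and effect magnitudes are hypothetical and are not taken from the present study. Each participant contributes a difference score on the ERP side (e.g., N200 amplitude to White minus Black faces) and on the behavioral side (e.g., response latency to Black minus White faces), and the relation between the two is estimated with ordinary least squares.

```python
# Hypothetical sketch of a difference-score regression: per-participant ERP
# amplitude differences predict per-participant response-latency differences.
# All numbers below are synthetic illustrations, not data from the study.
import numpy as np

rng = np.random.default_rng(0)
n_participants = 30

# Synthetic predictor: N200 amplitude difference (White minus Black), in microvolts.
n200_diff = rng.normal(loc=1.0, scale=0.5, size=n_participants)

# Synthetic outcome: response-latency advantage for Black faces
# (RT to White minus RT to Black), in ms, constructed so that larger
# N200 differences go with larger latency advantages.
rt_diff = 20 * n200_diff + rng.normal(scale=10, size=n_participants)

# Ordinary least-squares fit of rt_diff on n200_diff (design matrix with intercept).
X = np.column_stack([np.ones(n_participants), n200_diff])
beta, *_ = np.linalg.lstsq(X, rt_diff, rcond=None)
intercept, slope = beta
print(f"slope = {slope:.1f} ms per microvolt")
```

A positive fitted slope in this sketch corresponds to the reported pattern: the more a participant's N200 favored White over Black faces, the faster that participant's race-task responses to Black relative to White faces.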
Finally, the inclusion of the spatial location task with the blurred images extends past research on race and emotion by showing that race and emotion effects on ERPs reflect something about the perception of race and emotion information themselves, and not just the low-level physical features that may covary with race and emotion information. This conclusion is supported first by the absence of race or emotion effects in the ERP components recorded during the spatial location task. In addition, the morphology of the waveforms was very different. It was particularly notable that the P200 and N200 obtained when the stimuli could readily be perceived as faces were not elicited by the blurred images.
Researchers interested in social perception have just begun to investigate how cues relevant to social inferences are perceived in combination. This is a particularly important question when applied to face perception given the many different types of information available from a single face. Looking across the different measures we assessed, our results suggest that perceivers very quickly process both race and emotion information, even when it is not explicitly task relevant. Each of these types of information appears to be processed independently and in parallel. Moreover, the psychological processes governing perception at different points in time may differ, starting first with a coarse vigilance process (seen in the N100 and P200) that eventually changes to reflect differences in individuation (seen in the N200). Analyses comparing the ERP and response latency results suggest that individual variation in the vigilance processes may be more important in making explicit emotion categorization judgments whereas individual variation in the individuation processes may be more important in making explicit race categorization judgments. While effects are relatively more obligatory at the earliest stages of processing, explicit task goals begin to affect responses by at least 445 ms (the mean latency of the P300), and continue into explicit judgments. There is also evidence of asymmetry between the race and emotion effects, with race effects occurring regardless of task (albeit in different directions), but emotion effects being confined at these later points in processing to instances where emotion was explicitly task relevant.
This is the first examination to our knowledge of the processing of both race and emotion cues over time, using both explicit measures and implicit neurological indices. We found support for the assertion that these cues are processed quickly and independently at virtually all stages of processing. These findings are a step toward understanding how social category information is processed in combination, providing an initial framework from which to proceed. Future face perception models should strive to include this valuable and rich information that we extract from faces every day in order to fully capture all facets of face perception.