J Exp Psychol Hum Percept Perform. Author manuscript; available in PMC Apr 1, 2011.
PMCID: PMC2852199
NIHMSID: NIHMS171913
The impact of facial emotional expressions on behavioral tendencies in females and males
Eva-Maria Seidel,ab* Ute Habel,ab Michaela Kirschner,c Ruben C. Gur,d and Birgit Derntlabc
aDepartment of Psychiatry and Psychotherapy, RWTH Aachen University, Aachen, Germany
bJülich Aachen Research Alliance, Translational Brain Medicine, Germany
cInstitute for Clinical, Biological and Differential Psychology, Faculty of Psychology, University of Vienna, Vienna, Austria
dNeuropsychiatry Division, Department of Psychiatry, University of Pennsylvania and the Philadelphia Veterans Administration Medical Center, Philadelphia, PA, United States
* Correspondence and requests for material should be addressed to Eva-Maria Seidel, MSc: Department of Psychiatry and Psychotherapy, University of Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany. Tel: +49-241-8089354, Fax: +49-241-8082401, eseidel@ukaachen.de
Emotional faces communicate both the emotional state and behavioral intentions of an individual. They also activate behavioral tendencies in the perceiver, namely approach or avoidance. Here, we compared more automatic motor responses with more conscious rating responses to happy, sad, angry and disgusted faces in a healthy student sample. Happiness was associated with approach and anger with avoidance. However, behavioral tendencies in response to sadness and disgust were more complex. Sadness produced automatic approach but conscious withdrawal, probably influenced by interpersonal relations or personality. Disgust elicited withdrawal in the rating task, whereas no significant tendency emerged in the joystick task, probably driven by expression style. Based on our results, it is highly relevant to further explore actual reactions to emotional expressions and to differentiate between automatic and controlled processes, since emotional faces are used in many kinds of studies. Moreover, our results highlight the importance of gender-of-poser effects when emotional expressions are applied as stimuli.
Keywords: Behavioral Tendencies, Approach, Avoidance, Emotional Expression, Gender
Facial emotional expressions are salient social cues in everyday interaction. Behavioral data suggest that human facial expressions communicate both the emotional state of the poser and behavioral intentions or action demands to the perceiver (Horstmann, 2003). Facial emotional expressions are used in various fields of research, e.g., social cognition, psychology, and neuroscience. Much work has been done on how we process these facial signals and how we recognize their emotional meaning (e.g., Adolphs, 2006). However, it is also highly important to objectify how the perceiver of an emotional display responds behaviorally. How do we actually react to displays of various emotional states by an opponent? What kind of behavior do these social signals trigger in the receiver? A better understanding of the effects on the perceiver may add important insights to the interpretation of any findings related to these kinds of social stimuli. Moreover, some studies divided emotional expressions into negative and positive cues based on theoretical speculation without directly testing their impact on behavior, thus lacking justification for this categorization on the valence dimension.
Regarding behavioral tendencies, two opposite poles of human behavior and motivation, approach and avoidance, are most pertinent. Gray's theory (Gray, 1982) of a behavioral approach (BAS) and a behavioral inhibition system (BIS) has been examined most extensively. This model supposes two antipodal motivational systems: one appetitive (approach) and one aversive (avoidance) both forming the basis of human behavior (Puca, Rinkenauer, & Breidenstein, 2006).
In their review of historical conceptualizations of approach and avoidance motivation, Elliot and Covington (2001) concluded that automaticity in evaluating incoming stimuli and generating an adequate behavioral response poses an evolutionary advantage. In humans, however, this link might at least be influenced by other, more conscious, motivational mechanisms, such as relationship and mood state. In accordance with the emotion-motivation model of Lang, Bradley and Cuthbert (1990), Bargh (1997) hypothesized that these systems build an interface between perception and action and are directly activated by perceived stimuli. Following this assumption, responsiveness of the BIS and BAS triggers behavioral tendencies of approach or avoidance and associated emotions.
Studies investigating behavioral tendencies mostly followed Cacioppo, Priester and Berntson (1993), who demonstrated that affective evaluation of neutral stimuli can be influenced by motor manipulations such as isometric arm flexion and extension. Thus, neutral Chinese ideographs are rated more positively during arm flexion and vice versa. Therefore, it has been deduced that pushing a lever (extension) might be faster than pulling (flexion) in response to aversive stimuli and pulling is faster than pushing in response to appetitive stimuli. Chen and Bargh (1999) showed that both explicit categorization and implicit processing of positive and negative words directly elicit behavioral tendencies such that positive evaluations result in approach whereas negative evaluations produce avoidance behavior. Presenting novel graphic images, Duckworth, Bargh, Garcia and Chaiken (2002) also observed such direct behavioral consequences of an automatic evaluation. Using two different paradigms (proprioceptive and visual cues) to induce approach and avoidance illusions, Neumann and Strack (2000) showed that perceived approach or avoidance movements correspondingly facilitate valence categorization of positive and negative words. They also observed influences of exteroceptive illusions on non-affective judgments. This effect was not mediated by extremity of the presented words, indicating that approach and avoidance systems are activated in an all-or-nothing fashion.
Marsh, Ambady and Kleck (2005) investigated how facial expressions (fear and anger) affect the perceiver's behavior in an explicit categorization task. While anger expressions were associated with avoidance, fear expressions elicited approach tendencies. The authors hypothesized that expressing fear indicates submission rather than provoking threat, and serves to pacify the opponent (cf. Hess, Blairy, & Kleck, 2000). However, a preceding study questioned the assumed automatic association between evaluation and behavior (Rotteveel & Phaf, 2004). These authors failed to find any influence of the valence of facial expressions (happy and angry) on action tendencies when attention was drawn to non-affective features of the targets (gender discrimination). Utilizing an indirect joystick task (without classification of the expressions) in a sample of socially anxious individuals, Heuer, Rinck, and Becker (2007) observed pronounced avoidance tendencies in response to angry and happy faces in anxious patients but not in the control group. According to the authors, it seems that the non-anxious controls were able to ignore the task-irrelevant dimension of emotional content and responded only to the relevant dimension (puzzle versus face), while response behavior in the anxious group seemed to be significantly influenced by the emotional expression.
Several studies indicated that gender influences emotional processing (e.g., Hall & Matsumoto, 2004; Thayer & Johnson, 2000). Regarding effects of gender of poser, Marsh et al. (2005) observed that their participants responded faster to female than to male faces and most quickly to women expressing fear. On the other hand, Rotteveel and Phaf (2004) reported that their female sample reacted faster to male than to female faces, particularly to angry male faces.
Due to the heterogeneity of previous results, one major aim of the present study was to clarify behavioral reactions in response to four basic emotions (happiness, sadness, anger, disgust) thereby extending the findings of Marsh et al. (2005) who examined posed expressions of anger and fear. However, contrary to previous studies, we applied evoked facial emotional expressions due to their higher ecological validity and naturalness. Comparing the conflicting findings of Marsh et al. (2005), Rotteveel and Phaf (2004) as well as Heuer et al. (2007), it remains to be clarified whether emotional expressions automatically elicit approach or avoidance behavior and to which extent this is influenced by conscious processes (e.g., task instruction). Therefore, another major aim of this study was to elucidate possible discrepancies between more automatic and more controlled processes in behavioral tendencies to emotional expressions. To examine this, we added a rating task of the emotional expressions to compare the consciously reported tendencies (explicit) with the tendencies based on the reaction time differences (implicit), which reflects a more automatic evaluation.
Based on rating data of a large sample of adults, Horstmann (2003) reported that sad faces signal a request for help or comfort (approach), whereas disgust and anger faces communicate the request to go away (avoidance). The expression of happiness was perceived as an invitation to reciprocate smiles and to cooperate. Therefore, we hypothesize that facial expressions of sadness and happiness elicit approaching behavior whereas angry and disgusted faces initiate avoidance. However, comparing more automatic (joystick task) to more controlled (rating task) processes, we expect to elucidate differential processes in reactions to these diverging facial emotional expressions.
Finally, due to the inconsistencies in published reports on the effects of gender of poser, we also aimed at elucidating the influence of gender of poser and rater on behavioral tendencies by applying a gender-balanced task in a representative gender-balanced sample.
Participants
Fifty-five female and 49 male Caucasian Vienna University students participated in our study. We investigated Caucasian students to obtain a sample homogeneous in cultural background, age (males: mean = 25.08 (2.605); females: mean = 24.21 (2.562)) and level of education (years; males: mean = 17.18 (2.21); females: mean = 16.59 (2.00)), as these variables have been shown to influence facial emotional processing (e.g., Hess, Blairy, & Kleck, 2000; Calder et al., 2003). All participants had normal or corrected-to-normal vision. Male and female participants did not differ in age, t(99) = 1.70, ns, or years of education, t(99) = 1.42, ns.
Materials and Procedure
We utilized the same joystick task (using the ST290 PRO Joystick, Saitek, Munich, Germany) and reaction time software (DirectRT; Jarvis, 2004) as implemented and described in detail by Marsh et al. (2005). The joystick was placed on a rubber pad on the desk in front of the participants, who were instructed not to change the general position of the joystick. Furthermore, participants were instructed to move the lever of the joystick during the experiment only directly forward or backward as far and as quickly as possible. DirectRT software recorded the time (with a precision of 10 ms) after each stimulus presentation at which the lever reached its maximal point from the baseline position.
Participants viewed 24 colored photographs of evoked facial expressions displayed by Caucasian actors (balanced for gender) with the following four basic emotions: happiness, sadness, anger and disgust. All expressions were taken from a stimulus set that has been standardized and used repeatedly as neurobehavioral probes in neuropsychological studies (see Gur et al., 2002a for development of stimuli; Derntl, Kryspin-Exner, Fernbach, Moser, & Habel, 2008a; Derntl et al., 2008b; Fitzgerald, Angstadt, Jelsone, Nathan, & Phan, 2006; Habel et al., 2007).
We composed two different sets of pictures, one containing happiness and disgust and the other containing sadness and anger, based on our hypothesis that happiness and sadness would be associated with approach and anger and disgust with avoidance. Thus, we paired two possible antipodal emotions in these stimulus sets. Participants were shown the same 12 emotional expressions (e.g., six happy vs. six disgusted, or six sad vs. six angry) twice: in one run the instruction was to pull the lever in response to the approach-inducing emotion (e.g., happy) and to push it in response to the avoidance-inducing expression (e.g., disgust); in the second run the instruction was reversed. Therefore, we had four different conditions, which we presented in randomized order. To avoid confusion of instructions, which could increase mistakes if the reversed instruction for the same emotion pair followed directly, we randomized only the order of the emotional pairs across participants. For example, we presented pair A (happiness and disgust) with the “congruent” instruction (happy-pull and disgust-push), followed by pair B (sadness and anger) with the “incongruent” instruction, then pair A again with the “incongruent” instruction, and pair B with the “congruent” instruction.
Every stimulus was presented until the first response occurred but maximally for two seconds, followed by a black screen for two seconds. Before and between every emotional condition a joystick practice run was conducted instructing the participants to alternately push or pull the lever in response to an asterisk stimulus.
Additionally, we implemented a nine-point rating scale (ranging from +4 to -4), on which participants were asked to imagine standing face to face with the person and to explicitly rate their tendency to approach or avoid the person showing the particular emotional expression. Behavioral anchors for each scale point were the number of steps participants would take towards (+) or away (-) from the person expressing the particular emotion observed on the screen. Participants were told not to base their rating on the attractiveness or trustworthiness of the person but only on the emotional expression. This rating scale measured the explicit, conscious behavioral tendency, in contrast to the implicit, more automatic motor reaction in the joystick task.
The order of the explicit rating and the implicit joystick task was randomized across participants.
See Figure 1 for experimental setup of the two tasks.
Figure 1
Illustration of experimental setup for the joystick task (a) and the explicit rating paradigm (b).
Statistical Analysis
Statistical analyses were performed using SPSS (Statistical Packages for the Social Sciences, Version 14.0, SPSS Inc., USA).
Data of the joystick task were analyzed with a 2 (gender of participant) × 2 (gender of poser) × 4 (emotional expression) × 2 (lever direction) ANOVA with repeated measures on the averages of the log-transformed reaction times for correct responses to each type of emotional expression. We also computed Wilcoxon signed-rank tests to analyze whether the error frequencies differed as a function of lever direction for every type of emotional expression.
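As an illustration, a repeated-measures ANOVA of this kind can be sketched in Python with statsmodels' AnovaRM. The data below are simulated (20 hypothetical participants, arbitrary reaction-time distribution); note that AnovaRM only handles the within-subject factors (emotion, lever direction), not the between-subject factors gender of participant and gender of poser used in the full model.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
# Simulated log-transformed reaction times for 20 hypothetical participants
for subj in range(20):
    for emotion in ["happiness", "sadness", "anger", "disgust"]:
        for direction in ["push", "pull"]:
            rt_ms = rng.normal(700, 60)  # hypothetical raw RT in milliseconds
            rows.append({"subject": subj,
                         "emotion": emotion,
                         "direction": direction,
                         "log_rt": np.log(rt_ms)})
df = pd.DataFrame(rows)

# Repeated-measures ANOVA on the two within-subject factors;
# the table reports F, degrees of freedom, and p for each effect
res = AnovaRM(df, depvar="log_rt", subject="subject",
              within=["emotion", "direction"]).fit()
print(res.anova_table)
```

The resulting table lists the two main effects and their interaction (emotion, direction, emotion:direction), mirroring the hypothesized lever direction by emotional expression interaction tested in the paper.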
Data from the explicit rating task were analyzed with a 2 (gender of participant) × 2 (gender of poser) × 4 (emotional expression) ANOVA with repeated measures, with the emotion-specific averages of the ratings serving as dependent measures. For post-hoc multiple comparisons, p-values were corrected using the Bonferroni method; multiple post-hoc t-tests were Bonferroni-Holm corrected. For significant effects, effect-size estimates are reported: partial eta squared for the ANOVAs and Cohen's d for the post-hoc t-tests.
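The Bonferroni-Holm step-down correction applied to the post-hoc t-tests can be sketched with statsmodels (the four uncorrected p-values below are hypothetical placeholders, not the study's data):

```python
from statsmodels.stats.multitest import multipletests

# Hypothetical uncorrected p-values from four post-hoc t-tests
raw_p = [0.0001, 0.013, 0.046, 0.124]

# Holm's procedure: the smallest p is multiplied by m (here 4),
# the next by m-1, and so on, with monotonicity enforced
reject, p_holm, _, _ = multipletests(raw_p, alpha=0.05, method="holm")
print(p_holm)  # approximately 0.0004, 0.039, 0.092, 0.124
```

Compared with plain Bonferroni (every p multiplied by m), Holm's step-down procedure is uniformly more powerful while still controlling the family-wise error rate.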
Joystick task
Mean and standard errors of reaction times of pushing and pulling are illustrated in Figure 2a.
Figure 2
Mean reaction times (ms) of pushing and pulling a lever in response to emotional expressions (a) and mean reaction times of both pushing and pulling in response to male and female faces depicting emotional expressions (b).
The ANOVA showed a significant main effect for emotional expression, F(3,276) = 118.28, p < .001, η2 = 0.56, with fastest reactions to happiness followed by disgust, anger and sadness. A main effect of gender of poser, F(1,92) = 70.11, p < .001, η2 = 0.43, indicated that all participants responded more quickly to male than to female faces. There was no main effect of participant gender, F(1,92) = 1.306, ns, nor of lever direction, F(1,92) = 0.78, ns, the latter confirming balanced responding in the lever paradigm.
Notably, the hypothesized interaction between lever direction and emotional expression was significant, F(3,276) = 9.05, p < .001, η2 = .09, indicating differential direction for the polar emotions. The interaction between gender of poser and emotional expression reached significance, F(3,276) = 17.32, p < .001, η2 = 0.16, showing that response times differed for male and female posers depending on the specific emotion. Moreover, a trend for a gender of poser by lever direction interaction occurred, F(1,92) = 3.56, p = .062, η2 = .037, such that in response to female faces pulling was faster than pushing and in response to male faces pushing was faster than pulling.
Further elucidating the significant lever direction by emotional expression interaction, post-hoc Bonferroni-Holm-corrected paired-sample t-tests revealed that this interaction was significant for sadness, t(99) = 4.03, p < .001, d = 0.36, and happiness, t(98) = 2.52, p = .039, d = 0.25, indicating approach tendencies. Anger expressions tended to elicit avoidance behavior, t(99) = 2.02, p = .046 (corr. p = .092). For the expression of disgust the post-hoc comparison was not significant, t(98) = 1.55, p = .124.
Post-hoc Bonferroni-Holm-adjusted paired-sample t-tests to clarify the significant gender of poser by emotional expression interaction demonstrated that participants responded (irrespective of lever direction) more quickly to angry, t(96) = 8.72, p < .001, d = 0.73 and disgusted, t(97) = 6.61, p < .001, d = 0.44, male compared to female faces.
No other main effects or interactions of this ANOVA reached significance.
Mean and standard errors of response times (of both pushing and pulling) to male and female faces are presented in Figure 2b.
The Wilcoxon signed-rank tests on the error frequencies for each lever direction in response to every type of emotional expression did not reach significance, all ps > .40.
Rating of emotional expressions
Across all emotional expressions, 13.3% of participants each had a standard deviation greater than 2.5 in their explicit responses, 53.5% > 2, 75.8% > 1.8, and 92.9% > 1.5. This result indicates that the whole scale length was used to rate the explicit approach/avoidance reaction (see Figure 3 for an illustration of responses).
Figure 3
Scatterplot of explicit ratings (mean of ratings and a line indicating mean group ratings) illustrating that the whole scale length was used for the responses. Note that equal ratings of participants are covered by only one data point.
The 2 (gender of participant) × 2 (gender of poser) × 4 (emotional expression) ANOVA with repeated measures on the rating data of the emotional expressions revealed a significant main effect of emotional expression, F(3,297) = 405.13, p < .001, η2 = 0.804, with happy faces eliciting approaching behavior, whereas sadness, disgust and anger prompted avoidance. Neither a main effect for gender of poser, F(1,99) = 2.26, ns, nor for gender of participant, F(1,99) = 0.19, ns, emerged.
Furthermore, the analysis yielded a significant interaction between emotional expression and gender of poser, F(3,297) = 36.82, p < .001, η2 = 0.27, and a significant interaction between gender of poser and gender of participant, F(1,99) = 6.70, p = .011, η2 = 0.06.
Disentangling the significant gender of poser by gender of participant interaction revealed that women, t(51) = 2.78, p = .016, d = 0.32, rated female faces more positively than male faces.
Post-hoc Bonferroni-Holm-adjusted t-tests of the emotional expression by gender of poser interaction indicated that happy, t(100) = -4.74, p < .001, d = 0.37, and sad male faces, t(100) = -3.97, p < .001, d = 0.33, were rated more positively than the corresponding female faces, whereas angry, t(100) = 6.16, p < .001, d = 0.63, and disgusted female faces, t(100) = 5.54, p < .001, d = 0.56, were rated more positively than male faces. Mean and standard errors of the rating data across female and male posers are illustrated in Figure 4.
Figure 4
Results of the explicit rating task for expressions displayed by male and female actors, visualizing the significant emotional expression by gender of poser interaction.
The aim of the present study was to explore behavioral approach and avoidance tendencies in response to evoked expressions of four basic emotions displayed by male and female actors. Moreover, in contrast to previous studies (e.g., Marsh et al., 2005; Rotteveel & Phaf, 2004), we directly compared rather automatic with more conscious behavioral reactions to these salient emotional cues.
Emotional expressions
Data from the joystick task measuring rather implicit behavioral tendencies revealed that happiness activates the hypothesized behavioral approach system. This finding is in accordance with previous reports of happy faces communicating an invitation to cooperate (Horstmann, 2003).
Angry expressions elicited pronounced avoidance according to our rating data, but this tendency did not reach significance in the joystick data. However, Marsh et al. (2005) showed that angry faces facilitated avoidance behavior, and Horstmann (2003) concluded from his rating data that anger communicates the request to go away. This idea also has intuitive appeal, since in a social interaction anger is mostly elicited by a negative action of the opponent. Our explicit rating data are congruent with the notion that angry expressions activate the behavioral avoidance system, and the implicit joystick data point strongly in the same direction.
However, comparison of results for sadness in the joystick task (more automatic evaluation) and the rating task (conscious processing) revealed diverging tendencies. In the rating paradigm, participants indicated that their conscious preference was to keep their distance from the sad emotional face, indicating avoidance. In contrast, results of the joystick task revealed approaching behavior. This inconsistency is striking and has several important implications for studies using sad expressions as negative cues. Based on the joystick data, we assume that the automatic response to sadness is to approach the sad person, who is communicating a request for help (Horstmann, 2003). However, this does not imply that this first approach response is triggered by sadness being positively valenced for the perceiver. Sadness is a distress cue, which is supported by the fact that with enhanced cognitive processing our participants seemed to evaluate the conveyed emotional message more negatively and react with avoidance. We suppose an underlying conflict between the more automatic tendency to support others and the tendency to avoid distress cues, which might be shaped by social learning processes. Moreover, previous experiences may reinforce a tendency not to burden oneself with others' distress.
Generally, our findings seem somewhat contrary to the model of Lang et al. (1990), who proposed that the mere presence of an emotional stimulus will trigger the corresponding motivational system and thereby elicit an automatic behavioral response. However, Graziano, Habashi, Sheese and Tobin (2007) reported that perceived relationship is an important factor in helping behavior: in their study, kin received more help than nonkin. Therefore, we suppose that the non-familiarity of our stimulus faces influenced the avoidance tendencies in the conscious rating task. Notably, in the implicit joystick task non-familiarity did not seem to affect the automatic approach tendencies when facing a person in need. Moreover, analysis of the impact of personality traits on helping behavior, e.g., altruism (Wang & Wang, 2008) and agreeableness (Graziano et al., 2007), might further elucidate these inconsistencies.
According to our data, approach might be the automatic reaction to expressions of sadness but this tendency is interrupted and re-evaluated when higher conscious processes are involved, such as familiarity with the sad person, personality traits, etc. Hence, our data strongly support the significance of differentiating more controlled and more automatic processes of interpersonal behavior to gain new insights about basic mechanisms of social interaction.
Behavioral reactions to expressions of disgust also seem rather complex. While the joystick data were not significant, the rating data showed a significant association with the avoidance system. From an evolutionary point of view, disgust signals a request to avoid, e.g., the food just consumed. However, the recipient may himself or herself cause the disgust, for example through his or her appearance; in this case, the expression of disgust may signal the request to go away. Rozin, Lowery and Ebert (1994) showed differences between these two forms of disgust in facial expressions, such that food-offense disgust is associated with tongue extrusion whereas a raised upper lip signals individual-related disgust. Our unclear finding concerning behavioral tendencies may be due to the inclusion of both of these disgust expressions in our stimulus material. Food-offense disgust might lead to approach, whereas expressions of individual-related disgust may prompt avoidance by signaling that the person is not tolerated and should keep some distance.
Reaction times
Happy faces generally elicited the fastest responses, followed by disgust, anger and sadness. Happiness appears to be the emotion recognized best, which might have influenced the response times in the joystick task. Hence, our data support Duckworth et al. (2002), who also observed the fastest responses to positive words, probably due to their simple and unambiguous meaning. In addition, happy expressions are clearly characterized by the smiling mouth and were the only positive emotion presented.
Despite its low recognition accuracy in previous studies (e.g., Derntl et al., 2008a, b), disgust elicited rather quick categorization and behavioral responses. This might have been due to the pairing with happiness, enabling quick categorization of disgust, which is otherwise hard to recognize. However, from an evolutionary point of view, disgust is highly relevant for survival, communicating that food is inedible. A distinction between the subtypes of disgust expression (food-offense and individual-related disgust) may help differentiate its communicative role and the diverging behavioral responses it elicits in the perceiver.
Gender of poser
Although female and male participants did not differ in their behavioral tendencies, gender of poser seems to be an important influencing factor on both conscious and more automatic behavioral tendencies. In the explicit rating task, male sad and happy expressions were rated more positively than the corresponding female faces. In contrast, angry and disgusted male faces elicited faster responses in the joystick task and were rated more negatively than the corresponding female faces. This effect might be linked to the observed interaction of gender of poser and gender of rater, whereby women tended to evaluate female faces more positively than male faces in the rating task. Generally, it is assumed that men learn to hide helplessness and express aggression-related emotions such as anger more clearly, whereas women are reinforced for showing helplessness and suppressing anger (Fischer, 1993). Therefore, social learning may have turned male faces expressing negative emotions (e.g., anger and disgust) into more salient social cues, which communicate the explicit message to go away.
Limitations
There were several methodological limitations of our study that have to be discussed. Our participants saw the same pictures twice, which might have had an impact on both joystick and rating data. In particular, performing the rating task before the joystick task might have influenced the reactions in the joystick task, but we did not observe any difference between the two groups (with different sequences of the tasks). The fixed combinations (anger with sadness / disgust with happiness) in the presentation of the four emotional expressions might also have had an impact on our results. One can argue that combining possible antipodal emotions intensifies the effect by enhancing the possible valence difference in the direct comparison. Future studies should clarify this by combining different emotional expressions in a more unsystematic way.
Moreover, we used static emotional expressions despite their unrealistic character. In order to generate more ecologically valid results, the use of dynamic facial expressions or whole-body stimuli should be considered in future studies. Additionally, the neutral experimental context might have influenced the behavioral tendencies. Therefore, the implementation of a joystick task in a more realistic experimental situation (e.g., virtual reality) may help clarify approach and avoidance behavior.
Future prospects
Responding to a dimension other than emotional valence (e.g., gender, or puzzles versus faces; Rotteveel & Phaf, 2004; Heuer et al., 2007) measures more automatic behavioral tendencies than an explicit emotion categorization paradigm such as the one used in this study and by Marsh et al. (2005). Rotteveel and Phaf (2004) could not find any RT differences in response to the different emotional expressions in their approach-avoidance task. However, Heuer et al. (2007) reported a significant influence of emotional expressions on motor reactions, but only in the socially anxious group. Our paradigm aimed at eliciting actual motor responses to emotional expressions recognized as such, possibly increasing the impact of emotional expressions on behavioral tendencies. Further examination of indirect measures of behavioral tendencies in the joystick task is highly relevant to resolve these issues and to clarify whether the mere presence of an emotional expression is sufficient to elicit approach or avoidance responses.
In a recent comprehensive theoretical and experimental framework, Eder and Rothermund (2008) introduced the so-called evaluative coding theory, which conceptualizes approach and avoidance reactions as positively and negatively coded motor behaviors. They hypothesize that evaluative implications of semantic action labels and goals assign affective codes to motor responses on a representational level; if there is a match in the evaluative stimulus-response (S-R) combination, response facilitation is predicted. The evaluative coding theory assumes that evaluative response codes should have a greater influence in evaluative discriminations than in gender decisions about facial expressions, thereby offering a possible explanation for the conflicting findings of Rotteveel and Phaf (2004). Similarly, more subtle evaluative connotations of response labels should influence the response selection process only if they serve to discriminate the response alternatives (Ansorge & Wühr, 2004; Lavender & Hommel, 2007). However, this theory questions the assumed association of approach and avoidance reactions with motivational systems (e.g., Chen & Bargh, 1999) by linking the affective response activations to any motor behavior that relies on evaluative response codes. Nevertheless, it underlines the importance of an interdisciplinary discussion about the theoretical definitions and implications of different research directions examining S-R compatibility and behavioral tendencies.
Apart from theoretical speculations (e.g., Gray, 1982; Pickering & Gray, 1999), little is known about the neural correlates of the behavioral approach and avoidance systems. One functional imaging study of social-motivational behavior emphasized, for example, the role of the orbitofrontal cortex in the voluntary control of approach and avoidance reactions (Roelofs, Minelli, Mars, van Peer, & Toni, 2008). Further application of neuroimaging techniques is therefore warranted, especially for elucidating the neural substrates of the observed discrepancies between more automatic and more controlled processes.
Additionally, it would be informative to investigate implicit and explicit behavioral tendencies in clinical populations such as patients with depression, which is hypothesized to be characterized by an increased behavioral inhibition system (BIS) and a decreased behavioral activation system (BAS) (e.g., Gray, 1982; Kasch, Rottenberg, Arnow, & Gotlib, 2002). Heuer et al. (2007) reported interesting findings in a sample of socially anxious individuals, who responded with strong avoidance to both angry and happy facial expressions, but only in the indirect joystick task. When rating the valence of the emotional expressions, the socially anxious individuals did not differ significantly from non-anxious controls, further indicating the need to explore automatic and conscious behavioral tendencies separately.
Conclusions
In the context of earlier studies, our data indicate that happiness is strongly associated with approach and anger with avoidance, whereas behavioral tendencies in response to sadness and disgust are more complex. Sadness induces automatic approach but conscious avoidance. From an evolutionary point of view, it seems reasonable that sadness communicates a request for help and elicits approach towards the sender, but prior social experiences may lead to restraint. Disgust elicited clear avoidance behavior in the rating task, but reactions in the joystick task were ambiguous, probably influenced by expression style. The expression of disgust confers an evolutionary advantage as an important signal for avoiding noxious stimuli, but it can also be directed interpersonally at the perceiver, thereby eliciting avoidance.
Men and women seem to react similarly to emotional expressions, but there are differences in reactions to male and female faces, which may be due to differing socialization processes. This study demonstrates the importance of differentiating between more automatic and more controlled processes in social interaction, since we observed that perceivers can rate an expression as avoidance-eliciting while still showing pronounced automatic approach tendencies towards it. Based on these results, it is highly relevant to further explore actual reactions to emotional expressions, and to emotional stimuli in general, since they are widely used in many kinds of studies without a full understanding of the reactions they actually elicit in the perceiver. Such work may also help to clarify findings on the neural correlates of these emotions, for which approach-avoidance conflicts may explain divergent results. Moreover, integrating theoretical and experimental results from cognitive psychology research might further resolve existing inconsistencies. Future investigations of perceivers' reactions to emotional facial expressions could also focus on gender of poser effects and thereby enhance our understanding of gender-related social interaction.
Acknowledgments
We are grateful to two anonymous reviewers and Gernot Horstmann for their thoughtful comments on the first version of this manuscript. E.M.S. was financially supported by the Faculty of Medicine, RWTH Aachen University (START 690811 to B.D.), B.D. and U.H. were supported by the German Research Foundation (DFG, IRTG 1328, KFO 112) and R.C.G. was supported by the NIMH grant MH 60722.
Footnotes
The following manuscript is the final accepted manuscript. It has not been subjected to the final copyediting, fact-checking, and proofreading required for formal publication. It is not the definitive, publisher-authenticated version. The American Psychological Association and its Council of Editors disclaim any responsibility or liabilities for errors or omissions of this manuscript version, any version derived from this manuscript by NIH, or other third parties. The published version is available at www.apa.org/pubs/journals/xhp.
References
  • Adolphs R. Perception and emotion: How we recognize facial expressions. Current Directions in Psychological Science. 2006;15(5):222–226.
  • Ansorge U, Wühr P. A response-discrimination account of the Simon effect. Journal of Experimental Psychology: Human Perception and Performance. 2004;30:365–377.
  • Bargh JA. The automaticity of everyday life. In: Wyer RS, editor. Advances in social cognition. Vol. 10. Mahwah, NJ: Erlbaum; 1997. pp. 1–62.
  • Cacioppo JT, Priester JR, Berntson GG. Rudimentary determinants of attitudes: II. Arm flexion and extension have differential effects on attitudes. Journal of Personality and Social Psychology. 1993;65(1):5–17.
  • Calder AJ, Keane J, Manly T, Sprengelmeyer R, Scott S, Nimmo-Smith I, et al. Facial expression recognition across the adult life span. Neuropsychologia. 2003;41:195–202.
  • Chen M, Bargh JA. Consequences of automatic evaluation: Immediate behavioral predispositions to approach or avoid the stimulus. Personality and Social Psychology Bulletin. 1999;25(2):215–224.
  • Derntl B, Kryspin-Exner I, Fernbach E, Moser E, Habel U. Emotion recognition accuracy in healthy young females is associated with cycle phase. Hormones and Behavior. 2008a;53(1):90–95.
  • Derntl B, Windischberger C, Robinson S, Lamplmayr E, Kryspin-Exner I, Gur RC, et al. Facial emotion recognition and amygdala activation are associated with menstrual cycle phase. Psychoneuroendocrinology. 2008b;33:1031–1040.
  • Duckworth KL, Bargh JA, Garcia M, Chaiken S. The automatic evaluation of novel stimuli. Psychological Science. 2002;13(6):513–519.
  • Eder AB, Rothermund K. When do motor behaviors (mis)match affective stimuli? An evaluative coding view of approach and avoidance reactions. Journal of Experimental Psychology: General. 2008;137(2):262–281.
  • Elliot AJ, Covington MV. Approach and avoidance motivation. Educational Psychology Review. 2001;13(2):73–92.
  • Fischer A. Sex differences in emotionality: Fact or stereotype? Feminism & Psychology. 1993;3:303–318.
  • Fitzgerald DA, Angstadt M, Jelsone LM, Nathan PJ, Phan KL. Beyond threat: Amygdala reactivity across multiple expressions of facial affect. Neuroimage. 2006;30(4):1441–1448.
  • Gray JA. The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system. Oxford: Oxford University Press; 1982.
  • Graziano WG, Habashi MM, Sheese BE, Tobin RM. Agreeableness, empathy, and helping: A person × situation perspective. Journal of Personality and Social Psychology. 2007;93(4):583–599.
  • Gur RC, Sara R, Hagendoorn M, Marom O, Hughett P, Macy L, et al. A method for obtaining 3-dimensional facial expressions and its standardization for use in neurocognitive studies. Journal of Neuroscience Methods. 2002a;115:137–143.
  • Gur RC, Schroeder L, Turner T, McGrath C, Chan RM, Turetsky BE, et al. Brain activation during facial emotional processing. Neuroimage. 2002b;16(3, Pt. 1):651–662.
  • Habel U, Windischberger C, Derntl B, Robinson S, Kryspin-Exner I, Gur RC, et al. Amygdala activation and facial expressions: Explicit emotion discrimination versus implicit emotion processing. Neuropsychologia. 2007;45(10):2369–2377.
  • Hall JA, Matsumoto D. Gender differences in judgments of multiple emotions from facial expressions. Emotion. 2004;4(2):201–206.
  • Hess U, Blairy S, Kleck RE. The influence of facial emotion displays, gender, and ethnicity on judgments of dominance and affiliation. Journal of Nonverbal Behavior. 2000;24(4):265–283.
  • Heuer K, Rinck M, Becker ES. Avoidance of emotional facial expressions in social anxiety: The Approach-Avoidance Task. Behaviour Research and Therapy. 2007;45:2990–3001.
  • Horstmann G. What do facial expressions convey: Feeling states, behavioral intentions, or action requests? Emotion. 2003;3(2):150–166.
  • Jarvis BG. DirectRT research software (Version 2004) [Computer software]. New York: Empirisoft; 2004.
  • Kasch KL, Rottenberg J, Arnow BA, Gotlib IH. Behavioral activation and inhibition systems and the severity and course of depression. Journal of Abnormal Psychology. 2002;111(4):589–597.
  • Lang PJ, Bradley MM, Cuthbert BN. Emotion, attention, and the startle reflex. Psychological Review. 1990;97:377–395.
  • Lavender T, Hommel B. Affect and action: Towards an event-coding account. Cognition and Emotion. 2007;21:1270–1296.
  • Marsh AA, Ambady N, Kleck RE. The effects of fear and anger facial expressions on approach- and avoidance-related behaviors. Emotion. 2005;5(1):119–124.
  • Neumann R, Strack F. Approach and avoidance: The influence of proprioceptive and exteroceptive cues on encoding of affective information. Journal of Personality and Social Psychology. 2000;79(1):39–48.
  • Pickering AD, Gray JA. The neuroscience of personality. In: Pervin LA, John OP, editors. Handbook of personality: Theory and research. New York: Guilford Press; 1999. pp. 277–299.
  • Puca RM, Rinkenauer G, Breidenstein C. Individual differences in approach and avoidance movements: How the avoidance motive influences response force. Journal of Personality. 2006;74(4):979–1014.
  • Roelofs K, Minelli A, Mars RB, van Peer J, Toni I. On the neural control of social emotional behaviour. Social Cognitive and Affective Neuroscience. 2008;4(1):50–58.
  • Rotteveel M, Phaf RH. Automatic affective evaluation does not automatically predispose for arm flexion and extension. Emotion. 2004;4(2):156–172.
  • Rozin P, Lowery L, Ebert R. Varieties of disgust faces and the structure of disgust. Journal of Personality and Social Psychology. 1994;66(5):870–881.
  • Thayer JF, Johnson BH. Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology. 2000;41:243–246.
  • Wang CC, Wang CH. Helping others in online games: Prosocial behavior in cyberspace. CyberPsychology & Behavior. 2008;11(3):344–346.