Neuropsychologia. Author manuscript; available in PMC 2011 October 1.
Published in final edited form as:
PMCID: PMC2949471

Face-related ERPs are modulated by point of gaze


This study examined the influence of gaze fixation on face-sensitive ERPs. A fixation crosshair presented prior to face onset directed visual attention to upper, central, or lower face regions while ERPs were recorded. This manipulation modulated a face-sensitive component (N170) but not an early sensory component (P1). Upper and lower face fixations elicited enhanced N170 amplitude and longer N170 latency. Results expand upon extant hemodynamic research by demonstrating early effects at basic stages of face processing. These findings distinguish attention to facial features in context from attention to isolated features, and they inform electrophysiological studies of face processing in clinical populations.

Keywords: Social neuroscience, N170, event-related potential, electroencephalogram, EEG, face perception


Human face perception is a vital social function subserved by specialized brain mechanisms. Event-related potential (ERP) studies reveal a face-sensitive negative component peaking approximately 170 ms after viewing a face (N170; Bentin, Allison, Puce, Perez & McCarthy, 1996). Relative to other visual stimuli, the N170 elicited by faces tends to be larger in amplitude and shorter in latency. Because of its short latency, sensitivity to perturbations in face configuration (e.g., face inversion), and relative insensitivity to higher order features (e.g., identity and emotion), the N170 is hypothesized to mark structural encoding, an early stage of face perception (Eimer, 2000). Isolated face parts also evoke an N170, with eyes eliciting the greatest amplitude, followed by whole faces and then noses and mouths (Bentin et al., 1996). This differential responsiveness to the eyes and the accelerated developmental maturation of the N170 elicited by eyes (Taylor, Edmonds, McCarthy & Allison, 2001) have spurred speculation that the component may reflect activity in brain regions specifically subserving eye detection. With respect to N170 latency, intact faces evoke the shortest latencies, followed by eyes, then noses and mouths (Bentin et al., 1996).

Though previous ERP research has examined N170 response to isolated facial features, attention to facial features within the context of an intact face remains unexplored via electrophysiological methods. Functional magnetic resonance imaging research (fMRI) indicates that manipulating attention to the eyes and mouths modulates hemodynamic activity in face-related areas, such as the fusiform gyrus (FG), with attention to these regions most strongly activating the FG in typical adults (Morris, Pelphrey & McCarthy, 2007). These findings bear relevance to understanding face-related ERPs, as source estimation and co-recording of ERP and fMRI suggest neural generators of the N170 in FG (Itier & Taylor, 2004; Rossion, Joyce, Cottrell & Tarr, 2003; Sadeh, Podlipsky, Zhdanov & Yovel, 2010; Shibata et al., 2002).

The current study extends extant neuroimaging work by using electrophysiological methods to investigate the influence of point of gaze on face-related brain activity. This approach expands upon current understanding by (a) disentangling the influence of differential attention to eyes versus mouths, (b) examining attention to faces in a more natural presentation, i.e., without a superimposed fixation crosshair, and (c) applying the temporal resolution of ERP to specifically examine modulation at the earliest stages of face perception. ERPs were recorded as typical adults viewed neutral faces without a fixation crosshair and with a variable fixation crosshair directing attention to the upper, central, or lower face. We hypothesized that visual fixations to the eyes would elicit an N170 with enhanced amplitude and shorter latency relative to other fixation positions. Though previous research has not consistently revealed face-selective effects at an earlier sensory component, the P1, we explored this ERP component to determine whether gaze manipulation effects might be exerted through low-level visuoperceptual mechanisms. We did not predict P1 modulation by point of gaze.


Participants included 28 typically developing adults enrolled in an ongoing ERP study in the Developmental Electrophysiology Laboratory at the Yale Child Study Center. Participants were screened by self-report for current or historical brain injury or disease and for normal or corrected-to-normal visual acuity. Thirteen participants' data were excluded from analysis due to equipment failure (n=1), visual impairment (n=1), or excessive EEG artifact (n=11). The final sample included 15 individuals (7 females, 8 males; mean age 22.9 years; all right-handed). All procedures were conducted with the understanding and written consent of participants and with approval of the Human Investigation Committee at the Yale School of Medicine consistent with the 1964 Declaration of Helsinki.

Stimuli consisted of 204 distinct grayscale digital images of neutral faces (102 male, 102 female; drawn from the Center for Vital Longevity Face Database; Minear & Park, 2004) and houses. All stimuli were presented in frontal view and at a standardized viewing size (10.6° by 8.1°) on a uniform black background. Faces were cropped within an oval frame to remove non-face features as per Gronenschild, Smeets, Vuurman, van Boxtel & Jolles (2009).

Stimuli were presented on a 51 cm color monitor (75 Hz, 1024×768 resolution) with E-Prime 1.2 software (Schneider, Eschman & Zuccolotto, 2002) at a viewing distance of 91 cm in a sound-attenuated room with low ambient illumination. EEG was recorded continuously at 250 Hz using NetStation 4.3. A 128-lead Geodesic Sensor Net 200 (Electrical Geodesics Incorporated; Tucker, 1993) was fitted on the participant's head according to the manufacturer's specifications. Impedances were kept below 40 kΩ.

To manipulate visual attention, the vertical position of a fixation crosshair directed attention to either the (a) upper, (b) central, or (c) lower regions of the stimulus; a fourth presentation condition used (d) no fixation crosshair (See Figure 1). The horizontal position of the crosshair was held constant, and vertical position was equiprobable and varied randomly among trials. The fixation crosshair was presented for a time period varying randomly between 500 and 1000 ms and was followed immediately by a randomly selected face or house stimulus for 500 ms and then a 700 ms blank screen. To monitor attention, participants pressed a button upon detection of randomly interspersed red-shaded target stimuli (10 houses, 10 faces, 20 crosshairs). All participants detected at least 95% of targets, and target trials were excluded from analysis. The 15 minute experiment included 428 total trials in random sequence: 204 distinct faces (10 preceded by target crosshair), 204 distinct houses (10 preceded by target crosshair), 10 face targets, 10 house targets.
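The trial structure above can be sketched as follows; the stimulus counts come from the text, while the function name and bookkeeping fields are illustrative assumptions (Python, not the authors' E-Prime script):

```python
import random

# Sketch of the 428-trial sequence described above. Counts (204 faces,
# 204 houses, 10 target crosshairs per category, 10 red targets per
# category) follow the text; everything else is an assumption.
def build_trial_sequence(seed=0):
    rng = random.Random(seed)
    crosshairs = ["upper", "central", "lower"]
    trials = []
    # 204 distinct faces and 204 distinct houses; 10 of each are preceded
    # by a red target crosshair. A target crosshair must be visible, so
    # only those trials exclude the "absent" fixation condition.
    for category in ("face", "house"):
        for i in range(204):
            target_cross = i < 10
            fixation = rng.choice(crosshairs if target_cross
                                  else crosshairs + ["absent"])
            trials.append({"stimulus": category, "fixation": fixation,
                           "target_crosshair": target_cross})
    # 10 red-shaded face targets and 10 house targets for the detection task.
    for category in ("face", "house"):
        trials += [{"stimulus": category + "-target",
                    "fixation": rng.choice(crosshairs + ["absent"]),
                    "target_crosshair": False} for _ in range(10)]
    rng.shuffle(trials)
    return trials
```

Target trials (red stimuli and target crosshairs) would be flagged for exclusion from the ERP averages, mirroring the procedure above.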

Figure 1
Example face stimulus with overlaid crosshair displaying (a) upper, (b) central, and (c) lower positions. A fourth fixation condition entailed (d) no crosshair display. Note that, during the experiment, the crosshair preceded presentation of faces without ...

Data were averaged for each participant, digitally filtered (30 Hz low-pass), and transformed to correct for baseline shifts. The segmentation epoch was 100 ms before to 600 ms after stimulus onset. NetStation artifact detection settings were set to 200 μV for bad channels, 150 μV for eye blinks, and 100 μV for eye movements. Channels with artifacts on more than 25% of trials were marked as bad channels and replaced through spline interpolation. Segments that contained eye blinks, eye movement, or more than 10 bad channels were marked as bad and excluded. Participants with more than 25% bad channels were excluded from analysis. Data were averaged across six electrodes over the left (58, 59, 64, 65, 69, 70) and right (90, 91, 92, 95, 96, 97) lateral posterior scalp; electrodes were selected based on maximal observed amplitude of the N170 to faces and precedent (McPartland, Dawson, Webb, Panagiotides & Carver, 2004).
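The channel- and segment-level rejection rules above (200 μV bad-channel threshold, interpolation of channels bad on more than 25% of trials, exclusion of segments with more than 10 bad channels) can be sketched roughly as below. The array layout and function names are assumptions, and blink/eye-movement detection on ocular channels is omitted for brevity:

```python
import numpy as np

# Illustrative sketch of the bad-channel and bad-segment rules above;
# eeg is assumed to be (n_trials, n_channels, n_samples) in microvolts.
def reject_artifacts(eeg, bad_chan_uv=200.0, max_bad_channels=10,
                     bad_trial_frac=0.25):
    ptp = eeg.max(axis=2) - eeg.min(axis=2)   # peak-to-peak per trial/channel
    chan_bad = ptp > bad_chan_uv              # exceeds 200 uV -> bad channel
    # Channels bad on more than 25% of trials: replace via interpolation.
    interpolate = chan_bad.mean(axis=0) > bad_trial_frac
    # Segments with more than 10 bad channels: exclude from averaging.
    keep = chan_bad.sum(axis=1) <= max_bad_channels
    return keep, interpolate
```

In practice the interpolated channels would be rebuilt with spline interpolation over scalp positions before averaging, as the text describes.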

Time windows for ERP analysis were chosen by visual inspection of grand averaged data and confirmed for individual averages. Resultant time windows extended from 63–149 ms for the P1 and 103–203 ms for the N170. Peak amplitude and latency for each component were averaged across each electrode group for each participant and exported to SPSS for analysis (SPSS 16.0 for Windows, 2008). ERP parameters were analyzed using repeated measures ANOVA, with experimental condition (i.e., stimulus or fixation position) and hemisphere as within-subjects factors as per Picton and colleagues (2000).
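Peak extraction within these windows amounts to finding the extremum of the averaged waveform between two latencies. A minimal sketch, assuming the 250 Hz sampling rate and 100 ms pre-stimulus baseline from the Method (function and variable names are illustrative):

```python
import numpy as np

# Sketch of peak picking within an analysis window; 250 Hz sampling and a
# 100 ms baseline follow the Method, all names here are assumptions.
def peak_in_window(erp, window_ms, sfreq=250.0, baseline_ms=100.0,
                   negative=False):
    """Return (peak amplitude, peak latency in ms) inside window_ms."""
    times = np.arange(erp.size) / sfreq * 1000.0 - baseline_ms
    mask = (times >= window_ms[0]) & (times <= window_ms[1])
    seg = erp[mask]
    idx = seg.argmin() if negative else seg.argmax()  # N170 is a negativity
    return seg[idx], times[mask][idx]

# e.g., N170 on a channel-group average: peak_in_window(avg, (103, 203),
#       negative=True); P1: peak_in_window(avg, (63, 149))
```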


Neural response to faces

To confirm the presence of a face-selective N170, faces and houses were compared across viewing conditions using univariate repeated measures ANOVA with condition (face/house) and hemisphere (left/right) as within-subjects factors. For amplitude, a main effect of condition [F(1,14)=31.11, p<.01, η²partial=.69, observed power=.99] indicated larger N170 amplitude to faces than to houses across hemispheres. Faces also elicited an N170 with shorter latency in the right hemisphere, as reflected in the omnibus model by a condition by hemisphere interaction [F(1,14)=5.84, p<.05, η²partial=.29, observed power=.61] and confirmed with a post-hoc paired t-test [t(14)=2.43, p<.05, η²partial=.30, observed power=.62]. No other significant effects for N170 latency or amplitude were detected [all p>.05, η²partial<.19]. Figure 2 displays waveforms to faces and houses.
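The reported effect sizes follow directly from each F ratio and its degrees of freedom; a quick consistency check (a sketch, not the authors' SPSS output):

```python
# Partial eta squared recovered from an F ratio and its degrees of freedom:
#   eta_p^2 = (F * df_effect) / (F * df_effect + df_error)
def partial_eta_squared(f_ratio, df_effect, df_error):
    return f_ratio * df_effect / (f_ratio * df_effect + df_error)

# Matches the values reported in the text:
#   F(1,14)=31.11 -> ~.69 (condition main effect, amplitude)
#   F(1,14)=5.84  -> ~.29 (condition x hemisphere interaction, latency)
```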

Figure 2
Grand averaged ERP waveforms depicting response to houses and faces. Data shown are averaged across the 12 electrodes of interest across hemispheres, and fixation conditions are collapsed.

Point of gaze and ERP response

To examine the potential influence of point of gaze on neural response to faces, separate univariate repeated measures ANOVAs with fixation position (upper/central/lower/absent) and hemisphere (left/right) as within-subjects factors were calculated for P1 and N170 latency and amplitude.


No significant effects were detected for P1 amplitude or latency [all p>.05, η²partial<.18].


For amplitude, a main effect of fixation position indicated that N170 amplitude varied with point of gaze across hemisphere [F(3, 42)=5.24, p<.01, η²partial=.27, observed power=.90]. Post-hoc paired t-tests revealed that both upper and lower fixations elicited larger N170 amplitudes than central fixation position [Upper: t(14)=4.16, p<.05, η²partial=.55, observed power=.97; Lower: t(14)=3.14, p<.05, η²partial=.41, observed power=.83] or no crosshair [Upper: t(14)=2.28, p<.05, η²partial=.27, observed power=.57; Lower: t(14)=2.78, p<.05, η²partial=.34, observed power=.70]. For latency, a main effect of fixation position indicated that N170 latency varied with point of gaze across hemisphere [F(3, 42)=4.52, p<.01, η²partial=.24, observed power=.85]. Post-hoc paired t-tests revealed that N170 latencies were shorter in the absent crosshair condition compared to the upper [t(14)=2.44, p<.05, η²partial=.30, observed power=.62] and lower conditions [t(14)=2.88, p<.05, η²partial=.37, observed power=.77]. No other significant effects for N170 latency or amplitude were detected [all p>.05, η²partial<.21]. Figure 3 displays waveforms elicited by each fixation position across hemisphere.
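The post-hoc comparisons above are ordinary paired t-tests over the 15 per-participant peak values; a minimal pure-Python version for readers without SPSS (illustrative only, not the authors' code):

```python
import math

# Paired t-test: t = mean(d) / sqrt(var(d)/n), with d the per-participant
# condition differences and df = n - 1 (here, df = 14 for n = 15).
def paired_t(x, y):
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1  # t statistic, degrees of freedom
```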

Figure 3
Grand averaged ERP waveforms depicting response to faces when point of gaze was directed to upper face, central face, lower face, and when point of gaze was not directed with a fixation crosshair. Data shown are averaged over the 12 electrodes of interest ...


The present study examined electrophysiological brain response to faces while manipulating point of gaze. Prior hemodynamic work has shown increased activity in face-related brain regions associated with viewing of the internal features of the face; research has not yet addressed whether directing attention to eyes versus mouths differentially modulates activity, whether modulation is contingent upon superimposition of a crosshair on a face, or whether modulation occurs at early stages of face processing. Consistent with prior research, the current study detected an N170 elicited by faces that, relative to non-face stimuli, was enhanced in amplitude over lateral posterior scalp and was shorter in latency over right hemisphere electrodes.

Though low-level visual perception, indexed by the P1, did not vary as a function of fixation position, both the amplitude and latency of the N170 were modulated by point of gaze on the face. As predicted, fixation to upper face regions, corresponding to the eyes, was associated with enhanced N170 amplitude relative to fixations to the central face or undirected fixations in the absence of a crosshair. An unpredicted effect indicated that fixation to lower face regions, corresponding to the mouth, was also associated with enhanced N170 amplitude relative to central or undirected face fixations. Comparable amplitude effects for both eyes and mouths in the context of an intact face contrast with prior ERP research on isolated facial features, which showed relatively enhanced N170 amplitude to eyes but not mouths (Bentin et al., 1996). Because enhanced amplitude was also observed for mouths in this study, results are inconsistent with interpretations of the N170 as representative of an eye detector or the combination of face-sensitive and eye-sensitive neurons. Attention to parts of the face in the context of an intact face elicited processing distinct from attention to features in isolation; this concords with research implicating the N170 as a marker for configural perception (Rossion et al., 2000).

N170 latency was also modulated by fixation position. When no crosshair was presented, the N170 occurred at a shorter latency than when fixations were directed to the eyes or mouth. This may reflect a differential disengagement effect. When no crosshair was presented (viewing a blank screen before onset of the face stimulus), disengagement was unnecessary; attention to a percept would initiate with face onset. In contrast, when a crosshair preceded a face, viewers needed to disengage from the crosshair and subsequently engage with the face. This latency effect may also reflect the contribution of cognitive processing demands, as crosshairs were also targets in the attention monitoring task.

With respect to both N170 amplitude and latency, attention to eyes and mouths elicited comparable brain activity. Differential responsiveness to eyes and mouths relative to mid-face may reflect the relatively increased social salience of these features or the quantity of visual information present in these regions of the face. Alternatively, this selective responsiveness to both eyes and mouths, the regions of the face in which motion is most likely to occur, may reflect the contribution of brain regions subserving biological motion, such as the superior temporal sulcus (STS; Pelphrey, Morris, Michelich, Allison & McCarthy, 2005). The N170 is responsive to facial movements (Puce, Smith & Allison, 2000) and has been localized to several sources in inferotemporal cortex, including the STS (Itier & Taylor, 2004). Results are consistent with the hypothesis that the N170 reflects an amalgamation of temporally overlapping activity in FG and STS (and, potentially, other proximal locations in inferotemporal cortex; Itier, Alain, Sedore & McIntosh, 2007). This would account for subtle distinctions in response patterns of the N170 and hemodynamic FG activation despite broad similarity in terms of face-selectivity, as well as evidence of direct correlations between N170 amplitude and latency and FG activation (Iidaka, Matsumoto, Haneda, Okada & Sadato, 2006).

Our findings bear relevance to studies of face perception in clinical populations, such as those with autism spectrum disorder (ASD). Individuals with ASD have been shown to exhibit atypical brain activity during face perception (McPartland et al., 2004; Schultz et al., 2000), which has been hypothesized to simply reflect differential viewing strategies, such as looking to the mouth instead of the eyes (Dalton et al., 2005). Though present results make clear that fixation does indeed influence ERP response to faces, our findings of similar response patterns for eye and mouth fixations suggest that differential tendencies to look at eyes or mouths would not account for differences between individuals with ASD and typical counterparts. It is, of course, also possible that manipulation of point of gaze affects individuals with ASD in an entirely distinct way, in which case the current results from typical individuals might not apply. Ongoing work in our laboratory is examining the influence of point of gaze on face perception in ASD.

Understanding of the neural mechanisms of face perception also informs studies of social cognition. ERP and hemodynamic indices of face perception have also been implicated as markers of experience with non-face visual stimuli (Gauthier, Skudlarski, Gore & Anderson, 2000; Tanaka & Curran, 2001); it is not yet understood to what degree affective experience (e.g., developing a fondness for objects with which one frequently engages) contributes to these neural mechanisms. Likewise, FG activation is observed in social-perceptual tasks that do not involve faces (Schultz et al., 2003). Neuropeptides involved in social behavior also suggest interrelationships among face perceptual systems and broader social cognition; administration of oxytocin has been shown to increase eye gaze and pro-social sentiments in typical and clinical populations (Guastella, Mitchell & Dadds, 2008; Andari, Duhamel, Zalla, Herbrecht, Leboyer & Sirigu, 2010). Moving forward, studies of face perception should consider neurochemistry and must frame findings in light of overarching social behavior.

There are several limitations of the current study. Though the target detection task ensured that participants viewed the crosshair with high consistency, visual attention was not directly measured. We are now concurrently recording EEG and visual attention as measured by an eye-tracker to more precisely quantify the relationship between visual attention to faces and electrophysiological brain activity. The size of face stimuli and the relative proximity of crosshair positions in the current study leave open the possibility that foveation to each position resulted in overlapping visual fields. Though our significant results suggest distinct visual perception for each crosshair position, it is possible that effects would be more pronounced with larger faces that permitted greater separation of crosshair positions; this work is currently in progress in our laboratory. The results of the current study might also be attributable to differences in composition of the overall visual field induced by the fixation manipulation. Upper and lower fixations would both likely include more non-face portions of the screen than central fixation. If this interfered with configural perception, as in inversion, it could account for the delayed and enhanced N170 in these conditions. This explanation, however, fails to account for differential effects for central versus absent crosshair. Because eyes and mouths are necessarily at the periphery of a visual stimulus relative to central fixation, this possibility cannot be directly examined without rearranging the face or making it sufficiently large that the visual field remains entirely on the face during eye or mouth fixation; both of these options would likely introduce artifacts that would obscure the influence of gaze on the N170 (Bentin et al., 1996; Rousselet, Husk, Bennett & Sekuler, 2005). 
Finally, we employed grayscale stimuli; though this is consistent with the majority of ERP studies of face perception, color images may have yielded different results. For example, ERPs might be differentially elicited by attention to mouths with more realistic, pink lips.

This study demonstrated that face-sensitive ERPs are modulated by looking patterns to intact faces. Rather than the eye-specific effects observed in ERP studies of isolated features, attention to the eyes and mouths similarly influenced neural response at the earliest stages of face perception. These results may reflect the influence of richness of visual information, social salience, or the contribution of brain regions subserving biological motion perception. This was the first electrophysiological study to manipulate visual attention during face perception, and the findings underscore the importance of ERP's temporal precision in understanding social perceptual processes.


This work was supported by the Anna Freud Centre-University College London-Yale Child Study Center program (CC), the Gustavus and Louise Pfeiffer Research Foundation (LCM), NIMH R03 MH079908 (JM), NIMH K23MH086785 (JM), a NARSAD Young Investigator Award (JM), and CTSA Grant Number UL1 RR024139 (JM, LCM) from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), and NIH roadmap for Medical Research (USA). Its contents are solely the responsibility of the authors and do not necessarily represent the official view of NCRR or NIH. The authors acknowledge the practical and conceptual contributions of Christopher Bailey, Robert Schultz, Kevin Pelphrey, and Cora Mukerji.




  • Andari E, Duhamel J, Zalla T, Herbrecht E, Leboyer M, Sirigu A. Promoting social behavior with oxytocin in high-functioning autism spectrum disorders. PNAS. 2010;107(9):4389–4394.
  • Bentin S, Allison T, Puce A, Perez E, McCarthy G. Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience. 1996;8(6):551–565.
  • Dalton KM, Nacewicz BM, Johnstone T, Schaefer HS, Gernsbacher MA, Goldsmith HH, et al. Gaze fixation and the neural circuitry of face processing in autism. Nature Neuroscience. 2005;8(4):519–526.
  • Eimer M. The face-specific N170 component reflects late stages in the structural encoding of faces. Neuroreport. 2000;11(10):2319–2324.
  • Gauthier I, Skudlarski P, Gore J, Anderson A. Expertise for cars and birds recruits brain areas involved in face recognition. Nature Neuroscience. 2000;3(2):191–197.
  • Gronenschild EH, Smeets F, Vuurman EF, van Boxtel MP, Jolles J. The use of faces as stimuli in neuroimaging and psychological experiments: a procedure to standardize stimulus features. Behavior Research Methods. 2009;41(4):1053–1060.
  • Guastella A, Mitchell PB, Dadds MR. Oxytocin increases gaze to the eye region of human faces. Biological Psychiatry. 2008;63:3–5.
  • Itier RJ, Taylor MJ. Source analysis of the N170 to faces and objects. Neuroreport. 2004;15(8):1261–1265.
  • Itier RJ, Alain C, Sedore K, McIntosh AR. Early face processing specificity: it's in the eyes! Journal of Cognitive Neuroscience. 2007;19(11):1815–1826.
  • McPartland J, Dawson G, Webb SJ, Panagiotides H, Carver LJ. Event-related brain potentials reveal anomalies in temporal processing of faces in autism spectrum disorder. Journal of Child Psychology and Psychiatry. 2004;45(7):1235–1245.
  • Minear M, Park DC. A lifespan database of adult facial stimuli. Behavior Research Methods, Instruments & Computers. 2004;36:630–633.
  • Morris JP, Pelphrey KA, McCarthy G. Controlled scanpath variation alters fusiform face activation. Social Cognitive and Affective Neuroscience. 2007;2(1):31–38.
  • Pelphrey KA, Morris JP, Michelich CR, Allison T, McCarthy G. Functional anatomy of biological motion perception in posterior temporal cortex: an fMRI study of eye, mouth and hand movements. Cerebral Cortex. 2005;15(12):1866–1876.
  • Picton TW, Bentin S, Berg P, Donchin E, Hillyard SA, Johnson R Jr, et al. Guidelines for using human event-related potentials to study cognition: recording standards and publication criteria. Psychophysiology. 2000;37(2):127–152.
  • Puce A, Smith A, Allison T. ERPs evoked by viewing facial movements. Cognitive Neuropsychology. 2000;17(1–3):221–239.
  • Rossion B, Gauthier I, Tarr MJ, Despland P, Bruyer R, Linotte S, et al. The N170 occipito-temporal component is delayed and enhanced to inverted faces but not to inverted objects: an electrophysiological account of face-specific processes in the human brain. Neuroreport. 2000;11(1):69–74.
  • Rossion B, Joyce CA, Cottrell GW, Tarr MJ. Early lateralization and orientation tuning for face, word, and object processing in the visual cortex. Neuroimage. 2003;20(3):1609–1624.
  • Rousselet GA, Husk JS, Bennett PJ, Sekuler AB. Spatial scaling factors explain eccentricity effects on face ERPs. Journal of Vision. 2005;5(10):755–763.
  • Sadeh B, Podlipsky I, Zhdanov A, Yovel G. Event-related potential and functional MRI measures of face-selectivity are highly correlated: a simultaneous ERP-fMRI investigation. Human Brain Mapping. 2010 Feb 2. [Epub ahead of print]
  • Schneider W, Eschman A, Zuccolotto A. E-Prime user's guide. Pittsburgh: Psychology Software Tools Inc.; 2002.
  • Schultz RT, Gauthier I, Klin A, Fulbright RK, Anderson AW, Volkmar F, et al. Abnormal ventral temporal cortical activity during face discrimination among individuals with autism and Asperger syndrome. Archives of General Psychiatry. 2000;57(4):331–340.
  • Schultz RT, Grelotti DJ, Klin A, Kleinman J, Van der Gaag C, Marois R, et al. The role of the fusiform face area in social cognition: implications for the pathobiology of autism. Philosophical Transactions of the Royal Society of London B: Biological Sciences. 2003;358(1430):415–427.
  • Shibata T, Nishijo H, Tamura R, Miyamoto K, Eifuku S, Endo S, et al. Generators of visual evoked potentials for faces and eyes in the human brain as determined by dipole localization. Brain Topography. 2002;15(1):51–63.
  • SPSS for Windows, Version 16.0.2. Chicago: SPSS, Inc.; 2008.
  • Tanaka JW, Curran T. A neural basis for expert object recognition. Psychological Science. 2001;12(1):43–47.
  • Taylor MJ, Edmonds GE, McCarthy G, Allison T. Eyes first! Eye processing develops before face processing in children. Neuroreport. 2001;12(8):1671–1676.
  • Tucker DM. Spatial sampling of head electrical fields: the geodesic sensor net. Electroencephalography and Clinical Neurophysiology. 1993;87(3):154–163.