In the present study, we have shown that single neurons in the macaque ventrolateral prefrontal cortex (VLPFC) respond differentially to changes in face-view/head orientation. Neurons in the VLPFC responded selectively to the identity of the face presented (human or macaque), to the view of the face/head, or to both identity and face-view. Neurons that were affected by the identity of the face most often showed an increase in firing in the second part of the stimulus period. Neurons that were selective for face-view were typically most responsive to the forward face stimuli (0° and 30° rotation). Our data indicated that the human forward face (0°) was decoded better than any other facial stimulus and also contained the most information relative to the other face-views. Furthermore, neurons that preferred the forward views (0° and 30°) were auditory responsive.
The preference for forward face-view by prefrontal neurons is interesting since the forward face-view conveys a variety of social signals that prefrontal neurons might encode. First, direct gaze is considered a social threat for monkeys, especially when it comes from a human (Kalin and Shelton, 1989
). Thus, the neurons that responded best to the forward-view faces, especially the human face, may have done so because these stimuli were perceived as threatening, whereas other face-views may have been less likely to evoke a response because they were less threatening. In addition to the perceived threat of direct gaze, forward face-view stimuli also provide the most complete view of a number of communication-relevant facial features, including the eyes and mouth, which are the most frequently viewed elements of a face, even in macaque monkeys (Haith et al., 1977
; Gothard et al., 2009). In fact, the appearance or disappearance of a salient feature is thought to underlie face-view preferences (Liu and Chaudhuri, 2002
; Stephan and Caine, 2007
). In Perrett et al. (1985), the forward face-view evoked responses most frequently, even when salient features such as the eyes or the mouth were obscured. In the current study, the 0° and the off-center 30° face-views were most frequently the preferred views. The 30°, or ¾, face-views have the advantage of conveying a 3-D image of the nose and face shape, which are additional pieces of information that are valuable in recognition (Logie et al., 1987
). Some studies that have examined the neural response to face-view have found an advantage in recognition for the ¾ face-view (Logie et al., 1987
; Van der Linde and Watson, 2010
) and have shown activation by this view compared to others in a number of cortical regions including the inferior frontal gyrus (Kowatari et al., 2004
). The 30° view is also less threatening than direct gaze. Thus, prefrontal neurons that integrate many details and features of face information, including angle of gaze, position of the mouth, and position of the ears, for the purpose of evaluating emotional expression or processing identity, would receive the greatest amount of information from the 0° and 30° face-view stimuli, where these features are most visible.
Face-view and gaze are important elements in communication since attention to the face is a prerequisite for most verbal and non-verbal communication. An area of the brain that is involved in the process of social communication would be likely to receive information about both auditory and visual cues. In the current study, we found an association between face-view and auditory responsiveness. The neurons that preferred forward views (0° and 30°) were responsive to auditory stimuli, whereas the neurons that preferred other views, or that had no preference, were not auditory responsive. We have previously shown that most auditory neurons in VLPFC are multisensory and that these multisensory neurons prefer face-vocalization combinations over non-face/non-vocalization stimuli (Sugihara et al., 2006
). The findings here, that such multisensory neurons are specialized for the forward face-view, affirm a role for the non-human primate VLPFC in communication rather than simply in generalized audiovisual integration, since a forward face-view coupled with acoustic stimuli is most common during communication.
Prefrontal neurons were also affected by the identity of the faces in our study. Most identity-responsive cells responded to the human face stimuli, although some cells preferred the monkey faces. Across the population, the human forward face evoked a response that was significantly different from all other stimuli and was decoded best among the face-view responsive cells. Nonetheless, it is difficult to completely separate the factors of face-view and identity since we used only two different identity faces; the response to the 0° human face could, in fact, be due to either factor. However, previous studies have noted clear differences in how monkeys view monkey versus human faces, which could support the results shown here. For example, monkeys show differences in their ability to recognize monkey versus human faces and use different perceptual strategies (Gothard et al., 2009
; Parr et al., 1999
). Scanpath analysis shows that monkeys examine the eye region of monkey faces more than the eye region of human faces and show a novelty preference for monkey but not human faces (Gothard et al., 2009).
Importantly, the effect of identity was temporally specific in VLPFC neurons, as more neurons showed an effect of identity in the late part of the stimulus period than in the earlier part. One explanation is that the effect of identity may develop more slowly than that of face-view, since it involves the integration and processing of information about many features, as well as memories, before enough information is accumulated for recognition. This longer processing time may appear as the late response in our data. In contrast, the earlier effect of face-view could be due to a rapid response to a salient feature such as the eyes. Differential temporal processing of different aspects of faces was demonstrated for inferotemporal neurons in the influential study by Sugase et al. (1999)
. In their study, global information about the general visual category (object, human face, monkey face) was conveyed in the earliest part of the response while fine information about identity or expression was conveyed in the late part of the neuronal response, similar to the results presented here.
In other studies, identity and facial expression or face-view have been localized to separate temporal lobe regions. Most often, identity and the physical features that define identity have been found to evoke responses in neurons within the inferotemporal cortex (Hasselmo et al., 1989
; Young and Yamane, 1992
; Eifuku et al., 2004
). In contrast, facial expression and face-view sensitive neurons have been localized to the cortex in the superior temporal sulcus, including the superior temporal polysensory area (Hasselmo et al., 1989
; Eifuku et al., 2004
). In their study of face-view in the STS, Perrett and colleagues (1985), using stimuli similar to those shown here, in which gaze and head orientation changed together, showed that different views of the face (frontal, profile, tilted upwards or downwards) maximally activated different populations of neurons in the STS. The integrated processing by the entire population can then account for all gaze directions. Freiwald and Tsao (2010) used a combined single-unit neurophysiology and fMRI approach to localize face-responsive cells and patches in the temporal lobe, which were then tested with eight different face-views. Their results indicate an increase in invariance to head orientation as information proceeds from ML/MF to AL and further to AM. The view invariance of these face-patches suggests a role for these areas in identity processing (Freiwald and Tsao, 2010
). In contrast to the population processing of gaze and the view-invariant responses in the face-cell patches of the temporal lobe, the neurons in VLPFC described here respond best to forward gaze (at 0° or 30°), which suggests a different role in face processing for this region. Furthermore, the fact that forward face-view selectivity occurs in auditory responsive neurons points to a role in communication or integrative processes. This is also suggested by the convergence of anatomical projections from face and vocalization processing areas in VLPFC.
Face processing areas in inferotemporal cortex and the STS have reciprocal connections with the frontal lobe so that identity, facial expression, and face-view information can easily converge on prefrontal neurons. Anatomical studies in non-human primates have noted connections between area 45 in ventral prefrontal cortex and inferotemporal cortex areas TE and TEO (Webster et al., 1994
; Barbas, 1992
). There are also dense reciprocal projections between the dorsal bank of the STS and orbital and lateral prefrontal cortex (Romanski et al., 1999a
; Petrides and Pandya, 2002
; Saleem et al., 2008
; Diehl et al., 2008
). Ventral prefrontal cortex also receives projections from the amygdala, which has face (Leonard et al., 1985
; Gothard et al., 2007
), gaze (Tazumi et al., 2010
; Hoffman et al., 2007
) as well as face-vocalization (Kuraoka and Nakamura, 2007
) neurons. Direct and averted gaze in non-human primates are clear indicators of threat and submission, respectively, and it is not surprising that a region such as the amygdala, which is involved in signaling fear or vigilance (Davis and Whalen, 2001
; LeDoux, 2007
) should show a change in neural activity with direct or averted gaze. Since there are connections between the VLPFC with the amygdala (Carmichael and Price, 1995
; Barbas, 2000
), information concerning the emotional aspects of gaze likely reaches VLPFC neurons and could explain some of the selectivity for the direct face-view shown in the current study.
A network of brain regions devoted to the processing of identity and face-view information would thus include the amygdala, the STS, and inferotemporal cortex. These areas can send specific information about various aspects of facial identity, facial expression, and face-view to VLPFC. While some of these areas may be specialized for the processing of faces, others are multisensory and have also been shown to respond to auditory information. These areas, together with vocalization processing regions in the auditory association cortex that send projections to VLPFC (Romanski et al., 1999b
), would provide a rich array of face and vocalization information to the frontal lobe and would allow VLPFC to integrate identity, expression, gaze, and vocalization information during social communication and other executive processes.