Hear Res. Author manuscript; available in PMC 2010 December 1.
Published in final edited form as:
PMCID: PMC2810529

Multimodal activity in the parietal cortex


Goal-directed behavior can be thought of as dynamic links between sensory stimuli and motor acts. Neural correlates of many of the intermediate events of both auditory and visual goal-directed behaviors are found in the posterior parietal cortex. Here, we review studies that have focused on how neurons in the lateral intraparietal area (area LIP) differentially process auditory and visual stimuli. Together, these studies suggest that area LIP contains a modality-dependent representation that is highly dependent on behavioral context.

Keywords: parietal cortex, rhesus, attention, salience, movement planning, multimodal

Goal-directed behavior is characterized by the flexible mapping of sensory stimuli onto actions (Snyder, 2000). A single stimulus can be arbitrarily mapped onto different actions. Alternatively, different stimuli can be mapped onto the same action. The dynamic quality of these links is the key to goal-directed behavior since it allows humans and other animals to respond adaptively to different environmental scenarios and not just through reflexive loops.

If goal-directed behavior is viewed as dynamic links, we encounter computational questions as to how the nervous system forms, maintains, and alters these links. For example, at any moment in time, we are bombarded with a variety of stimuli from different modalities. How do we choose which of these many stimuli will be the endpoint (cause) of an action? How do we choose which to ignore? Also, the mapping between stimuli and actions is often not one-to-one. Stimuli from different sensory modalities, for example, may elicit the same action. If you have just robbed a bank, the sound of police sirens or the sight of police lights may elicit the same action: run! In other situations, however, the same stimulus can elicit different actions. The sound of the same police sirens may elicit a different action than “run!” if you own the bank that has been robbed. Finally, the valence of a stimulus, the manner in which it is categorized, or even motivational state might affect which action is chosen or whether an action is even selected (Russ et al., 2007).

One cortical region that plays an important part in both auditory and visual goal-directed behavior is the parietal cortex. Parietal activity reflects many of the intermediate processes between sensation and action that are essential for goal-directed behavior.

Indeed, in functional subdivisions of the posterior parietal cortex of rhesus monkeys, such as the lateral intraparietal area (area LIP) and the medial intraparietal area (see below for more details on these parietal areas), neurons are modulated by attention and salience (Bisley et al., 2003; Cohen et al., 2004b; Colby et al., 1999; Gifford III et al., 2004; Goldberg et al., 2002; Gottlieb et al., 2005; Gottlieb et al., 1998; Ipata et al., 2008; Kusunoki et al., 2000; Powell et al., 2000), response selection (Baldauf et al., 2008; Cohen et al., 2004a; Gnadt et al., 1988; Platt et al., 1997; Snyder et al., 2000), coordinate transformations (Andersen et al., 1997; Buneo et al., 2008; Mullette-Gillman et al., 2005; Mullette-Gillman et al., In Press; Sabes et al., 2002; Snyder et al., 1998; Stricanne et al., 1996), and decisions to “act” on a sensory stimulus (Bendiksby et al., 2006; Klein et al., 2008; McCoy et al., 2005; Platt et al., 1999; Schall, 2004; Shadlen et al., 2001; Sugrue et al., 2004; Sugrue et al., 2005). These variables are also reflected in the activity of the human parietal cortex (Binkofski et al., 1998; Connolly et al., 2003; Connolly et al., 2000; DeSouza et al., 2000; Dinstein et al., 2008; Huettel et al., 2006; Jancke et al., 2001; Karnath et al., 2001; Kastner et al., 2000; Kawashima et al., 1996; Levy et al., 2007; Luna et al., 1998; Rushworth et al., 2001; Schluppeck et al., 2005; Schluppeck et al., 2006; Silver et al., 2005; Tosoni et al., 2008). 
Moreover, and key for this review, both non-human and human studies have demonstrated that the parietal cortex is activated during tasks that use auditory or visual stimuli (Ahveninen et al., 2006; Bremmer et al., 2001a; Bremmer et al., 2001b; Bushara et al., 2003; Bushara et al., 1999; Butters et al., 1970; Cohen et al., 2000; Cohen et al., 2004b; Crottaz-Herbette et al., 2004; Cusack et al., 2000; Deouell et al., 2000a; Deouell et al., 2000b; Gifford III et al., 2004; Griffiths et al., 1998; Grunewald et al., 1999; Karabanov et al., 2009; Linden et al., 1999; Mazzoni et al., 1996; Mullette-Gillman et al., 2005; Phan et al., 2000; Schlack et al., 2005; Stricanne et al., 1996; Warren et al., 2002).

Here, we focus on area LIP and review a series of studies on the auditory and visual properties of LIP neurons. First, though, we briefly review the anatomy of the posterior parietal cortex and area LIP.

Anatomy of the Posterior Parietal Cortex and Area LIP

In both humans and monkeys (Andersen, 1987; Hyvärinen, 1982), the posterior parietal cortex forms a circuit with sensory areas and with frontal, temporal, and limbic areas. The posterior parietal cortex contains a number of functional subdivisions including area 7a, area 7b, the medial temporal area, the medial superior temporal area, the medial intraparietal area, and the lateral intraparietal area (LIP) (Andersen, 1987; Andersen et al., 1989).

Visual and auditory input to area LIP is well-described. Area LIP is classically considered to be part of the dorsal visual processing stream (Ungerleider et al., 1982). Consequently, LIP neurons receive input from neurons in extrastriate visual areas as well as cortical and brainstem areas involved with saccadic eye movements (Asanuma et al., 1985; Blatt et al., 1990; Lynch et al., 1985). The main source of auditory input to area LIP is the temporoparietal cortex (Divac et al., 1977; Hyvärinen, 1982; Pandya et al., 1969), which is part of the parabelt of auditory cortex (Kaas et al., 1998). Neurons in this region of the cortex are sensitive to the location of a sound (Leinonen et al., 1980; Pandya et al., 1985) and, consequently, may support any role that area LIP has in auditory spatial processing. Auditory input to area LIP may also arise via input from multimodal cortical areas (Baizer et al., 1991; Blatt et al., 1990; Seltzer et al., 1991). Indirect auditory input may also reach area LIP through its connections with the frontal cortex or the superior colliculus; neurons in these brain areas receive input from auditory areas and have responses that are modulated by auditory stimuli (Andersen et al., 1985; Andersen et al., 1990; Barbas, 1988; Barbas et al., 1981; Cavada et al., 1989; Hackett et al., 1999; Harting et al., 1980; Kaas et al., 1998; Romanski et al., 1999; Russo et al., 1994; Schall et al., 1995; Stanton et al., 1995; Vaadia, 1989).

We should note that since the prefrontal and parietal cortices are highly interconnected (Andersen et al., 1985; Barbas et al., 1981; Petrides et al., 1984; Schall et al., 1995; Stanton et al., 1995), it is not surprising that many of the aforementioned intermediate processes that are found in the parietal cortex are also found in the prefrontal cortex. Indeed, it is hypothesized that parietal, prefrontal, and other cortical areas form a functional loop that potentially creates and maintains internal representations and possibly transforms these representations into motor acts (Barash et al., 2006; Chafee et al., 2000).

LIP Responses to Auditory Stimuli with Competing Visual Stimuli

In one of the earliest examinations of LIP auditory activity, Grunewald, Linden, and Andersen (Grunewald et al., 1999) tested how training history modulates LIP responses. They first asked “naïve” monkeys to fixate a light while an auditory stimulus was presented at different peripheral locations. In their study, naïve monkeys were those that had not been operantly trained to associate an auditory stimulus with an action and a subsequent reward; these monkeys, though, had been trained to associate a gaze shift to a visual-stimulus location for a juice reward. Grunewald et al. (1999) found that when the monkeys fixated a light, LIP activity was modulated by the locations of visual stimuli but was not modulated by auditory stimuli at comparable locations. However, following auditory training, LIP neurons were modulated by the locations of auditory stimuli when the monkeys participated in this visual-fixation task.

This pattern of results was interpreted to suggest that laboratory-based auditory training induced some form of “oculomotor salience” on the auditory stimuli. That is, the auditory responses reflected the fact that the monkeys learned to associate the stimuli with an action (shift gaze toward their location) to receive a reward (Assad, 2003; Gottlieb et al., 1998; Kusunoki et al., 2000; Linden et al., 1999).

However, if we hypothesize that LIP neurons reflect stimulus salience (Kusunoki et al., 2000), a different interpretation of the Grunewald et al. study emerges. Namely, auditory stimuli are salient stimuli but, in naïve monkeys, LIP auditory responses may be suppressed by a more salient central fixation light. Why would the fixation light suppress the auditory responses? One possibility is that, in primates, visual stimuli are inherently more salient than auditory stimuli (Posner et al., 1976). A second possibility is that the fixation light was highly salient because the monkeys were required to maintain their gaze at it to receive a reward, whereas the auditory stimulus was irrelevant for successful completion of the task.

If this suppression hypothesis is correct, a natural prediction would be that if the central light is removed, LIP neurons should be modulated by auditory stimuli even in the absence of auditory training. To test this prediction, we (Gifford III et al., 2004) recorded LIP activity while naïve monkeys listened to auditory stimuli and either (1) fixated a central light or (2) fixated in the dark without a central light. Consistent with our prediction, when the central light was removed, LIP neurons were modulated by auditory stimuli and had spatially limited response fields.

We interpreted these data to suggest that LIP neurons reflect the relative salience of stimuli. In naïve monkeys, LIP neurons do not code auditory stimuli when they compete with a more salient visual stimulus. But, they do code these same stimuli when a competitive stimulus (i.e., the central fixation light) is removed from the environment. A recent human electrophysiological and behavioral study is consistent with these ideas: when visual stimuli are removed from the environment, neural processes are “freed” that allow for enhanced auditory processing (Haroush et al., 2008).

Differences Between Auditory and Visual Sensitivity

Why do LIP neurons respond differently to auditory and visual stimuli? One hypothesis is that the differences in LIP responsivity simply reflect differences between the physical properties of these two classes of stimuli. One solution to this issue would be to equate the auditory and visual stimuli along a particular psychophysical axis. However, this solution is not as straightforward as it would appear, since there is no principled way to determine which axis (e.g., bandwidth, intensity, etc.) is the "proper" one along which to equate the two stimuli (Spence et al., 2000).

A second hypothesis is that modality-dependent activity reflects differences between the auditory and visual perceptual systems. For instance, the visual system has a higher spatial acuity than the auditory system (Blauert, 1997; Brown et al., 1978a; Brown et al., 1978b; Brown et al., 1980; Recanzone et al., 1998; Wightman et al., 1993). In contrast, in other situations such as when timing information is critical or when visual-spatial information becomes unreliable, aspects of an auditory stimulus may be more perceptually salient than aspects of a visual stimulus (Alais et al., 2004; Fendrich et al., 2001; Shams et al., 2000; Welch et al., 1980). Consequently, depending on the nature of the task, information provided by the visual perceptual system may be more or less salient than information provided by the auditory system. This idea has been formalized computationally within the context of Bayesian inference, where multi-modal percepts are formed as a function of the more "reliable" stimulus (Burr et al., 2006; Deneve et al., 2004; Ma et al., 2008).
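The reliability-weighted scheme just described can be sketched numerically. In the standard maximum-likelihood formulation of cue combination, each cue is weighted by its inverse variance; the function below is an illustrative sketch of that general principle, not a model taken from any of the cited studies, and its parameter values are assumptions:

```python
import math

def ml_integrate(x_vis, sigma_vis, x_aud, sigma_aud):
    """Reliability-weighted (maximum-likelihood) fusion of a visual and an
    auditory estimate of the same location. Each cue is weighted by its
    inverse variance, so the more reliable cue dominates the percept."""
    w_vis = 1.0 / sigma_vis ** 2
    w_aud = 1.0 / sigma_aud ** 2
    x_hat = (w_vis * x_vis + w_aud * x_aud) / (w_vis + w_aud)
    # the fused estimate is always at least as precise as either cue alone
    sigma_hat = math.sqrt(1.0 / (w_vis + w_aud))
    return x_hat, sigma_hat

# With spatially precise vision (sigma = 1 deg) and imprecise audition
# (sigma = 5 deg), the fused location is captured toward the visual cue,
# as in the ventriloquist effect (Alais et al., 2004).
x_hat, sigma_hat = ml_integrate(x_vis=0.0, sigma_vis=1.0, x_aud=10.0, sigma_aud=5.0)
```

When timing or blur makes vision the unreliable cue (large `sigma_vis`), the same formula lets audition dominate instead, consistent with the task-dependent salience described above.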

A third non-exclusive hypothesis is that modality-dependent LIP activity reflects differences between the relationship of a stimulus and the cognitive or behavioral requirements of a task. For example, during saccade tasks, LIP neurons may respond more to visual stimuli than to auditory stimuli (Linden et al., 1999) because, as discussed above, in the context of planning eye movements, visual stimuli are more salient (Kusunoki et al., 2000; Toth et al., 2002) than auditory stimuli. This possibility may exist despite the fact that the behavior of the monkeys is similar (e.g., saccade to a stimulus location). Indeed, similar outward behavior does not eliminate the possibility that animals use different cognitive strategies and different neural circuits to solve the analogous versions of the same task (Gibson et al., 1997).

To test these hypotheses and, in particular, the latter hypothesis, we had monkeys participate in the predictive-cueing task (Cohen et al., 2004b). The predictive-cueing task is a version of Posner's cueing paradigm (Posner, 1980) that tests the allocation of attention. In our version of the task, monkeys shifted their gaze to a visual target whose location was predicted by the location of an auditory or visual cue. As found in other cueing tasks (Driver et al., 1998; Posner, 1980), the monkeys' response latencies were shorter when the cue predicted the target location than when the cue was not predictive. More importantly, this "predictive effect" was the same regardless of whether the cue was an auditory cue or a visual cue. This result suggested that, within the context of the task, the auditory cue and visual cue had the same task-related salience. However, despite this equivalence, the mean firing rate of LIP neurons was significantly higher when the visual cue was presented than when the auditory cue was presented.

This result suggests that the link between a stimulus and a task is not the only determinant of the level of LIP activity (Cohen et al., 2004b). If it were the only factor, then LIP activity in response to the auditory cue and visual cue should have been the same during the predictive-cueing task. But is this difference between auditory and visual activity absolute or is it task dependent? To address this question, we had monkeys participate in a memory-guided saccade task; in this task, monkeys made saccades to the remembered locations of auditory stimuli or visual stimuli. We found that visual activity during the memory-guided saccade task was comparable to that seen during the predictive-cueing task. However, auditory activity during the memory-guided saccade task was substantially lower than that seen during the predictive-cueing task.

Since auditory activity was task dependent, it suggests that changes in LIP activity reflect differences between the behavioral or task context (salience) of an auditory stimulus (Kusunoki et al., 2000). A similar result was reported by Linden et al. (1999) who observed that LIP neurons were modulated more by an auditory stimulus that signaled the location of a future eye movement than by an auditory stimulus that did not signal the location of a future eye movement. Overall, it appears that the factors that contribute to LIP activity are complex and depend both on the stimulus modality and the behavioral task. However, these modality-dependent differences can be minimized (and hence differences between the representation of salience may be minimized) when auditory and visual stimuli are explicitly equated as they were in the predictive-cueing task.

Spatial Sensitivity and Representations of Auditory and Visual Activity

So far, we have discussed differences between how area LIP represents the more cognitive attributes of auditory and visual stimuli. In this section, we focus on more fundamental properties of the parietal cortex and area LIP: namely, the auditory and visual spatial properties of LIP neurons (Mullette-Gillman et al., 2005; Mullette-Gillman et al., In Press), which underlie area LIP's role in forming extra-personal spatial representations of attention, salience, and other related factors.

Three important points emerge from the Mullette-Gillman studies (2005; In Press). First, bimodal LIP neurons code comparable regions of auditory and visual space. Second, the reference frame of LIP activity during the presentation of a visual or an auditory stimulus is complex and differs from that previously reported (Cohen et al., 2002; Stricanne et al., 1996). That is, LIP neurons do not preferentially code visual and auditory spatial information in a canonical eye- or head-centered reference frame1, respectively. Instead, the reference frame of both visual and auditory activity is best described as lying on a continuum from eye-centered to head-centered representations. Within this continuum, the auditory and visual reference frames of bimodal LIP neurons are in rough correspondence. Finally, even when the monkeys were actually saccading to the location of the auditory or visual target, LIP activity was still not represented in an eye-centered reference frame but continued to exhibit this head-to-eye-centered continuum. We did, though, find that between the time of target presentation and the time of the saccade there was a slight improvement in the correspondence between visual and auditory signals: auditory signals shifted their coordinates to become slightly more similar to those of the visual signals. Although the rationale as to why the nervous system uses this continuum of reference frames is not known, the continuum is seen in other parietal regions (Schlack et al., 2005) and other cortical systems (Batista et al., 2007; Wu et al., 2006; Wu et al., 2007), suggesting that it may be a ubiquitous computational format (Pouget et al., 1997). Another possibility is that trying to define the reference frame of cortical (or brainstem) activity in a format that is based on sensory properties, muscle forces, etc. may be too simplistic or even ultimately incorrect (Batista et al., 2007; Mullette-Gillman et al., In Press).
That is, there is no a priori reason to believe that just because LIP neurons are involved in some aspect of spatial processing, they have to represent that information in a code that is dependent on the sensory stimulus or the eventual motor act (Batista et al., 2007).
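The continuum of reference frames can be illustrated with a toy tuning model. The `shift` parameter below is a hypothetical quantity introduced purely for illustration: 0 yields a purely head-centered neuron (eye position is ignored), 1 a purely eye-centered neuron (the response field moves fully with the eyes), and intermediate values the partial shifts of the kind reported for LIP:

```python
import math

def lip_response(target_head_deg, eye_pos_deg, pref_deg, shift, width_deg=10.0):
    """Toy Gaussian tuning curve whose reference frame lies on a continuum.
    shift = 0.0 -> head-centered; shift = 1.0 -> eye-centered; intermediate
    values model the mixed frames observed in area LIP (illustrative only)."""
    # effective stimulus coordinate for this cell: eye position is subtracted
    # in proportion to how eye-centered the cell is
    coord = target_head_deg - shift * eye_pos_deg
    return math.exp(-0.5 * ((coord - pref_deg) / width_deg) ** 2)

# A head-centered cell (shift = 0) responds identically at any eye position,
# whereas an eye-centered cell's (shift = 1) response field moves with the eyes,
# and an intermediate cell's field shifts by only a fraction of the eye movement.
```

Under this sketch, estimating `shift` from how far a neuron's response field moves when the eyes move is one way to place a cell on the head-to-eye-centered continuum.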


An underlying premise of many LIP studies is that area LIP's role in goal-directed behavior is the same regardless of whether the stimulus is auditory or visual (Cohen et al., 2002). However, as we have discussed above, there are substantial functional differences between auditory and visual LIP activity. Moreover, because these studies used a head-fixed preparation, a significant confound may have been introduced into the monkeys' behavior and the subsequent neural activity (Populin, 2006). Consequently, it is reasonable to hypothesize that these differences do not relate entirely to differences in stimulus saliency, which suggests an alternative role for auditory signals in area LIP.

We hypothesize that area LIP does not perform the same computations on auditory and visual signals (e.g., reflect their salience as a function of firing rate). Instead, auditory and visual signals may play substantially different roles. One possibility is to consider that area LIP is essentially a visual structure and that one of the main functions of auditory signals, and perhaps other extra-visual signals, is to modulate/enhance the computations that area LIP performs on visual stimuli. As such, we suggest that future studies test how the combined integrative effect of simultaneous auditory and visual presentation (Alais et al., 2004; Stein et al., 1993) modulates LIP neurons using more behaviorally-relevant tasks (Populin, 2006). For instance, if there are multiple visual stimuli, a concurrent auditory stimulus at the location of one of the visual stimuli may increase its salience (Kusunoki et al., 2000) and allow for attentional shifts (Goldberg et al., 2002) or eye-movement plans (Snyder et al., 2000) toward its location. Area LIP's primary role then may be to create a visual representation of extra-personal space in which extra-visual signals are used to modulate these representations.


YEC was supported by grants from the NIDCD-NIH and the NIMH-NIH.


1A reference frame can be thought of as a set of axes that describes the location of an object. In the earliest stages of sensory processing, auditory, visual, and other sensory signals are coded in different reference frames. For example, describing the location of an auditory stimulus depends initially on the brain's capacity to correlate differences between the time of arrival and intensity of a sound at the two ears with a location, as well as the brain's ability to correlate the location-dependent filtering properties of ears/head with a sound location. Consequently, identifying the location of a sound depends on the location of the head relative to the location of the sound. This reference-frame is referred to as a “head-centered” reference frame. In contrast, describing the location of a visual stimulus depends initially on the pattern of light that falls on the retinas and the resulting pattern of activity in the photoreceptors. That is, an “eye-centered” reference frame.
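The interaural time difference mentioned in this footnote can be approximated with the classic Woodworth spherical-head formula; the head radius used below is an assumed average value, not a measurement from any of the studies reviewed here:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound_mps=343.0):
    """Woodworth approximation of the interaural time difference (ITD) for a
    far-field source at the given azimuth (0 deg = straight ahead). The extra
    path to the far ear around a spherical head is r * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return head_radius_m * (theta + math.sin(theta)) / speed_of_sound_mps

# A source directly to one side (90 deg) yields an ITD of roughly 650
# microseconds, one of the cues the brain correlates with lateral position
# in a head-centered frame.
```

Because the formula depends only on the source's angle relative to the head, it makes concrete why a purely ITD-based location estimate is inherently head-centered rather than eye-centered.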



  • Ahveninen J, Jaaskelainen IP, Raij T, Bonmassar G, Devore S, Hamalainen M, Levanen S, Lin FH, Sams M, Shinn-Cunningham BG, Witzel T, Belliveau JW. Task-modulated “what” and “where” pathways in human auditory cortex. Proc Natl Acad Sci USA 2006 [PubMed]
  • Alais D, Burr D. The ventriloquist effect results from near-optimal bimodal integration. Curr Biol. 2004;14:257–62. [PubMed]
  • Andersen RA. Inferior parietal lobule function in spatial perception and visuomotor integration. In: Plum F, Mountcastle VM, Geiger SR, editors. The Handbook of Physiology, Section 1: The Nervous System, Volume V: Higher Functions of the Brain. American Physiological Society; Bethesda, MD: 1987. pp. 483–518.
  • Andersen RA, Gnadt JW. Posterior parietal cortex. Reviews of Oculomotor Research. 1989;3:315–35. [PubMed]
  • Andersen RA, Asanuma C, Cowan WM. Callosal and prefrontal associational projecting cell populations in area 7A of the macaque monkey: a study using retrogradely transported fluorescent dyes. J Comp Neurol. 1985;232:443–55. [PubMed]
  • Andersen RA, Asanuma C, Essick G, Siegel RM. Corticocortical connections of anatomically and physiologically defined subdivisions within the inferior parietal lobule. J Comp Neurol. 1990;296:65–113. [PubMed]
  • Andersen RA, Snyder LH, Bradley DC, Xing J. Multimodal representation of space in the posterior parietal cortex and its use in planning movements. Ann Rev Neurosci. 1997;20:303–30. [PubMed]
  • Asanuma C, Andersen RA, Cowan WM. The thalamic relations of the caudal inferior parietal lobule and the lateral prefrontal cortex in monkeys: divergent cortical projections from cell clusters in the medial pulvinar nucleus. J Comp Neurol. 1985;241:357–81. [PubMed]
  • Assad JA. Neural coding of behavioral relevance in parietal cortex. Curr Opin Neurobiol. 2003;13:194–7. [PubMed]
  • Baizer JS, Ungerleider LG, Desimone R. Organization of visual inputs to the inferior temporal and posterior parietal cortex in macaques. J Neurosci. 1991;11:168–90. [PubMed]
  • Baldauf D, Cui H, Andersen RA. The posterior parietal cortex encodes in parallel both goals for double-reach sequences. J Neurosci. 2008;28:10081–9. [PMC free article] [PubMed]
  • Barash S, Zhang M. Switching of sensorimotor transformations: antisaccades and parietal cortex. Novartis Found Symp. 2006;270:59–71. discussion 71-4, 108-13. [PubMed]
  • Barbas H. Anatomic organization of basoventral and mediodorsal visual recipient prefrontal regions in the rhesus monkey. J Comp Neurol. 1988;276:313–42. [PubMed]
  • Barbas H, Mesulam MM. Organization of afferent input to subdivisions of area 8 in the rhesus monkey. J Comp Neurol. 1981;200:407–31. [PubMed]
  • Batista AP, Santhanam G, Yu BM, Ryu SI, Afshar A, Shenoy KV. Reference frames for reach planning in macaque dorsal premotor cortex. J Neurophysiol. 2007;98:966–83. [PubMed]
  • Bendiksby MS, Platt ML. Neural correlates of reward and attention in macaque area LIP. Neuropsychologia. 2006;44:2411–20. [PubMed]
  • Binkofski F, Dohle C, Posse S, Stephan KM, Hefter H, Seitz RJ, Freund HJ. Human anterior intraparietal area subserves prehension: a combined lesion and functional MRI activation study. Neurology. 1998;50:1253–9. [PubMed]
  • Bisley JW, Goldberg ME. Neuronal activity in the lateral intraparietal area and spatial attention. Science. 2003;299:81–6. [PubMed]
  • Blatt GJ, Andersen RA, Stoner GR. Visual receptive field organization and cortico-cortical connections of the lateral intraparietal area (area LIP) in the macaque. J Comp Neurol. 1990;299:421–45. [PubMed]
  • Blauert J. Spatial hearing: The psychophysics of human sound localization. MIT Press; Cambridge: 1997.
  • Bremmer F, Schlack A, Duhamel JR, Graf W, Fink GR. Space coding in primate posterior parietal cortex. Neuroimage. 2001a;14:S46–51. [PubMed]
  • Bremmer F, Schlack A, Jon Shah N, Zafiris O, Kubischik M, Hoffman KP, Zilles K, Fink GR. Polymodal motion processing in posterior parietal and premotor cortex: A human fMRI study strongly implies equivalencies between humans and monkeys. Neuron. 2001b;29:287–296. [PubMed]
  • Brown CH, Beecher MD, Moody DB, Stebbins WC. Localization of primate calls by old world monkeys. Science. 1978a;201:753–4. [PubMed]
  • Brown CH, Beecher MD, Moody DB, Stebbins WC. Localization of pure tones by old world monkeys. J Acoust Soc Am. 1978b;63:1484–92.
  • Brown CH, Beecher MD, Moody DB, Stebbins WC. Localization of noise bands by Old World monkeys. J Acoust Soc Am. 1980;68:127–32. [PubMed]
  • Buneo CA, Batista AP, Jarvis MR, Andersen RA. Time-invariant reference frames for parietal reach activity. Exp Brain Res. 2008;188:77–89. [PubMed]
  • Burr D, Alais D. Combining visual and auditory information. Prog Brain Res. 2006;155:243–58. [PubMed]
  • Bushara KO, Hanakawa T, Immisch I, Toma K, Kansaku K, Hallett M. Neural correlates of cross-modal binding. Nat Neurosci. 2003;6:190–5. [PubMed]
  • Bushara KO, Weeks RA, Ishii K, Catalan MJ, Tian B, Rauschecker JP, Hallett M. Modality-specific frontal and parietal areas for auditory and visual spatial localization in humans. Nat Neurosci. 1999;2:759–66. [PubMed]
  • Butters N, Samuels I, Goodglass H, Brody B. Short-term visual and auditory memory disorders after parietal and frontal lobe damage. Cortex. 1970;6:440–59. [PubMed]
  • Cavada C, Goldman-Rakic PS. Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. J Comp Neurol. 1989;287:422–45. [PubMed]
  • Chafee MV, Goldman-Rakic PS. Inactivation of parietal and prefrontal cortex reveals interdependence of neural activity during memory-guided saccades. J Neurophysiol. 2000;83:1550–1566. [PubMed]
  • Cohen YE, Andersen RA. Reaches to sounds encoded in an eye-centered reference frame. Neuron. 2000;27:647–52. [PubMed]
  • Cohen YE, Andersen RA. A common reference frame for movement plans in the posterior parietal cortex. Nat Rev Neurosci. 2002;3:553–62. [PubMed]
  • Cohen YE, Andersen RA. Multisensory representations of space in the posterior parietal cortex. In: Calvert GA, Spence C, Stein BE, editors. Handbook of Multisensory Integration. MIT Press; Boston: 2004a. pp. 463–482.
  • Cohen YE, Cohen IS, Gifford GW., III Modulation of LIP activity by predictive auditory and visual cues. Cereb Cortex. 2004b;14:1287–1301. [PubMed]
  • Colby CL, Goldberg ME. Space and attention in parietal cortex. Ann Rev Neurosci. 1999;22:319–49. [PubMed]
  • Connolly JD, Andersen RA, Goodale MA. FMRI evidence for a ‘parietal reach region’ in the human brain. Exp Brain Res. 2003;153:140–5. [PubMed]
  • Connolly JD, Goodale MA, Desouza JF, Menon RS, Vilis T. A comparison of frontoparietal fMRI activation during anti-saccades and anti-pointing. J Neurophysiol. 2000;84:1645–55. [PubMed]
  • Crottaz-Herbette S, Anagnoson RT, Menon V. Modality effects in verbal working memory: differential prefrontal and parietal responses to auditory and visual stimuli. Neuroimage. 2004;21:340–51. [PubMed]
  • Cusack R, Carlyon RP, Robertson IH. Neglect between but not within auditory objects. J Cogn Neurosci. 2000;12:1056–65. [PubMed]
  • Deneve S, Pouget A. Bayesian multisensory integration and cross-modal spatial links. J Physiol Paris. 2004;98:249–58. [PubMed]
  • Deouell LY, Soroker N. What is extinguished in auditory extinction? Neuroreport. 2000a:3059–3062. [PubMed]
  • Deouell LY, Hamalainen H, Bentin S. Unilateral neglect after right-hemisphere damage: contributions from event-related potentials. Audiol Neurootol. 2000b;5:225–34. [PubMed]
  • DeSouza JFX, Dukelow SP, Gati JS, Menon RS, Andersen RA, Vilis T. Eye position signal modulates a human parietal pointing region during memory-guided movements. J Neurosci. 2000;20:5835–5840. [PubMed]
  • Dinstein I, Gardner JL, Jazayeri M, Heeger DJ. Executed and observed movements have different distributed representations in human aIPS. J Neurosci. 2008;28:11231–9. [PMC free article] [PubMed]
  • Divac I, Lavail JH, Rakic P, Winston KR. Heterogenous afferents to the inferior parietal lobule of the rhesus monkey revealed by the retrograde transport method. Brain Res. 1977:197–201. [PubMed]
  • Driver J, Spence C. Crossmodal attention. Curr Opin Neurobiol. 1998;8:245–53. [PubMed]
  • Fendrich R, Corballis PM. The temporal cross-capture of audition and vision. Percept Psychophys. 2001;63:719–25. [PubMed]
  • Gibson JR, Maunsell JH. Sensory modality specificity of neural activity related to memory in visual cortex. J Neurophysiol. 1997;78:1263–75. [PubMed]
  • Gifford GW, III, Cohen YE. The effect of a central fixation light on auditory spatial responses in area LIP. J Neurophysiol. 2004;91:2929–2933. [PubMed]
  • Gnadt JW, Andersen RA. Memory related motor planning activity in posterior parietal cortex of macaque. Exp Brain Res. 1988;70:216–20. [PubMed]
  • Goldberg ME, Bisley J, Powell KD, Gottlieb J, Kusunoki M. The role of the lateral intraparietal area of the monkey in the generation of saccades and visuospatial attention. Ann NY Acad Sci. 2002;956:205–15. [PubMed]
  • Gottlieb J, Kusunoki M, Goldberg ME. Simultaneous representation of saccade targets and visual onsets in monkey lateral intraparietal area. Cereb Cortex. 2005;15:1198–206. [PMC free article] [PubMed]
  • Gottlieb JP, Kusunoki M, Goldberg ME. The representation of visual salience in monkey parietal cortex. Nature. 1998;391:481–4. [PubMed]
  • Griffiths TD, Rees G, Rees A, Green GGR, Witton C, Rowe D, Buchel C, Turner R, Frackowiak RSJ. Right parietal cortex is involved in the perception of sound movement in humans. Nat Neurosci. 1998;1:74–79. [PubMed]
  • Grunewald A, Linden JF, Andersen RA. Responses to auditory stimuli in macaque lateral intraparietal area. I. Effects of training. J Neurophysiol. 1999;82:330–42. [PubMed]
  • Hackett TA, Stepniewska I, Kaas JH. Prefrontal connections of the parabelt auditory cortex in macaque monkeys. Brain Res. 1999;817:45–58. [PubMed]
  • Haroush K, Deouell LY, Hochstein S. Momentary fluctuations in cross-modal attention - Evidence from electrophysiological and behavioral experiments. Society for Neuroscience; Washington, D.C.: 2008. Program No. 852.16 Online.
  • Harting JK, Huerta MF, Frankfurter AJ, Strominger NL, Royce GJ. Ascending pathways from the monkey superior colliculus: an autoradiographic analysis. J Comp Neurol. 1980;192:853–82. [PubMed]
  • Huettel SA, Stowe CJ, Gordon EM, Warner BT, Platt ML. Neural signatures of economic preferences for risk and ambiguity. Neuron. 2006;49:765–75. [PubMed]
  • Hyvärinen J. The Parietal Cortex of Monkey and Man. Springer-Verlag; Berlin: 1982.
  • Ipata AE, Gee AL, Bisley JW, Goldberg ME. Neurons in the lateral intraparietal area create a priority map by the combination of disparate signals. Exp Brain Res. 2008. [PMC free article] [PubMed]
  • Jancke L, Kleinschmidt A, Mirzazade S, Shah NJ, Freund HJ. The role of the inferior parietal cortex in linking tactile perception and manual construction of object shapes. Cereb Cortex. 2001;11:114–121. [PubMed]
  • Kaas JH, Hackett TA. Subdivisions of auditory cortex and levels of processing in primates. Audiol Neurootol. 1998;3:73–85. [PubMed]
  • Karabanov A, Blom O, Forsman L, Ullen F. The dorsal auditory pathway is involved in performance of both visual and auditory rhythms. Neuroimage. 2009;44:480–8. [PubMed]
  • Karnath HO, Ferber S, Himmelbach M. Spatial awareness is a function of the temporal not the posterior parietal lobe. Nature. 2001;411:950–3. [PubMed]
  • Kastner S, Ungerleider LG. Mechanisms of visual attention in the human cortex. Annu Rev Neurosci. 2000;23:315–341. [PubMed]
  • Kawashima R, Naitoh E, Matsumura M, Itoh H, Ono S, Satoh K, Gotoh R, Koyama M, Inoue K, Yoshioka S, Fukuda H. Topographic representation in human intraparietal sulcus of reaching and saccade. Neuroreport. 1996;7:1253–6. [PubMed]
  • Klein JT, Deaner RO, Platt ML. Neural correlates of social target value in macaque parietal cortex. Curr Biol. 2008;18:419–24. [PMC free article] [PubMed]
  • Kusunoki M, Gottlieb J, Goldberg ME. The lateral intraparietal area as a salience map: the representation of abrupt onset, stimulus motion, and task relevance. Vis Res. 2000;40:1459–68. [PubMed]
  • Leinonen L, Hyvärinen J, Sovijarvi AR. Functional properties of neurons in the temporo-parietal association cortex of awake monkey. Exp Brain Res. 1980;39:203–15. [PubMed]
  • Levy I, Schluppeck D, Heeger DJ, Glimcher PW. Specificity of human cortical areas for reaches and saccades. J Neurosci. 2007;27:4687–96. [PMC free article] [PubMed]
  • Linden JF, Grunewald A, Andersen RA. Responses to auditory stimuli in macaque lateral intraparietal area. II. Behavioral modulation. J Neurophysiol. 1999;82:343–58. [PubMed]
  • Luna B, Thulborn KR, Strojwas MH, McCurtain BJ, Berman RA, Genovese CR, Sweeney JA. Dorsal cortical regions subserving visually guided saccades in humans: an fMRI study. Cereb Cortex. 1998;8:40–7. [PubMed]
  • Lynch JC, Graybiel AM, Lobeck LJ. The differential projection of two cytoarchitectonic subregions of the inferior parietal lobule of macaque upon the deep layers of the superior colliculus. J Comp Neurol. 1985;235:241–54. [PubMed]
  • Ma WJ, Pouget A. Linking neurons to behavior in multisensory perception: a computational review. Brain Res. 2008;1242:4–12. [PubMed]
  • Mazzoni P, Bracewell RM, Barash S, Andersen RA. Spatially tuned auditory responses in area LIP of macaques performing delayed memory saccades to acoustic targets. J Neurophysiol. 1996;75:1233–41. [PubMed]
  • McCoy AN, Platt ML. Expectations and outcomes: decision-making in the primate brain. J Comp Physiol A Neuroethol Sens Neural Behav Physiol. 2005;191:201–11. [PubMed]
  • Mullette-Gillman OA, Cohen YE, Groh JM. Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus. J Neurophysiol. 2005;94:2331–2352. [PubMed]
  • Mullette-Gillman OA, Cohen YE, Groh JM. Motor-related signals in a hybrid reference frame in intraparietal cortex. Cereb Cortex. In press. [PMC free article] [PubMed]
  • Pandya DN, Kuypers HG. Cortico-cortical connections in the rhesus monkey. Brain Res. 1969;13:13–36. [PubMed]
  • Pandya DN, Yeterian EH. Architecture and connections of cortical association areas. In: Peters A, Jones EG, editors. Cerebral Cortex, Volume IV: Association and auditory cortices. Plenum Press; New York: 1985. pp. 3–61.
  • Petrides M, Pandya DN. Projections to the frontal cortex from the posterior parietal region in the rhesus monkey. J Comp Neurol. 1984;228:105–16. [PubMed]
  • Phan ML, Schendel KL, Recanzone GH, Robertson LC. Auditory and visual spatial localization deficits following bilateral parietal lobe lesions in a patient with Balint's syndrome. J Cogn Neurosci. 2000;12:583–600. [PubMed]
  • Platt ML, Glimcher PW. Responses of intraparietal neurons to saccadic targets and visual distractors. J Neurophysiol. 1997;78:1574–89. [PubMed]
  • Platt ML, Glimcher PW. Neural correlates of decision variables in parietal cortex. Nature. 1999;400:233–8. [PubMed]
  • Populin LC. Monkey sound localization: head-restrained versus head-unrestrained orienting. J Neurosci. 2006;26:9820–32. [PubMed]
  • Posner MI. Orienting of attention. Q J Exp Psychol. 1980;32:3–25. [PubMed]
  • Posner MI, Nissen MJ, Klein RM. Visual dominance: an information-processing account of its origins and significance. Psychol Rev. 1976;83:157–71. [PubMed]
  • Pouget A, Sejnowski TJ. Spatial transformations in the parietal cortex using basis functions. J Cogn Neurosci. 1997:222–237. [PubMed]
  • Powell KD, Goldberg ME. Response of neurons in the lateral intraparietal area to a distractor flashed during the delay period of a memory-guided saccade. J Neurophysiol. 2000;84:301–10. [PubMed]
  • Recanzone GH, Makhamra SD, Guard DC. Comparison of relative and absolute sound localization ability in humans. J Acoust Soc Am. 1998;103:1085–97. [PubMed]
  • Romanski LM, Tian B, Fritz J, Mishkin M, Goldman-Rakic PS, Rauschecker JP. Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex. Nat Neurosci. 1999;2:1131–6. [PMC free article] [PubMed]
  • Rushworth MF, Paus T, Sipila PK. Attention systems and the organization of the human parietal cortex. J Neurosci. 2001;21:5262–71. [PubMed]
  • Russ BE, Lee YS, Cohen YE. Neural and behavioral correlates of auditory categorization. Hear Res. 2007;229:204–212. [PubMed]
  • Russo GS, Bruce CJ. Frontal eye field activity preceding aurally guided saccades. J Neurophysiol. 1994;71:1250–3. [PubMed]
  • Sabes PN, Breznen B, Andersen RA. Parietal representation of object-based saccades. J Neurophysiol. 2002;88:1815–29. [PubMed]
  • Schall JD. On building a bridge between brain and behavior. Annu Rev Psychol. 2004;55:23–50. [PubMed]
  • Schall JD, Morel A, King DJ, Bullier J. Topography of visual cortex connections with frontal eye field in macaque: convergence and segregation of processing streams. J Neurosci. 1995;15:4464–87. [PubMed]
  • Schlack A, Sterbing-D'Angelo SJ, Hartung K, Hoffmann KP, Bremmer F. Multisensory space representations in the macaque ventral intraparietal area. J Neurosci. 2005;25:4616–25. [PubMed]
  • Schluppeck D, Glimcher P, Heeger DJ. Topographic organization for delayed saccades in human posterior parietal cortex. J Neurophysiol. 2005;94:1372–84. [PMC free article] [PubMed]
  • Schluppeck D, Curtis CE, Glimcher PW, Heeger DJ. Sustained activity in topographic areas of human posterior parietal cortex during memory-guided saccades. J Neurosci. 2006;26:5098–108. [PMC free article] [PubMed]
  • Seltzer B, Pandya DN. Post-rolandic cortical projections of the superior temporal sulcus in the rhesus monkey. J Comp Neurol. 1991;312:625–40. [PubMed]
  • Shadlen MN, Newsome WT. Neural basis of a perceptual decision in the parietal cortex (area LIP) of the rhesus monkey. J Neurophysiol. 2001;86:1916–36. [PubMed]
  • Shams L, Kamitani Y, Shimojo S. What you see is what you hear. Nature. 2000;408:788. [PubMed]
  • Silver MA, Ress D, Heeger DJ. Topographic maps of visual spatial attention in human parietal cortex. J Neurophysiol. 2005;94:1358–71. [PMC free article] [PubMed]
  • Snyder LH. Coordinate transformations for eye and arm movements in the brain. Curr Opin Neurobiol. 2000;10:747–754. [PubMed]
  • Snyder LH, Batista AP, Andersen RA. Intention-related activity in the posterior parietal cortex: a review. Vis Res. 2000;40:1433–41. [PubMed]
  • Snyder LH, Grieve KL, Brotchie P, Andersen RA. Separate body- and world-referenced representations of visual space in parietal cortex. Nature. 1998;394:887–91. [PubMed]
  • Spence C, Lloyd D, McGlone F, Nicholls ME, Driver J. Inhibition of return is supramodal: a demonstration between all possible pairings of vision, touch, and audition. Exp Brain Res. 2000;134:42–8. [PubMed]
  • Stanton GB, Bruce CJ, Goldberg ME. Topography of projections to posterior cortical areas from the macaque frontal eye fields. J Comp Neurol. 1995;353:291–305. [PubMed]
  • Stein BE, Meredith MA. The Merging of the Senses. MIT Press; Cambridge, MA: 1993.
  • Stricanne B, Andersen RA, Mazzoni P. Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. J Neurophysiol. 1996;76:2071–6. [PubMed]
  • Sugrue LP, Corrado GS, Newsome WT. Matching behavior and the representation of value in the parietal cortex. Science. 2004;304:1782–7. [PubMed]
  • Sugrue LP, Corrado GS, Newsome WT. Choosing the greater of two goods: neural currencies for valuation and decision making. Nat Rev Neurosci. 2005;6:363–75. [PubMed]
  • Tosoni A, Galati G, Romani GL, Corbetta M. Sensory-motor mechanisms in human parietal cortex underlie arbitrary visual decisions. Nat Neurosci. 2008;11:1446–1453. [PMC free article] [PubMed]
  • Toth LJ, Assad JA. Dynamic coding of behaviourally relevant stimuli in parietal cortex. Nature. 2002;415:165–8. [PubMed]
  • Ungerleider LG, Mishkin M. Two cortical visual systems. In: Ingle DJ, Goodale MA, Mansfield RJW, editors. Analysis of Visual Behavior. MIT Press; Cambridge, MA: 1982.
  • Vaadia E. Single-unit activity related to active localization of acoustic and visual stimuli in the frontal cortex of the rhesus monkey. Brain Behav Evol. 1989;33:127–31. [PubMed]
  • Warren JD, Zielinski BA, Green GG, Rauschecker JP, Griffiths TD. Perception of sound-source motion by the human brain. Neuron. 2002;34:139–48. [PubMed]
  • Welch RB, Warren DH. Immediate perceptual response to intersensory discrepancy. Psychological Bulletin. 1980;88:638–67. [PubMed]
  • Wightman FL, Kistler DJ. Sound Localization. In: Yost WA, Popper AN, Fay RR, editors. Human Psychophysics. Springer-Verlag; New York: 1993. pp. 155–192.
  • Wu W, Hatsopoulos N. Evidence against a single coordinate system representation in the motor cortex. Exp Brain Res. 2006;175:197–210. [PubMed]
  • Wu W, Hatsopoulos NG. Coordinate system representations of movement direction in the premotor cortex. Exp Brain Res. 2007;176:652–7. [PubMed]