Hear Res. Author manuscript; available in PMC 2010 November 22. PMCID: PMC2989528

Multisensory integration in auditory and auditory-related areas of cortex

Our sensory systems are bombarded by continuous streams of input from both correlated and uncorrelated sources, and these shape our perceptions of the world. In some contexts, the events acting on one sensory modality occur in relative isolation, but most often, they occur along with salient events involving one or more of the other modalities. In the auditory domain, for example, the sounds of speech are commonly accompanied by a visual representation of the speaker, who, in face-to-face communication, may also add tactile and even olfactory information to the experience. At other times, as when conversing over the telephone or listening to recorded music, auditory events are received in the absence of corresponding inputs from other sensory systems, which are likely engaged in the processing of uncorrelated visual or tactile events. These examples are typical of the myriad sensory events from which our nervous system extracts the meaningful information that is needed to guide behavior.

Considering the dynamic and remarkably rich sensory environment in which we are immersed, one of the problems facing sensory systems researchers concerns how multiple sensory inputs are integrated, perceived and used to guide behavior. The classic understanding of brain organization held that each of the unimodal sensory cortices passed information on to higher-order areas, which mediated the integration of this information. Within the last decade, however, it has become clear that even primary and secondary sensory areas in cortex receive substantial inputs from sources that carry information about events impacting other modalities. Auditory cortex is a case in point. In just a few years, in fact, the focus has shifted away from fundamental questions about whether these interactions actually occur in auditory cortex to an intensive effort, involving dozens of laboratories worldwide, to identify and describe the structural and functional mechanisms that are at work. Ironically, this all comes at a time when we are still struggling to determine how auditory cortex is organized and how even simple sounds are processed within each of its subdivisions.

With these questions in mind, we were asked by Jos Eggermont to edit this special issue of Hearing Research, which is centered on the structural and functional aspects of multisensory integration in auditory and auditory-related areas of cortex. Articles were solicited from investigators whose approaches to this subject have yielded important anatomical, neurophysiological, computational, and behavioral insights into the impact of non-auditory inputs on auditory cortical processing. Authors were asked to provide readers with a thorough review of their research area, and invited to briefly highlight new research findings relevant to the topic.

The first contribution is from Barry Stein, Terrence Stanford, and Benjamin Rowland, who provide important background with their review of the neural basis of multisensory integration in the superior colliculus. The investigation of the principles of multisensory integration, pioneered by Barry Stein and colleagues over many years of work on the midbrain, is foundational for investigations of those processes in the forebrain.
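
For readers less familiar with that body of work, the principles it established (spatial and temporal coincidence, and inverse effectiveness) are commonly quantified by comparing the multisensory response with the strongest unisensory response. The short sketch below illustrates that comparison; the function name and example firing rates are ours, chosen only for illustration, and are not taken from the article.

```python
# Illustrative sketch of the classic multisensory enhancement index:
# the multisensory response is compared with the strongest unisensory
# response. Variable names and example values are ours, not the authors'.

def enhancement_index(multisensory_rate, auditory_rate, visual_rate):
    """Percent change of the multisensory response relative to the
    best unisensory response (e.g., spikes/s or spike counts)."""
    best_unisensory = max(auditory_rate, visual_rate)
    if best_unisensory == 0:
        raise ValueError("best unisensory response must be non-zero")
    return 100.0 * (multisensory_rate - best_unisensory) / best_unisensory

# Inverse effectiveness: weakly effective unisensory stimuli tend to yield
# proportionally larger enhancement than strongly effective ones.
print(enhancement_index(12.0, 5.0, 4.0))    # weak inputs -> large enhancement (140%)
print(enhancement_index(55.0, 50.0, 40.0))  # strong inputs -> modest enhancement (10%)
```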

This is followed by contributions from three different teams of investigators engaged in anatomical research in several species to establish the structural bases of multisensory interactions in cortex. Eike Budinger and Henning Scheich review evidence of multisensory interactions in A1 of the rodent, with an emphasis on their own work in the Mongolian gerbil. They demonstrate inputs to A1 from the visual, somatosensory, and olfactory systems, and provide a detailed summary of the connections of auditory cortex in this species. Celine Cappe and Pascal Barone review anatomical and physiological studies in marmoset and macaque monkeys. They highlight connections between primary and secondary sensory cortical areas of the auditory, somatosensory, and visual modalities, along with a discussion of thalamocortical and corticothalamic interactions with multisensory nuclei in the thalamus. These results are discussed with respect to behavioral and electrophysiological studies in monkeys in which somatosensory and visual inputs have been shown to modulate activity in auditory cortex. John Smiley and Arnaud Falchier focus on the thalamic and cortical sources of non-auditory input to macaque monkey auditory cortex, including somatosensory, visual, association, and limbic regions. They also discuss apparent species differences in multisensory connectivity, especially in regard to circuits involving the primary areas. They relate these connections to physiological studies in which visual and tactile interactions have been documented.

Three articles examine neurophysiological and behavioral correlates of multisensory interactions in non-primate animal models. The paper by Juliane Krueger, David Royal, Matthew Fister, and Mark Wallace highlights studies of the spatial receptive field organization of neurons in the cat anterior ectosylvian sulcus and superior colliculus for auditory and visual stimuli. Although multisensory activity in the two regions is distinct and contributes to different functions, the architecture of unisensory and multisensory receptive fields is often very similar, suggesting a degree of continuity between the midbrain and some areas of cortex.

Jennifer Bizley and Andrew King review evidence of visual influences on signal processing in auditory cortex based on their work in the ferret. Populations of auditory, visual and bisensory neurons are located in all areas of ferret auditory cortex, but proportions vary between areas, as do the types of multisensory interactions that are observed. A key finding from their studies is that visual inputs to auditory cortex can serve to enhance processing of auditory signals. Alex Meredith, Brian Allman, Leslie Keniston, and Ruth Clemo follow up and greatly expand on the last theme by examining the impact of auditory activity on non-auditory cortex. They review evidence that auditory inputs to areas of another sensory modality are largely modulatory in nature, characterized by facilitative or suppressive effects. These effects may be relatively subtle in magnitude, but appear to provide meaningful adjustments to the ongoing activity of a given area.

Six papers discuss the underlying physiology and behavioral impact of multisensory processing based on findings in non-human primate models. Gabriella Musacchia and Charles Schroeder expand a theme raised in the anatomical papers (Budinger and Scheich, Cappe et al., Smiley and Falchier, this issue), emphasizing the potential importance of extralemniscal thalamic input for multisensory processing in auditory cortex. They connect this theme to timing factors that both enable and constrain multisensory interactions in non-human primates and then extrapolate these values to humans. They then expand the discussion of the fundamental distinction between driving and modulatory inputs in multisensory integration (Bizley et al., and Meredith et al., this issue), particularly as it relates to the level of the system at which integration occurs. They end by briefly exploring the idea of music as a venue for multisensory research. Christoph Kayser, Christopher Petkov, and Nikos Logothetis provide an elegant complement to the anatomical connectivity studies (Budinger and Scheich, Cappe et al., Smiley and Falchier, this issue), reviewing evidence from functional magnetic resonance imaging experiments in humans and monkeys that multisensory interactions are prominent at the earliest stages of auditory cortical processing, including primary areas. They also demonstrate the productivity of combining BOLD signal measurements, useful for identification of areas in which multisensory activity occurs, with electrophysiological investigations, which yield detailed information about such coding in single neurons or populations. Gregg Recanzone reviews studies that have utilized illusory percepts in order to better understand the integration of auditory and visual signals in human and nonhuman primates. In particular, he discusses results from human psychophysical experiments in which visual stimuli alter the perception of acoustic space (the ventriloquism effect), along with experiments in monkeys probing the underlying cortical mechanisms of this integration. He also considers similar psychophysical experiments in which auditory stimuli alter the perception of visual temporal processing. Yale Cohen reviews studies that have investigated auditory, visual, and auditory-visual interactions in the lateral intraparietal (LIP) area of monkeys, one of several areas in the posterior parietal cortex that receive auditory and visual input. Neuronal activity elicited by auditory and visual stimulation in LIP exhibits some correspondence and bimodal interactions, but auditory and visual signals do not impact behavior or neurophysiology in the same manner, indicating that their correspondence is not precise. He suggests that the main role of audition in the lateral intraparietal area may be to modulate the visual representation of space. Joost X. Maier and Jennifer M. Groh draw our attention to the fact that, at the input stage, vision and audition operate in different coordinate frames, and that integrating auditory and visual signals therefore requires several coordinate transformations. They review a number of studies that have focused on the neural basis of such transformations in the primate auditory system, with emphasis on the way auditory and visual information is used in guiding orienting movements.
Asif Ghazanfar presents evidence that primates link visual and auditory communication signals in a behavioral context, reviewing studies on the interactions between the processing of faces and voices in auditory cortex and the superior temporal sulcus. He also discusses how the auditory cortex may contribute through its connections with association areas, and the potential role that proprioceptive, somatosensory and motor inputs may play in vocal communication.
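
The reference-frame problem raised by Maier and Groh can be made concrete with a toy calculation: in the simplest, purely horizontal case, a head-centered sound location is remapped into eye-centered coordinates by subtracting the current eye-in-head position. The sketch below illustrates only that single step; it is not the authors' model, and the names and values are ours.

```python
# Toy illustration of the reference-frame problem: auditory targets are
# initially encoded relative to the head (interaural cues), visual targets
# relative to the eye (retinal location). To combine them, one signal must
# be remapped. Restricted to azimuth, in degrees; names are illustrative.

def head_to_eye_centered(sound_azimuth_head, eye_azimuth_head):
    """Remap a head-centered sound azimuth into eye-centered coordinates
    by subtracting the current horizontal eye-in-head position."""
    return sound_azimuth_head - eye_azimuth_head

# A sound 20 deg right of the head, with the eyes already deviated 15 deg
# to the right, lies only 5 deg right of the current line of sight.
print(head_to_eye_centered(20.0, 15.0))  # -> 5.0
```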

Five papers address the broad topic of multisensory integration in human subjects. Micah M. Murray and Lucas Spierer further expand on a theme raised in the nonhuman primate studies (Musacchia and Schroeder; Recanzone; this issue), that of auditory timing patterns across areas and their consequences for multisensory interactions in humans. They go on to review the dual-pathway model for the processing of auditory spatial and object information, the constraints this architecture imposes on multisensory interactions and on spatial and object processing by specific brain regions, and conclude by discussing open issues and directions for future research. Charles Spence and Valerio Santangelo note the rather surprising fact that it is not established whether multisensory cues are any more effective in capturing a person’s spatial attention than unimodal cues. They review the empirical literature on multisensory spatial cuing effects and highlight the important “attention-grabbing” advantages held by multisensory cues when the system is under conditions of high perceptual load. They end by considering the implications that this research has for the design of more effective warning signals in applied settings. Julien Besle, Olivier Bertrand, and Marie-Hélène Giard review the contribution of human electrophysiological techniques (EEG, sEEG, and MEG) to the study of visual influences on processing in auditory cortex. They highlight the value of the additive model as a conceptual tool in the study of audiovisual interaction and show the importance of considering the spatial distribution of effects. Their analysis points to the probable role of sensory, attentional and task-related factors in modulating audiovisual interactions in the auditory cortex. Nienke van Atteveldt, Alard Roebroeck, and Rainer Goebel review neuroimaging studies designed to elucidate how speech and script are associated in the adult “literate” brain, focusing on the role of different stimulus and task factors and effective connectivity between different brain regions. They show that modulation of auditory cortex is most strongly mediated by feedback from heteromodal areas in the superior temporal cortex, while not excluding direct influences from visual cortex. They conclude with several recommendations and a model to guide future research. Kirsten Hötting and Brigitte Röder consider the idea that increased use of the auditory system results in compensatory behavioral benefits in the blind. They detail some of the evidence that neural plasticity at several levels of the auditory processing stream, and in particular, experience-dependent reorganization of multisensory brain areas, might underlie these behavioral benefits.
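
The additive model highlighted by Besle and colleagues rests on a simple comparison: the response to a combined audiovisual stimulus is set against the sum of the responses to its unisensory components, and any reliable residual is taken as evidence of interaction. The sketch below illustrates that comparison on averaged evoked responses; the array shapes, variable names, and use of NumPy are our assumptions rather than the authors' analysis pipeline.

```python
import numpy as np

# Minimal sketch of the additive-model criterion for audiovisual interaction
# applied to event-related responses: interaction = AV - (A + V).
# erp_av, erp_a, erp_v are trial-averaged waveforms (channels x time samples);
# shapes and names are illustrative only.

def additive_model_residual(erp_av, erp_a, erp_v):
    """Return the deviation of the audiovisual response from the sum of the
    unisensory responses; values near zero are consistent with no interaction."""
    return erp_av - (erp_a + erp_v)

# Example with synthetic data: 32 channels, 200 time samples.
rng = np.random.default_rng(0)
erp_a = rng.normal(size=(32, 200))
erp_v = rng.normal(size=(32, 200))
erp_av = erp_a + erp_v + 0.2 * rng.normal(size=(32, 200))  # small interaction term
residual = additive_model_residual(erp_av, erp_a, erp_v)
print(residual.shape)  # (32, 200); the spatial distribution of the residual matters
```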

The articles in this issue represent a broad cross section of the efforts being brought to bear on the subject of multisensory processing in and around auditory cortex. The virtual explosion of recent evidence on this theme indicates that even at its first processing stage, auditory cortex is subject to profound non-auditory influences, and obviously, this raises more questions than it answers. The papers in this issue highlight a number of questions that may help to guide future experiments:

  1. How well does a particular non-auditory input in cortex conform to principles of multisensory integration established for neurons in the midbrain, and what specific type of behavioral function (e.g., orientation vs discrimination), or cognitive function (e.g., spatial vs object representation) is at play?
  2. What are the specific anatomical characteristics (feedforward, feedback, lateral) and origins (cortical vs thalamic) of the input, and what specific neuron populations does it contact?
  3. Does a non-auditory influence reflect a “driving” input (i.e., one that directly elicits action potentials) or a “modulatory” input (one that affects the probability that an auditory input will drive action potentials)? Otherwise stated, does the input inject a more complex, multidimensional element into auditory processing, or does it simply help us hear better? (A schematic contrast between the two is sketched after this list.)
  4. What are the dynamics (e.g., the relative timing of auditory and non-auditory inputs), and to what extent is the non-auditory input predictive of an auditory input?
  5. How does a particular non-auditory input respond to sensory perturbation (e.g., deafening), practice and over-training, and how important is attention in controlling the expression and impact of a non-auditory input?
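
The driving versus modulatory distinction in question 3 can be caricatured in a few lines of code: a driving input adds to the net drive and can evoke output on its own, whereas a modulatory input only scales the response to the auditory drive. The toy rate model below is purely illustrative; the multiplicative form of the modulation and all parameter values are our assumptions.

```python
# Toy firing-rate caricature of "driving" versus "modulatory" non-auditory input.
# A driving input adds to the net drive and can evoke output by itself;
# a modulatory input multiplicatively scales the response to the auditory drive
# but produces nothing on its own. Parameters are illustrative only.

def response(auditory_drive, driving_input=0.0, modulatory_gain=1.0, threshold=1.0):
    """Rectified-linear output rate for a combined drive."""
    net_drive = modulatory_gain * auditory_drive + driving_input
    return max(0.0, net_drive - threshold)

print(response(0.0, driving_input=2.0))    # driving input alone evokes output: 1.0
print(response(0.0, modulatory_gain=2.0))  # modulatory input alone evokes nothing: 0.0
print(response(1.5))                       # auditory drive alone: 0.5
print(response(1.5, modulatory_gain=2.0))  # modulation enhances the auditory response: 2.0
```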

Several of the papers emphasize the issue of what the auditory system has to offer (e.g., temporal resolution) and what it must borrow (e.g., spatial resolution) from vision and somatosensation. Going forward, it is worthwhile to keep this issue firmly in mind. It is likewise important to remember that in both vision and somatosensation, sensory stimulation is most often “acquired” as a result of eye and hand movements, and processing is thus enslaved to a motor plan, whereas in audition this is less often the case. This contrast again underscores the importance of attention for multisensory as well as unisensory aspects of auditory processing.

Contributor Information

Troy A. Hackett, Vanderbilt University School of Medicine, 301 Wilson Hall, 111 21st Avenue South, Nashville, TN 37203, USA, Tel.: +1 615 322 7491, troy.a.hackett@vanderbilt.edu.

Charles E. Schroeder, Columbia University, Cog. Neurosci. & Schizophrenia Program, 140 Old Orangeburg Road, Orangeburg, NY 10962, USA, Tel.: +1 914 398 6539; fax: +1 914 398 6545. schrod@nki.rfmh.org.