When low (3–5 kHz) and high (7–9 kHz) frequency sounds of equal level occur simultaneously but originate from different locations, owls behave as though the sounds come from a single location. When the sound sources are separated in azimuth, owls tend to orient to the location of the low frequency sound; when they are separated in elevation, they tend to orient to the location of the high frequency sound. Thus, low frequency sounds dominate localization in azimuth, whereas high frequency sounds dominate localization in elevation.
The pattern of neural activity in the OT space map can explain these remarkable behavioral results. In response to two simultaneous low and high frequency sounds of approximately equal levels, the space map briefly represents the locations of both sounds, then shifts rapidly to a representation that heavily favors the dominant sound (the low frequency sound for sounds separated in azimuth and the high frequency sound for sounds separated in elevation) (). Thus, in response to spatially discrepant simultaneous sounds, the auditory space map rapidly and automatically suppresses the spatial representation of the subordinate stimulus and maintains the spatial representation of the dominant stimulus. The frequency-dependence of this effect indicates that the underlying mechanism must operate before the level of the OT, which receives inputs that are already broadly tuned to frequency. In the midbrain pathway that leads to the OT, the likely site is the ICX, where information converges across frequencies to create a map of space.
The dominance of the low frequency sound over the high frequency sound, observed for sounds separated in azimuth, was not due to low frequency masking of high frequency information. Low frequency spectral masking refers to the ability of a lower frequency sound to disrupt perception of a higher frequency sound. Our data cannot be explained by spectral masking because the frequency of the stimulus that dominated sound localization depended on whether the stimuli were separated in azimuth or elevation. Low frequency masking would predict low frequency dominance regardless of the direction of the spatial separation.
Most of the data are based on recordings from multiple units. It is likely that some units in the multiunit recordings were responding more to the high frequency stimulus and others to the low frequency stimulus. This likelihood notwithstanding, unit responses to high frequency sounds were suppressed by the presence of a low frequency sound for azimuth separations, and unit responses to low frequency sounds were suppressed by the presence of a high frequency sound for elevation separations. Multiunit recordings increase confidence that this remarkable phenomenon is a general property of the entire population of tectal units.
The stimulus location that owls orient toward behaviorally corresponds to the location represented in the space map more than 16 ms after sound onset ( and ). The transition from an initial representation of both stimuli to a differential representation of just one stimulus progresses during the first 8 ms of the neural response in the OT (). This implies that the owl's decision of where to orient, if based on OT activity, is determined by the pattern of neural activity more than 16 ms after sound onset, at least when the representation of stimulus location is shifting dynamically () due to conflicting spatial cues.
When the level of the subordinate sound is much greater than that of the dominant sound, the owl's orientation responses to the paired stimuli become variable and, in some cases, bimodally distributed (). Neural recordings from the OT space map exhibit a similar pattern. The bimodal distribution of late neural responses when the level of the high frequency sound greatly exceeds that of the low frequency sound (, lower right) indicates that, once the relative level of the subordinate sound increases sufficiently, the representation of the subordinate sound's location strengthens.
A multiplicative rule can explain localization cue dominance
A multiplicative rule for input integration would enhance responses when cues are mutually consistent and would suppress responses when cues are mutually contradictory. A multiplicative rule has been shown previously to operate in the ICX, the processing step before the OT. A multiplicative rule, applied to the neural population data ( and ), can account qualitatively for the dominance of low frequency cues in azimuth as well as for the dominance of high frequency cues in elevation. In azimuth, the low frequency sound drives no responses when the stimulus is more than 25° to the side of the RF center (; L30°). According to a multiplicative rule, an absence of low frequency input would cancel the effect of the high frequency input, as observed in responses to both sounds together (; black curve). In contrast, the high frequency sound continues to drive responses when the stimulus is located 25° or more to the side of the RF center (; R30°). According to a multiplicative rule, the continued high frequency input would enhance responses to the low frequency sound, as observed. Similarly, the data from the individual site shown in are consistent with a multiplicative rule operating on sub-threshold inputs that exhibit the same spatial patterns as those of the population data.
A multiplicative rule could also account for the dominance of high frequency cues when sounds are separated in elevation (). The high frequency stimulus did not drive responses when the sound was located 30° or more above or below the RF center (, +30°). According to a multiplicative rule, an absence of high frequency input would cancel responses to the low frequency input, as observed. In contrast, the low frequency sound continued to drive responses when the stimulus was more than 30° from the RF center (, −30°), and this continued drive would enhance responses to the high frequency sound, as observed.
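As a numerical illustration, the multiplicative rule can be sketched with toy tuning curves. The Gaussian shapes, widths, and the `gaussian_drive` helper below are illustrative assumptions, not the recorded data; the sketch shows only that the product cancels responses wherever one channel's drive vanishes and preserves them wherever both channels are driven:

```python
import numpy as np

# Hypothetical spatial drives of the low- and high-frequency input channels
# at three source azimuths relative to the RF center (deg). The Gaussian
# widths are made up for illustration; only the multiplicative combination
# reflects the proposed rule.
azimuths = np.array([-30.0, 0.0, 30.0])

def gaussian_drive(az, width):
    """Toy spatial tuning curve for one frequency channel."""
    return np.exp(-0.5 * (az / width) ** 2)

low = gaussian_drive(azimuths, width=12.0)    # narrower low-frequency azimuth tuning
high = gaussian_drive(azimuths, width=25.0)   # broader high-frequency azimuth tuning

combined = low * high                         # the multiplicative rule

# 30 deg from the RF center, the low-frequency drive is near zero, so the
# product cancels the still-substantial high-frequency drive: the low
# frequency sound dominates, as observed for azimuth separations.
```

The same arithmetic, with the roles of the two channels swapped, reproduces high frequency dominance for elevation separations.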
Relative weighting of low and high frequency cues
The information that is provided by a localization cue depends on its spatial resolution and on the spatial ambiguity in interpreting the cue. The spatial resolution of a cue depends both on the rate at which the cue's value changes with source location and on the ability of the auditory system to discriminate those values. The spatial ambiguity in interpreting cue values arises because a given cue value can be produced by sounds from many different locations.
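The resolution half of this argument reduces to a one-line relation: the smallest detectable change in source angle is roughly the just-noticeable difference (JND) in the cue divided by the cue's rate of change with angle. A minimal sketch, with purely illustrative numbers (not measured values):

```python
def angular_resolution(cue_jnd, cue_slope):
    """Smallest detectable change in source angle (deg).

    cue_jnd   -- just-noticeable difference in the cue (e.g. dB for ILD)
    cue_slope -- rate of change of the cue with angle (e.g. dB/deg)
    """
    return cue_jnd / cue_slope

# For a fixed JND, a cue whose value changes 3x faster with source angle
# (as reported for the high-frequency ILD cue in elevation) affords 3x
# finer spatial resolution.
coarse = angular_resolution(cue_jnd=1.0, cue_slope=0.1)  # 10 deg
fine = angular_resolution(cue_jnd=1.0, cue_slope=0.3)    # ~3.3 deg
```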
We propose that, for localizing sounds in elevation, high frequency cues dominate over low frequency cues because of the superior spatial resolution of the high frequency cues and the higher neural gain afforded to high frequency channels. In elevation, the acoustic data show a 3-fold higher rate of change of the ILD cue (dB/deg) for the high frequency sound than for the low frequency sound (s). In addition, in the brainstem nucleus that measures ILD, frequencies above 5 kHz are over-represented, and the ILD sensitivity of neurons tuned to frequencies above 5 kHz is greater than that of neurons tuned to lower frequencies. These factors are consistent with the sharper elevational tuning for the high frequency sound relative to the low frequency sound that we observed (). Another factor that favors the representation of the high frequency sound is that, on average, the high frequency sound drives nearly twice as many spikes as the low frequency sound does when each sound is presented alone (). The stronger response to the high frequency inputs may reflect the fact that only the higher frequencies carry high-resolution information about the elevation of the source. Because barn owls are aerial predators, information about the elevation of an auditory stimulus is essential for targeting prey.
Acoustic spatial cues generated by the low and high frequency sounds.
In contrast, both high spatial resolution and low spatial ambiguity favor the low frequency cues when localizing in azimuth. The rate of change of the high frequency IPD cue (radians/deg) is twice as large as that for the low frequency cue (s). However, the capacity of the auditory system to encode IPD declines sharply with increasing frequency. We found that the average azimuthal tuning for the high frequency sound was actually less sharp than that for the low frequency sound (width at half-max: high 32°±12°; ), implying that the decline in the auditory system's capacity to encode IPD at high frequencies outweighs the increase in the rate of change of IPD with sound azimuth. Moreover, the interpretation of the high frequency IPD cue is ambiguous even in frontal space, since equivalent IPD values correspond to different azimuths separated by about 50° (, matching colors). We propose, therefore, that for sound localization in azimuth, low frequency cues dominate over high frequency cues because of their superior spatial resolution and low spatial ambiguity.
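The ~50° ambiguity spacing can be checked with a back-of-envelope calculation: two azimuths whose ITDs differ by exactly one period of the tone produce identical IPDs. The ITD-vs-azimuth slope used below (2.5 µs/deg, roughly linear in frontal space) is an assumed value for illustration, not a measurement from this study:

```python
# Assumed microseconds of ITD per degree of azimuth in frontal space.
ITD_SLOPE_US_PER_DEG = 2.5

def ambiguous_separation_deg(freq_hz, itd_slope=ITD_SLOPE_US_PER_DEG):
    """Azimuth separation (deg) producing identical IPDs at a given frequency.

    One full cycle of interaural phase corresponds to one period of the
    tone, so sources whose ITDs differ by exactly one period cannot be
    distinguished by IPD alone.
    """
    period_us = 1e6 / freq_hz
    return period_us / itd_slope

print(ambiguous_separation_deg(8000.0))  # center of the 7-9 kHz band -> 50.0 deg
print(ambiguous_separation_deg(4000.0))  # low band -> 100 deg, beyond frontal space
```

Under these assumptions, the high frequency band is phase-ambiguous within frontal space while the low frequency band is not, consistent with the spatial-ambiguity argument above.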
The amplitude of the sound that provides a cue is another factor influencing that cue's contribution to the determination of stimulus location. As the relative level of a frequency band increases, the neural representation of the sound's location becomes progressively more influenced by the spatial information provided by that frequency band (). This neurophysiological effect could explain the shift in the distribution of behavioral responses that was observed when the amplitude of the subordinate (high frequency) sound was increased to well beyond that of the dominant sound ().
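The level dependence described above can be caricatured as a weighted combination of the locations signalled by the two bands. The linear weighting-by-level scheme below is an illustrative assumption, not a model fitted in this study; it shows only how raising the subordinate band's level pulls the combined estimate toward that band's location:

```python
def combined_location(loc_dom, loc_sub, level_dom, level_sub):
    """Toy level-weighted estimate of source azimuth (deg).

    loc_dom/loc_sub     -- locations signalled by the dominant and
                           subordinate frequency bands (deg)
    level_dom/level_sub -- relative levels of the two bands (assumed
                           linear weighting, for illustration only)
    """
    w_sub = level_sub / (level_dom + level_sub)
    return (1.0 - w_sub) * loc_dom + w_sub * loc_sub

# Equal levels: estimate midway between the two locations.
mid = combined_location(-20.0, 20.0, level_dom=1.0, level_sub=1.0)      # 0.0
# Subordinate band 3x louder: estimate pulled toward its location.
shifted = combined_location(-20.0, 20.0, level_dom=1.0, level_sub=3.0)  # 10.0
```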
In summary, the data indicate that when inferring the location of a sound source, the auditory system weights the information provided by different cues based on their relative spatial resolution, spatial ambiguity, and the relative amplitude of the sound that provided the cue.
The representation of multiple sound sources
In this study, we used multiple sound sources to create discrepant spatial cues. Previous studies have used multiple sounds for similar purposes. One group of studies employed the “precedence effect”, whereby, in response to slightly asynchronous sounds from different locations, the auditory system attributes the later sound to the location of the earlier sound. In this case, the auditory system groups the sounds and differentially weights the spatial cues provided by the earlier sound. Neurophysiological studies have revealed a strong suppression of the representation of the second sound in the auditory space map.
In other studies, paired, simultaneous sounds with identical waveforms were presented from different locations to produce a “phantom image” in the auditory space map, located between the locations of the two individual sounds. This example of the “stereo effect” is due to acoustic interactions and not to neural processing.
Studies most similar to ours involved presenting owls with simultaneous sounds from different locations, but with overlapping frequency spectra. Unlike in the stereo experiments, the waveform microstructure differed between the two sounds in these experiments. Under these conditions, the owl's auditory system represents the locations of both stimuli. This is because the auditory system evaluates IPD and ILD cues on a millisecond time-scale and, when there are two sources, the relative amplitude of each frequency component from each source varies on this time-scale. As a result, for any given frequency at any moment in time, one source tends to be represented preferentially and, over time, both sounds are represented. This suggests that the flickering of the represented IPD and ILD values between two sets of values within frequency channels on a millisecond time-scale is a reliable indicator of the presence of two sources at different locations. In our study, we used non-overlapping frequency bands, thereby eliminating this within-frequency indicator of multiple sources. In the absence of this indicator, spatial information from simultaneous sounds may be integrated according to the rules for cue dominance revealed in this study.
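The flicker argument can be illustrated with a toy simulation. The random envelopes below are stand-ins for the moment-to-moment amplitudes of two narrow-band sources (not real acoustics), and the winner-take-all rule within the band is an assumption made for illustration:

```python
import numpy as np

# Two sources sharing a frequency band, each signalling a different IPD.
rng = np.random.default_rng(0)
ipd_a, ipd_b = 0.4, -0.9          # the two sources' IPD values (rad)

env_a = rng.random(1000)          # fluctuating envelope of source A, ms by ms
env_b = rng.random(1000)          # fluctuating envelope of source B

# At each instant, the momentarily louder source dictates the within-band
# binaural cue, so the represented IPD flickers between the two values.
represented = np.where(env_a > env_b, ipd_a, ipd_b)

# Over time, both IPD values appear -- the within-frequency signature of
# two sources at different locations.
frac_a = np.mean(represented == ipd_a)
```

With non-overlapping bands, as used in our study, each band carries only one source's cue values and this flicker signature is absent.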
Comparison with human psychophysics
When humans are presented with simultaneous sounds with non-overlapping spectra from different azimuths, localization is biased toward the location of the low frequency sound, an effect reminiscent of the one we observed in barn owls. In humans, the results indicate that sound localization cues are weighted differentially according to the spatial resolution provided by each cue: the discriminability index (d′) of each cue was sufficient to predict quantitatively the rules of integration. In addition, spatial ambiguity may also influence the relative weightings of cues in humans, although the contribution of this factor has not been tested. For humans, spatial ambiguity is not a factor in interpreting IPD, because our auditory system does not measure IPDs for frequencies high enough to produce the same IPD value from different azimuths. Spatial ambiguity is a factor, however, in interpreting ILD cues: ILD cues follow complex spatial patterns, and the complexity of the spatial pattern increases with frequency. If spatial ambiguity contributes to the dominance hierarchy of cues for deriving sound source location in humans, as it appears to in owls, then lower frequencies should continue to dominate the localization of higher frequencies even above 4 kHz, the range in which ILD cues are most important for human localization. A similarity in the rules for the dominance hierarchy of sound localization cues in humans and owls suggests that, in both species, the auditory system infers the location of a sound by differentially weighting the highest resolution and least ambiguous cues.