PMC

Results 1-5 (5)
 

1.  Action enhances auditory but not visual temporal sensitivity 
Psychonomic bulletin & review  2013;20(1):108-114.
People naturally dance to music, and research has shown that rhythmic auditory stimuli facilitate production of precisely timed body movements. If motor mechanisms are closely linked to auditory temporal processing, just as auditory temporal processing facilitates movement production, producing action might reciprocally enhance auditory temporal sensitivity. We tested this novel hypothesis with a standard temporal-bisection paradigm, in which the slope of the temporal-bisection function provides a measure of temporal sensitivity. The bisection slope for auditory time perception was steeper when participants initiated each auditory stimulus sequence via a keypress than when they passively heard each sequence, demonstrating that initiating action enhances auditory temporal sensitivity. This enhancement is specific to the auditory modality, because voluntarily initiating each sequence did not enhance visual temporal sensitivity. A control experiment ruled out the possibility that tactile sensation associated with a keypress increased auditory temporal sensitivity. Taken together, these results demonstrate a unique reciprocal relationship between auditory time perception and motor mechanisms. As auditory perception facilitates precisely timed movements, generating action enhances auditory temporal sensitivity.
doi:10.3758/s13423-012-0330-y
PMCID: PMC3558542  PMID: 23090750
Action; Auditory temporal sensitivity; Visual temporal sensitivity
2.  Object-based auditory facilitation of visual search for pictures and words with frequent and rare targets 
Acta psychologica  2010;137(2):252-259.
Auditory and visual processes demonstrably enhance each other based on spatial and temporal coincidence. Our recent results on visual search have shown that auditory signals also enhance visual salience of specific objects based on multimodal experience. For example, we tend to see an object (e.g., a cat) and simultaneously hear its characteristic sound (e.g., “meow”), to name an object when we see it, and to vocalize a word when we read it, but we do not tend to see a word (e.g., cat) and simultaneously hear the characteristic sound (e.g., “meow”) of the named object. If auditory-visual enhancements occur based on this pattern of experiential associations, playing a characteristic sound (e.g., “meow”) should facilitate visual search for the corresponding object (e.g., an image of a cat), hearing a name should facilitate visual search for both the corresponding object and corresponding word, but playing a characteristic sound should not facilitate visual search for the name of the corresponding object. Our present and prior results together confirmed these experiential-association predictions. We also recently showed that the underlying object-based auditory-visual interactions occur rapidly (within 220 ms) and guide initial saccades towards target objects. If object-based auditory-visual enhancements are automatic and persistent, an interesting application would be to use characteristic sounds to facilitate visual search when targets are rare, such as during baggage screening. Our participants searched for a gun among other objects when a gun was presented on only 10% of the trials. The search time was speeded when a gun sound was played on every trial (primarily on gun-absent trials); importantly, playing gun sounds facilitated both gun-present and gun-absent responses, suggesting that object-based auditory-visual enhancements persistently increase the detectability of guns rather than simply biasing gun-present responses. Thus, object-based auditory-visual interactions that derive from experiential associations rapidly and persistently increase visual salience of corresponding objects.
doi:10.1016/j.actpsy.2010.07.017
PMCID: PMC3010345  PMID: 20864070
3.  Characteristic sounds make you look at target objects more quickly 
Attention, perception & psychophysics  2010;72(7):1736-1741.
When you are looking for an object, does hearing its characteristic sound make you find it more quickly? Our recent results supported this possibility by demonstrating that when a cat target, for example, was presented among other objects, a simultaneously presented “meow” sound (containing no spatial information) reduced the manual response time for visual localization of the target. To extend these results, we determined how rapidly an object-specific auditory signal can facilitate target detection in visual search. On each trial, participants fixated a specified target object as quickly as possible. The target’s characteristic sound speeded the saccadic search time within 215–220 ms and also guided the initial saccade toward the target, compared to presentation of a distractor’s sound or to no sound. These results suggest that object-based auditory-visual interactions rapidly increase the target object’s salience in visual search.
doi:10.3758/APP.72.7.1736
PMCID: PMC3261720  PMID: 20952773
4.  Demand-based dynamic distribution of attention and monitoring of velocities during multiple-object tracking 
Journal of vision  2009;9(4):1.1-12.
The ability to track multiple moving objects with attention has been the focus of much research. However, the literature is relatively inconclusive regarding two key aspects of this ability: (1) whether the distribution of attention among the tracked targets is fixed during a period of tracking or is dynamically adjusted, and (2) whether motion information (direction and/or speed) is used to anticipate target locations even when velocities constantly change due to inter-object collisions. These questions were addressed by analyzing target-localization errors. Targets in crowded situations (i.e., those in danger of being lost) were localized more precisely than were uncrowded targets. Furthermore, the response vector (pointing from the target location to the reported location) was tuned to the direction of target motion, and observers with stronger direction tuning localized targets more precisely. Overall, our results provide evidence that multiple-object tracking mechanisms dynamically adjust the spatial distribution of attention in a demand-based manner (allocating more resources to targets in crowded situations) and utilize motion information (especially direction information) to anticipate target locations.
doi:10.1167/9.4.1
PMCID: PMC2756460  PMID: 19757910
attention; direction; localization; motion; multiple-object tracking; representational momentum; speed
5.  Characteristic sounds facilitate visual search 
Psychonomic bulletin & review  2008;15(3):548-554.
In a natural environment, objects that we look for often make characteristic sounds. A hiding cat may meow, or the keys in the cluttered drawer may jingle when moved. Using a visual search paradigm, we demonstrated that characteristic sounds facilitated visual localization of objects, even when the sounds carried no location information. For example, finding a cat was faster when participants heard a meow sound. In contrast, sounds had no effect when participants searched for names rather than pictures of objects. For example, hearing “meow” did not facilitate localization of the word cat. These results suggest that characteristic sounds cross-modally enhance visual (rather than conceptual) processing of the corresponding objects. Our behavioral demonstration of object-based cross-modal enhancement complements the extensive literature on space-based cross-modal interactions. When looking for your keys next time, you might want to play jingling sounds.
PMCID: PMC2647585  PMID: 18567253
