Results 1-5 (5)

1.  Misperception of exocentric directions in auditory space 
Acta psychologica  2008;129(1):72-82.
Previous studies have demonstrated large errors (over 30°) in visually perceived exocentric directions (the direction between two objects that are both displaced from the observer’s location; e.g., Philbeck et al., in press). Here, we investigated whether a similar pattern occurs in auditory space. Blindfolded participants either attempted to aim a pointer at auditory targets (an exocentric task) or gave a verbal estimate of the egocentric target azimuth. Targets were located at 20° to 160° azimuth in the right hemispace. For comparison, we also collected pointing and verbal judgments for visual targets. We found that exocentric pointing responses exhibited sizeable undershooting errors, for both auditory and visual targets, that tended to become more strongly negative as azimuth increased (up to −19° for visual targets at 160°). Verbal estimates of the auditory and visual target azimuths, however, showed a dramatically different pattern, with relatively small overestimations of azimuths in the rear hemispace. At least some of the differences between verbal and pointing responses appear to be due to the frames of reference underlying the responses; when participants used the pointer to reproduce the egocentric target azimuth rather than the exocentric target direction relative to the pointer, the pattern of pointing errors more closely resembled that seen in verbal reports. These results show that there are similar distortions in perceiving exocentric directions in visual and auditory space.
PMCID: PMC2614239  PMID: 18555205
manual pointing; auditory space perception; perception/action; perceived direction; spatial cognition
2.  Linear Path Integration Deficits in Patients with Abnormal Vestibular Afference 
Seeing and perceiving  2012;25(2):155-178.
Effective navigation requires the ability to keep track of one’s location and maintain orientation during linear and angular displacements. Path integration is the process of updating the representation of body position by integrating internally-generated self-motion signals over time (e.g., walking in the dark). One major source of input to path integration is vestibular afference. We tested patients with reduced vestibular function (unilateral vestibular hypofunction, UVH), patients with aberrant vestibular function (benign paroxysmal positional vertigo, BPPV), and healthy participants (controls) on two linear path integration tasks: experimenter-guided walking and target-directed walking. The experimenter-guided walking task revealed a systematic underestimation of self-motion signals in UVH patients compared to the other groups. However, we did not find any difference in the distance walked between the UVH group and the control group for the target-directed walking task. Results from neuropsychological testing and clinical balance measures suggest that the errors in experimenter-guided walking were not attributable to cognitive and/or balance impairments. We conclude that impairment in linear path integration in UVH patients stems from deficits in self-motion perception. Importantly, our results also suggest that patients with a UVH deficit do not lose their ability to walk accurately without vision to a memorized target location.
PMCID: PMC4086033  PMID: 22726251
Vestibular navigation; vestibular hypofunction; path integration; spatial orientation
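As described in the abstract above, linear path integration amounts to accumulating self-motion (velocity) signals over time into a position estimate. A minimal sketch of that computation, not taken from the paper (the function names and the gain parameter are illustrative assumptions):

```python
# Hypothetical sketch: linear path integration as discrete integration
# of self-motion (velocity) samples over time.

def integrate_path(velocities, dt):
    """Estimate distance traveled by summing velocity samples over time."""
    distance = 0.0
    for v in velocities:
        distance += v * dt
    return distance

# A systematic underestimation of self-motion signals (as reported for
# UVH patients) can be modeled as a gain < 1 on each velocity sample.
def integrate_with_gain(velocities, dt, gain=1.0):
    """Integrate velocity samples scaled by a perceptual gain factor."""
    return sum(gain * v for v in velocities) * dt
```

Under this toy model, walking at 1 m/s for 5 s yields a veridical 5 m estimate, while a gain of 0.8 yields a 4 m estimate, i.e., an undershoot of the true displacement.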
3.  Medial Temporal Lobe Roles in Human Path Integration 
PLoS ONE  2014;9(5):e96583.
Path integration is a process in which observers derive their location by integrating self-motion signals along their locomotion trajectory. Although the medial temporal lobe (MTL) is thought to take part in path integration, the scope of its role in path integration remains unclear. To address this issue, we administered a variety of tasks involving path integration and other related processes to a group of neurosurgical patients whose MTL was unilaterally resected as therapy for epilepsy. These patients were unimpaired relative to neurologically intact controls in many tasks that required integration of various kinds of sensory self-motion information. However, the same patients (especially those who had lesions in the right hemisphere) walked farther than the controls when attempting to walk without vision to a previewed target. Importantly, this task was unique in our test battery in that it allowed participants to form a mental representation of the target location and anticipate their upcoming walking trajectory before they began moving. Thus, these results suggest that the role of MTL structures in human path integration may stem from their participation in predicting the consequences of one's locomotor actions. The strengths of this new theoretical viewpoint are discussed.
PMCID: PMC4011851  PMID: 24802000
4.  The Role of Spatial Memory and Frames of Reference in the Precision of Angular Path Integration 
Acta psychologica  2012;141(1):112-121.
Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatial memory is particularly likely in spatial updating tasks in which one’s self-location estimate is referenced to external space. To test this idea, we administered passive, nonvisual body rotations (ranging from 40° to 140°) about the yaw axis and asked participants to use verbal reports or open-loop manual pointing to indicate the magnitude of the rotation. Prior to some trials, previews of the surrounding environment were given. We found that when participants adopted an egocentric frame of reference, the previously-observed benefit of previews on within-subject response precision was not manifested, regardless of whether remembered spatial frameworks were derived from vision or spatial language. We conclude that the powerful effect of spatial memory is dependent on one’s frame of reference during self-motion updating.
PMCID: PMC3436123  PMID: 22885073
spatial memory; path integration; vestibular navigation; manual pointing; perception and action
5.  Non-sensory inputs to angular path integration 
Non-sensory (cognitive) inputs can play a powerful role in monitoring one’s self-motion. Previously, we showed that access to spatial memory dramatically increases response precision in an angular self-motion updating task [1]. Here, we examined whether spatial memory also enhances a particular type of self-motion updating – angular path integration. “Angular path integration” refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. It was hypothesized that remembered spatial frameworks derived from vision and spatial language should facilitate angular path integration by decreasing the uncertainty of self-location estimates. To test this, we implemented a whole-body rotation paradigm with passive, non-visual body rotations (ranging from 40° to 140°) administered about the yaw axis. Prior to the rotations, visual previews (Experiment 1) and verbal descriptions (Experiment 2) of the surrounding environment were given to participants. Perceived angular displacement was assessed by open-loop pointing to the origin (0°). We found that within-subject response precision significantly increased when participants were provided a spatial context prior to whole-body rotations. The present study goes beyond our previous findings by first establishing that memory of the environment enhances the processing of idiothetic self-motion signals. Moreover, we show that knowledge of one’s immediate environment, whether gained from direct visual perception or from indirect experience (i.e., spatial language), facilitates the integration of incoming self-motion signals.
PMCID: PMC2892260  PMID: 20448337
Spatial memory; path integration; vestibular navigation; manual pointing
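The angular path integration and within-subject precision concepts described in the abstracts above can be illustrated with a toy simulation. This sketch is not from the papers; the noise model, parameter values, and function names are illustrative assumptions (idiothetic yaw-rate samples corrupted by Gaussian noise, with precision measured as the spread of repeated estimates):

```python
import random
import statistics

# Hypothetical illustration: angular path integration as integration of
# noisy idiothetic yaw-rate samples; within-subject response precision
# is the spread (standard deviation) of repeated rotation estimates.

def estimate_rotation(true_rate_deg_s, duration_s, dt, noise_sd, rng):
    """Integrate noisy angular-velocity samples to estimate total rotation."""
    steps = int(duration_s / dt)
    return sum((true_rate_deg_s + rng.gauss(0.0, noise_sd)) * dt
               for _ in range(steps))

rng = random.Random(0)
# Simulate repeated trials of a 90° rotation (30°/s for 3 s).
estimates = [estimate_rotation(30.0, 3.0, 0.1, 5.0, rng) for _ in range(200)]
precision = statistics.stdev(estimates)  # lower spread = higher precision
```

In this framing, a spatial-memory benefit would correspond to a reduced spread of the estimates across trials (smaller `precision`), without necessarily changing their mean.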