Results 1-3 (3)

1.  Altered integration of speech and gesture in children with autism spectrum disorders 
Brain and Behavior  2012;2(5):606-619.
The presence of gesture during speech has been shown to impact perception, comprehension, learning, and memory in normal adults and typically developing children. In neurotypical individuals, the impact of viewing co-speech gestures representing an object and/or action (i.e., iconic gesture) or speech rhythm (i.e., beat gesture) has also been observed at the neural level. Yet, despite growing evidence of delayed gesture development in children with autism spectrum disorders (ASD), few studies have examined how the brain processes multimodal communicative cues occurring during everyday communication in individuals with ASD. Here, we used a previously validated functional magnetic resonance imaging (fMRI) paradigm to examine the neural processing of co-speech beat gesture in children with ASD and matched controls. Consistent with prior observations in adults, typically developing children showed increased responses in right superior temporal gyrus and sulcus while listening to speech accompanied by beat gesture. Children with ASD, however, exhibited no significant modulatory effects in secondary auditory cortices for the presence of co-speech beat gesture. Rather, relative to their typically developing counterparts, children with ASD showed significantly greater activity in visual cortex while listening to speech accompanied by beat gesture. Importantly, the severity of their socio-communicative impairments correlated with activity in this region, such that the more impaired children demonstrated the greatest activity in visual areas while viewing co-speech beat gesture. These findings suggest that although the typically developing brain recognizes beat gesture as communicative and successfully integrates it with co-occurring speech, information from multiple sensory modalities is not effectively integrated during social communication in the autistic brain.
doi:10.1002/brb3.81
PMCID: PMC3489813  PMID: 23139906
Autism spectrum disorders; fMRI; gesture; language; superior temporal gyrus
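
The brain-behavior relationship reported in the abstract above (more severe socio-communicative impairment associated with greater visual-cortex activity) is, at its core, a correlation between one symptom-severity score and one activation estimate per child. The following is only a minimal sketch of that kind of analysis; the scores, beta values, and variable names are invented for illustration and are not the study's data.

    # Hypothetical illustration only: correlate per-child symptom severity
    # with per-child activation estimates (e.g., mean beta in a visual ROI).
    # All values below are invented, not taken from the study.
    from scipy.stats import pearsonr

    severity = [4, 6, 7, 9, 10, 12, 13, 15]                      # invented severity scores
    visual_roi_beta = [0.2, 0.3, 0.5, 0.4, 0.7, 0.8, 0.9, 1.1]   # invented mean betas per child

    r, p = pearsonr(severity, visual_roi_beta)
    print(f"r = {r:.2f}, p = {p:.3f}")  # positive r: more impaired -> more visual activity
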
2.  Dynamic Visuomotor Transformation Involved with Remote Flying of a Plane Utilizes the ‘Mirror Neuron’ System 
PLoS ONE  2012;7(4):e33873.
Brain regions involved in processing dynamic visuomotor representational transformation are investigated using fMRI. The perceptual-motor task involved flying (or observing) a plane through a simulated Red Bull Air Race course in first-person and third-person chase perspectives; the third-person perspective is akin to remote operation of a vehicle. The human ability to remotely operate vehicles likely has its roots in neural processes related to imitation, in which visuomotor transformation is necessary to interpret action goals in an egocentric manner suitable for execution. In this experiment, the visuomotor transformation required in the third-person perspective changed dynamically with the orientation of the plane. It was predicted that third-person remote flying, relative to first-person flying, would engage brain regions composing the ‘Mirror Neuron’ system, which is thought to be intimately involved in imitation during both execution and observation. Consistent with this prediction, differential brain activity was present for the third-person over the first-person perspective, for both execution and observation tasks, in left ventral premotor cortex, right dorsal premotor cortex, and bilateral inferior parietal lobule (the Mirror Neuron System) (behaviorally: 1st > 3rd). These regions additionally showed greater activity for flying (execution) than for watching (observation). Even though the visual and motor aspects of the tasks were controlled for, differential activity was also found in brain regions involved in tool use, motion perception, and body perspective, including left cerebellum, temporo-occipital regions, lateral occipital cortex, medial temporal region, and the extrastriate body area. This experiment demonstrates that a complex, real-world perceptual-motor task can be used to investigate visuomotor processing. This approach (Aviation Cerebral Experimental Sciences, ACES), which focuses on direct application to lab and field, contrasts with standard methodology in which tasks and conditions are reduced to their simplest forms, remote from daily life experience.
doi:10.1371/journal.pone.0033873
PMCID: PMC3335037  PMID: 22536320
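
The comparisons described in the abstract above (third- vs. first-person perspective, and flying vs. watching) are contrasts over condition estimates from a 2 x 2 design (perspective x task). The sketch below only illustrates how such contrast values are formed from per-condition beta estimates; the condition ordering and numbers are assumed for illustration, not taken from the study.

    # Hypothetical illustration only: contrasts over a 2 x 2 design
    # (perspective: 1st/3rd  x  task: fly/watch), one beta per condition.
    import numpy as np

    # Assumed condition order: [fly_1st, fly_3rd, watch_1st, watch_3rd]
    betas = np.array([0.8, 1.3, 0.4, 0.9])      # invented estimates for one region

    c_3rd_gt_1st = np.array([-1, 1, -1, 1])     # 3rd-person > 1st-person
    c_fly_gt_watch = np.array([1, 1, -1, -1])   # execution (fly) > observation (watch)

    print("3rd > 1st contrast value:", c_3rd_gt_1st @ betas)
    print("fly > watch contrast value:", c_fly_gt_watch @ betas)
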
3.  Giving Speech a Hand: Gesture Modulates Activity in Auditory Cortex During Speech Perception 
Human Brain Mapping  2009;30(3):1028-1037.
Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture – a fundamental type of hand gesture that marks speech prosody – might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously-produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., here activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
doi:10.1002/hbm.20565
PMCID: PMC2644740  PMID: 18412134
gestures; speech perception; auditory cortex; magnetic resonance imaging; nonverbal communication
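
The multisensory-integration criterion described in the abstract above for right planum temporale is a superadditivity test: the response to speech accompanied by beat gesture must exceed the sum of the responses to speech alone and gesture alone (AV > A + V). The following is only a minimal sketch of such a test over hypothetical per-subject response estimates for one region; the values and the use of a one-sided t-test are assumptions for illustration, not the study's analysis.

    # Hypothetical illustration only: superadditivity test (AV > A + V)
    # on per-subject response estimates for one region; values invented.
    import numpy as np
    from scipy.stats import ttest_1samp

    speech_gesture = np.array([1.9, 2.1, 2.4, 1.8, 2.2, 2.0])  # AV: speech + beat gesture
    speech_alone   = np.array([1.0, 1.1, 1.3, 0.9, 1.2, 1.0])  # A: speech alone
    gesture_alone  = np.array([0.4, 0.5, 0.6, 0.3, 0.5, 0.4])  # V: gesture alone

    interaction = speech_gesture - (speech_alone + gesture_alone)  # AV - (A + V)
    t, p = ttest_1samp(interaction, 0.0, alternative="greater")
    print(f"t = {t:.2f}, one-sided p = {p:.3f}")  # superadditive if reliably > 0
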
