To determine the areas involved in reorganization of language to the right hemisphere after early left hemisphere injury, we compared fMRI activation patterns during four production and comprehension tasks in post-surgical epilepsy patients with either left (LH) or right hemisphere (RH) speech dominance (determined by Wada testing) and healthy controls. Patient groups were carefully matched for IQ, lesion location and size. RH patients’ activation across all tasks was greatest in right hemisphere areas homotopic to areas activated by LH and control participants. Differences in right vs. left dominant hemisphere activation were limited to homologous areas typically activated by language tasks, supporting the hypothesis that language localization following transfer to the RH is the mirror-image of localization in the absence of transfer. The similarity of these findings to those in patients with larger, peri-sylvian lesions suggests that these areas in both hemispheres may be uniquely predisposed to subserve various language functions.
language reorganization; fMRI; epilepsy; Wada test; temporal lobectomy
The effect of exposure to the contextual features of the /pt/ cluster was investigated in native-English and native-Polish listeners using behavioral and event-related potential (ERP) methodology. Both groups experience the /pt/ cluster in their languages, but only the Polish group experiences the cluster in the word-onset context examined in the current experiment. The /st/ cluster was used as an experimental control. ERPs were recorded while participants identified the number of syllables in the second word of nonsense word pairs. The results showed that only Polish listeners accurately perceived the /pt/ cluster, and that perception was reflected within a late positive component of the ERP waveform. Furthermore, evidence of discrimination of /pt/ and /pǝt/ onsets in the neural signal was found even for non-native listeners who could not perceive the difference. These findings suggest that exposure to phoneme sequences in highly specific contexts may be necessary for accurate perception.
Speech perception; Native-language; Phonotactics; Event-related potentials; Late positive component; P3a and P3b; English–Polish; Consonant clusters; /pt/ Cluster; /st/ Cluster
The human capacity for processing speech is remarkable, especially given that information in speech unfolds over multiple time scales concurrently. Similarly notable is our ability to filter out extraneous sounds and focus our attention on one conversation, epitomized by the ‘Cocktail Party’ effect. Yet, the neural mechanisms underlying on-line speech decoding and attentional stream selection are not well understood. We review findings from behavioral and neurophysiological investigations that underscore the importance of the temporal structure of speech for achieving these perceptual feats. We discuss the hypothesis that entrainment of ambient neuronal oscillations to speech’s temporal structure, across multiple time-scales, serves to facilitate its decoding and underlies the selection of an attended speech stream over other competing input. In this regard, speech decoding and attentional stream selection are examples of ‘active sensing’, emphasizing an interaction between proactive and predictive top-down modulation of neuronal dynamics and bottom-up sensory input.
Patients with primary progressive aphasia (PPA) vary considerably in terms of which brain regions are impacted, as well as in the extent to which syntactic processing is impaired. Here we review the literature on the neural basis of syntactic deficits in PPA. Structural and functional imaging studies have most consistently associated syntactic deficits with damage to left inferior frontal cortex. Posterior perisylvian regions have been implicated in some studies. Damage to the superior longitudinal fasciculus, including its arcuate component, has been linked with syntactic deficits, even after gray matter atrophy is taken into account. These findings suggest that syntactic processing depends on left frontal and posterior perisylvian regions, as well as intact connectivity between them. In contrast, anterior temporal regions, and the ventral tracts that link frontal and temporal language regions, appear to be less important for syntax, since they are damaged in many PPA patients with spared syntactic processing.
syntax; primary progressive aphasia; voxel-based morphometry; functional MRI; diffusion tensor imaging
Recent evidence suggests that blindness enables visual circuits to contribute to language processing. We examined whether this dramatic functional plasticity has a sensitive period. BOLD fMRI signal was measured in congenitally blind, late blind (blindness onset 9-years-old or later) and sighted participants while they performed a sentence comprehension task. In a control condition, participants listened to backwards speech and made match/non-match to sample judgments. In both congenitally and late blind participants, BOLD signal increased in bilateral foveal-pericalcarine cortex during response preparation, irrespective of whether the stimulus was a sentence or backwards speech. However, only in congenitally blind people did left occipital areas (pericalcarine, extrastriate, fusiform and lateral) respond more to sentences than to backwards speech. We conclude that age of blindness onset constrains the non-visual functions of occipital cortex: while plasticity is present in both congenitally and late blind individuals, recruitment of visual circuits for language depends on blindness during childhood.
plasticity; development; sensitive-period; critical-period; language evolution; visual cortex; blind; sentence comprehension; foveal; pericalcarine
• Co-variation in lateralization during word reading dissociated three subsystems.
• Posterior ventral occipito-temporal cortex (vOT) with precentral gyrus.
• Middle vOT with pars opercularis, pars triangularis and supramarginal gyrus.
• Anterior vOT with pars orbitalis, middle frontal gyrus and thalamus.
The ventral occipitotemporal cortex (vOT) sustains strong interactions with the inferior frontal cortex during word processing. Consequently, activation in both regions co-lateralizes towards the same hemisphere in healthy subjects. Because the determinants of lateralisation differ across posterior, middle and anterior vOT subregions, we investigated whether lateralisation in different inferior frontal regions would co-vary with lateralisation in the three different vOT subregions. A whole brain analysis found that, during semantic decisions on written words, laterality covaried in (1) posterior vOT and the precentral gyrus; (2) middle vOT and the pars opercularis, pars triangularis, and supramarginal gyrus; and (3) anterior vOT and the pars orbitalis, middle frontal gyrus and thalamus. These findings increase the spatial resolution of our understanding of how vOT interacts with other brain areas during semantic categorisation on words.
Functional MRI; Language; Word processing; Semantic matching; Laterality index; Left lateralization; Inter-subject variability; Language subsystems
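Laterality indices of the kind listed in the keywords above are conventionally computed as a normalized left-right contrast over activation measures (e.g., suprathreshold voxel counts or summed beta weights per hemisphere). A minimal sketch of the standard formula; the function name and inputs are illustrative, not taken from the study:

```python
def laterality_index(left_activation: float, right_activation: float) -> float:
    """Standard laterality index: (L - R) / (L + R).

    Returns +1 for fully left-lateralized activation, -1 for fully
    right-lateralized, and 0 for perfectly bilateral activation.
    """
    total = left_activation + right_activation
    if total == 0:
        raise ValueError("no suprathreshold activation in either hemisphere")
    return (left_activation - right_activation) / total
```

With hypothetical voxel counts of 80 (left) and 20 (right), the index is 0.6, i.e., left-lateralized; equal counts yield 0.0 (bilateral).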
• MMN response dissociates lexical access and combinatorial processing of compounds.
• Compound lexical frequency and meaning transparency affect the MMN.
• Larger MMN for high vs. low-frequency opaque compounds.
• No MMN frequency effects for transparent compounds or differences to pseudo-compounds.
• Results support a parallel dual-route account of compound word processing.
Are compound words represented as unitary lexical units, or as individual constituents that are processed combinatorially? We investigated the neuro-cognitive processing of compounds using EEG and a passive-listening oddball design in which lexical access and combinatorial processing elicit dissociating Mismatch Negativity (MMN) brain-response patterns. MMN amplitude varied with compound frequency and semantic transparency (the clarity of the relationship between compound and constituent meanings). Opaque compounds elicited an enhanced ‘lexical’ MMN for high- vs. low-frequency compounds, reflecting stronger lexical representations. Transparent compounds showed no frequency effect, nor did they differ from pseudo-compounds, reflecting the combination of a reduced ‘syntactic’ MMN indexing combinatorial links and an enhanced ‘lexical’ MMN for real-word compounds compared to pseudo-compounds. We argue that transparent compounds are processed combinatorially alongside parallel lexical access of the whole-form representation, but whole-form access is the dominant mechanism for opaque compounds, particularly those of high frequency. Results support a flexible dual-route account of compound processing.
Compounds; Dual-route; Speech; Language; ERPs; MMN
Many differences in brain activity have been reported between persons who stutter (PWS) and typically fluent controls during oral reading tasks. An earlier meta-analysis of imaging studies identified stutter-related regions, but recent studies report less agreement with those regions. We report a PET study of adult dextral PWS (n = 18) and matched fluent controls (CONT, n = 12) that used both oral reading and monologue tasks. After correcting for speech rate differences between the groups, the task-activation differences were surprisingly small. For both analyses, only some regions previously considered stutter-related were more activated in the PWS group than in the CONT group, and these were also activated during eyes-closed rest (ECR). In the PWS group, stuttering frequency was correlated with cortico-striatal-thalamic circuit activity in both speaking tasks. The neuroimaging findings for the PWS group, relative to the CONT group, appear consistent with neuroanatomic abnormalities being increasingly reported among PWS.
stuttering; oral reading; monologue; brain imaging; PET
• Native and non-native English speakers can visually discriminate English from an unknown language.
• Viewing known speech excites the articulatory motor cortex more than unknown speech.
• Viewing known speech excites the articulatory motor cortex more than non-speech mouth movements.
• Motor excitability is high during observation of a face not speaking.
• Motor excitability does not differ between native and non-native speakers.
It is possible to comprehend speech and discriminate languages by viewing a speaker’s articulatory movements. Transcranial magnetic stimulation studies have shown that viewing speech enhances excitability in the articulatory motor cortex. Here, we investigated the specificity of this enhanced motor excitability in native and non-native speakers of English. Both groups were able to discriminate between speech movements related to a known (i.e., English) and unknown (i.e., Hebrew) language. The motor excitability was higher during observation of a known language than an unknown language or non-speech mouth movements, suggesting that motor resonance is enhanced specifically during observation of mouth movements that convey linguistic information. Surprisingly, however, the excitability was equally high during observation of a static face. Moreover, the motor excitability did not differ between native and non-native speakers. These findings suggest that the articulatory motor cortex processes several kinds of visual cues during speech communication.
Bilingualism; Lipreading; Motor cortex; Action observation; Motor evoked potentials; Social cognition; Speech; Speechreading; Transcranial magnetic stimulation
Current accounts of spoken language assume the existence of a lexicon where wordforms are stored and interact during spoken language perception, understanding and production. Despite the theoretical importance of the wordform lexicon, the exact localization and function of the lexicon in the broader context of language use is not well understood. This review draws on evidence from aphasia, functional imaging, neuroanatomy, laboratory phonology and behavioral results to argue for the existence of parallel lexica that facilitate different processes in the dorsal and ventral speech pathways. The dorsal lexicon, localized in the inferior parietal region including the supramarginal gyrus, serves as an interface between phonetic and articulatory representations. The ventral lexicon, localized in the posterior superior temporal sulcus and middle temporal gyrus, serves as an interface between phonetic and semantic representations. In addition to their interface roles, the two lexica contribute to the robustness of speech processing.
lexicon; language; spoken word recognition; lexical access; speech perception; speech production; neuroimaging; aphasia; dual stream model; localization
A limited number of studies have investigated language in Huntington's disease (HD). These have generally reported abnormalities in rule-governed (grammatical) aspects of language, in both syntax and morphology. Several studies of verbal inflectional morphology in English and French have reported evidence of over-active rule processing, such as over-suffixation errors (e.g., walkeded) and over-regularizations (e.g., digged). Here we extend the investigation to noun inflection in Hungarian, a Finno-Ugric agglutinative language with complex morphology, and to genetically proven pre-symptomatic Huntington's disease (pre-HD). Although individuals with pre-HD have no clinical, motor or cognitive symptoms, the underlying pathology may already have begun, and thus sensitive behavioral measures might reveal already-present impairments. Indeed, in a Hungarian morphology production task, pre-HD patients made both over-suffixation and over-regularization errors. The findings suggest the generality of over-active rule processing in both HD and pre-HD, across languages from different families with different morphological systems, and for both verbal and noun inflection. Because the neuropathology in pre-HD appears to be largely restricted to the caudate nucleus and related structures, the findings further implicate these structures in language, and in rule-processing in particular. Finally, the need for effective treatments in HD, which will likely depend in part on the ability to sensitively measure early changes in the disease, suggests the possibility that inflectional morphology, and perhaps other language measures, may provide useful diagnostic, tracking, and therapeutic tools for assessing and treating early degeneration in pre-HD and HD.
pre-symptomatic Huntington's disease; basal ganglia; caudate nucleus; language production; regular and irregular morphology; Hungarian
Despite growing evidence that young adults neurally pre-activate word features during sentence comprehension, less clear is the degree to which this generalizes to older adults. Using ERPs, we tested for linguistic prediction in younger and older readers by means of indefinite articles (a’s and an’s) preceding more and less probable noun continuations. Although both groups exhibited cloze probability-graded noun N400s, only the young showed significant article effects, indicating probabilistic sensitivity to the phonology of anticipated upcoming nouns. Additionally, both age groups exhibited prolonged increased frontal positivities to less probable nouns, although in older adults this effect was prominent only in a subset with high verbal fluency (VF). This ERP positivity to contextual constraint violations offers additional support for prediction in the young. For high VF older adults, the positivity may indicate that they, too, engage in some form of linguistic pre-processing when implicitly cued, as may have occurred via the articles.
Aging; Language; Comprehension; Prediction; Event-related brain potentials; N400; Frontal positivity; Verbal fluency; Implicit cueing; Executive processes
In a neuroimaging study focusing on young bilinguals, we explored the brains of bilingual and monolingual babies across two age groups (younger, 4–6 months; older, 10–12 months), using fNIRS in a new event-related design, as babies processed linguistic phonetic (Native English, Non-Native Hindi) and nonlinguistic Tone stimuli. We found that phonetic processing in bilingual and monolingual babies is accomplished with the same language-specific brain areas classically observed in adults, including the left superior temporal gyrus (associated with phonetic processing) and the left inferior frontal cortex (associated with the search and retrieval of information about meanings, and with syntactic and phonological patterning), with intriguing developmental timing differences: left superior temporal gyrus activation was observed early and remained stably active over time, while the left inferior frontal cortex showed a greater increase in neural activation in older babies, notably at the precise age when babies enter the universal first-word milestone, thus revealing a first-time focal brain correlate that may mediate a universal behavioral milestone in early human language acquisition. A difference was observed in the older bilingual babies’ resilient neural and behavioral sensitivity to Non-Native phonetic contrasts at a time when monolingual babies can no longer make such discriminations. We advance the “Perceptual Wedge Hypothesis” as one possible explanation for how exposure to more than one language may alter neural and language processing in ways that we suggest are advantageous to language users. The brains of bilinguals and multilinguals may provide the most powerful window into the full neural “extent and variability” that our human species’ language processing brain areas could potentially achieve.
fNIRS; bilingualism; infant phonetic processing; Perceptual Wedge Hypothesis; brain development; language acquisition; Broca's Area; STG; LIFC
The laterality difference in the occipitotemporal region between Chinese (bilaterality) and alphabetic languages (left laterality) has been attributed to their difference in visual appearance. However, these languages also differ in orthographic transparency. To disentangle the effect of orthographic transparency from visual appearance, we trained subjects to read the same artificial script either as an alphabetic (i.e., transparent orthography) or a logographic (i.e., nontransparent orthography) language. Consistent with our previous results, both types of phonological training enhanced activations in the left fusiform gyrus. More interestingly, the laterality in the fusiform gyrus (especially the posterior region) was modulated by the orthographic transparency of the artificial script (more left-lateralized activation after alphabetic training than after logographic training). These results provide an alternative account (i.e., orthographic transparency) for the laterality difference between Chinese and alphabetic languages, and may have important implications for the role of the fusiform in reading.
Reading; Fusiform laterality; Orthographic transparency; Language learning; fMRI
Concern for the impact of prenatal cocaine exposure (PCE) on human language development is based on observations of impaired performance on assessments of language skills in these children relative to non-exposed children. We investigated the effects of PCE on speech processing ability using event-related potentials (ERPs) among a sample of adolescents followed prospectively since birth. This study presents findings regarding cortical functioning in 107 prenatally cocaine-exposed (PCE) and 46 non-drug-exposed (NDE) 13-year-old adolescents. PCE and NDE groups differed in processing of auditorily presented non-words at very early sensory/phonemic processing components (N1/P2), in somewhat higher-level phonological processing components (N2), and in late high-level linguistic/memory components (P600). These findings suggest that children with PCE have atypical neural responses to spoken language stimuli during low-level phonological processing and at a later stage of processing of spoken stimuli.
prenatal cocaine exposure; adolescence; risk; event-related potential
Lexical-semantic knowledge is a core language component that undergoes prolonged development throughout childhood and is therefore highly amenable to developmental studies. Most previous lexical-semantic functional MRI (fMRI) studies have been limited to single-word or word-pair tasks, outside a sentence context. Our objective was to investigate the development of lexical-semantic language networks in typically developing children using a more ecological sentence-embedded semantic task that permitted performance monitoring while minimizing head movement by avoiding overt speech. Sixteen adults and 23 children completed two fMRI runs of an auditory lexical-semantic decision task with a button-press response, using reverse speech as a control condition. Children and adults showed similar activation in bilateral temporal and left inferior frontal regions. Greater activation in adults than in children was seen in left inferior parietal, premotor, and inferior frontal regions, and in bilateral supplementary motor area (SMA). Specifically for semantically incongruous sentences, adults also showed greater activation than children in left inferior frontal cortex, possibly related to enhanced top-down control. Age-dependent activation increases in motor-related regions were shown to be unrelated to overt motor responses, but could be associated with covert speech accompanying semantic decision. Unlike previous studies, age-dependent differences were detected neither in posterior sensory cortices (such as extrastriate cortex) nor in the middle temporal gyrus.
A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a “frame” (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a “last item” belonging to one of four categories: a high-cloze-probability sign (a “semantically reasonable” completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a “semantically odd” completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity.
sign language; ASL; ERP; N400; deaf; pseudo-word; grooming gesture
Schemas are abstract nonverbal representations that parsimoniously depict spatial relations. Despite their ubiquitous use in maps and diagrams, little is known about their neural instantiation. We sought to determine the extent to which schematic representations are neurally distinguished from language on the one hand, and from rich perceptual representations on the other. In patients with either left hemisphere damage or right hemisphere damage, a battery of matching tasks depicting categorical spatial relations was used to probe for the comprehension of basic spatial concepts across distinct representational formats (words, pictures, and schemas). Left hemisphere patients underperformed right hemisphere patients across all tasks. However, focused residual analyses using VLSM (voxel-based lesion-symptom mapping) suggest that (1) left hemisphere deficits in the representation of categorical spatial relations are difficult to distinguish from deficits in naming these relations and (2) the right hemisphere plays a special role in extracting schematic representations from richly textured pictures.
semantics; spatial cognition; hemispheric specialization; lesion studies
Few studies have examined connected speech in demented and non-demented patients with Parkinson’s disease (PD). We assessed the speech production of 35 patients with Lewy body spectrum disorder (LBSD), including non-demented PD patients, patients with PD dementia (PDD), and patients with dementia with Lewy bodies (DLB), in a semi-structured narrative speech sample in order to characterize impairments of speech fluency and to determine the factors contributing to reduced speech fluency in these patients. Both demented and non-demented PD patients exhibited reduced speech fluency, characterized by reduced overall speech rate and long pauses between sentences. Reduced speech rate in LBSD correlated with measures of between-utterance pauses, executive functioning, and grammatical comprehension. Regression analyses related non-fluent speech, grammatical difficulty, and executive difficulty to atrophy in frontal brain regions. These findings indicate that multiple factors contribute to slowed speech in LBSD, and this is mediated in part by disease in frontal brain regions.
Parkinson’s disease; speech; language; fluency; dementia with Lewy bodies
This study examined the time course of object naming in 21 individuals with primary progressive aphasia (PPA) (8 agrammatic (PPA-G); 13 logopenic (PPA-L)) and healthy age-matched speakers (n=17) using a semantic interference paradigm with related and unrelated interfering stimuli presented at stimulus onset asynchronies (SOAs) of −1000, −500, −100 and 0 ms. Results showed semantic interference (SI) (i.e. significantly slower RTs in related compared to unrelated conditions) for all groups at −500, −100 and 0 ms, indicating timely spreading activation to semantic competitors. However, both PPA groups showed a greater magnitude of SI than normal across SOAs. The PPA-L group and six PPA-G participants also evinced SI at −1000 ms, suggesting an abnormal time course of semantic interference resolution, and concomitant left hemisphere cortical atrophy in brain regions associated with semantic processing. These subtle semantic mapping impairments in non-semantic variants of PPA may contribute to the anomia of these patients.
primary progressive aphasia; semantic interference; word interference paradigms; naming deficits in primary progressive aphasia; FreeSurfer; cortical thickness
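The semantic interference effect described above is a simple reaction-time difference: mean naming latency with related distractors minus mean latency with unrelated distractors, per participant and SOA. A minimal sketch of that computation, with hypothetical RT values for illustration:

```python
def semantic_interference(rts_related_ms, rts_unrelated_ms):
    """Semantic interference (SI) in ms: positive values mean naming was
    slower with semantically related distractors than with unrelated ones."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rts_related_ms) - mean(rts_unrelated_ms)

# Hypothetical single-participant data at one SOA (ms):
si = semantic_interference([920, 980, 940], [880, 900, 860])
```

Here the hypothetical related-condition mean (946.67 ms) exceeds the unrelated mean (880 ms), giving an SI of about 67 ms; in the study, a greater-than-normal SI magnitude is what distinguished the PPA groups from controls.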
The hypothesized role of Broca’s area in sentence processing ranges from domain-general executive function to domain-specific computation that is specific to certain syntactic structures. We examined this issue by manipulating syntactic structure and conflict between syntactic and semantic cues in a sentence processing task. Functional neuroimaging revealed that activation within several Broca’s area regions of interest reflected the parametric variation in syntactic-semantic conflict. These results suggest that Broca’s area supports sentence processing by mediating between multiple incompatible constraints on sentence interpretation, consistent with this area’s well-known role in conflict resolution in other linguistic and non-linguistic tasks.
language; syntax; semantics; left inferior frontal cortex; executive function; Broca’s area; sentence processing
Conceptual metaphor theory suggests that knowledge is structured around metaphorical mappings derived from physical experience. Segregated processing of object properties in sensory cortex allows testing of the hypothesis that metaphor processing recruits activity in domain-specific sensory cortex. Using functional magnetic resonance imaging (fMRI) we show that texture-selective somatosensory cortex in the parietal operculum is activated when processing sentences containing textural metaphors, compared to literal sentences matched for meaning. This finding supports the idea that comprehension of metaphors is perceptually grounded.
fMRI; parietal operculum; grounded cognition; tactile
The Parallel Distributed Processing (PDP) framework has significant potential for producing models of cognitive tasks that approximate how the brain performs the same tasks. To date, however, there has been relatively little contact between PDP modeling and data from cognitive neuroscience. In an attempt to advance the relationship between explicit, computational models and physiological data collected during the performance of cognitive tasks, we developed a PDP model of visual word recognition which simulates key results from the ERP reading literature, while simultaneously being able to successfully perform lexical decision—a benchmark task for reading models. Simulations reveal that the model’s success depends on the implementation of several neurally plausible features in its architecture which are sufficiently domain-general to be relevant to cognitive modeling more generally.
Computational Modeling; Parallel Distributed Processing; Event-Related Potentials; N400; Visual Word Recognition
Some situations require one to quickly stop an initiated response. Recent evidence suggests that rapid stopping engages a mechanism that has diffuse effects on the motor system. For example, stopping the hand dampens the excitability of the task-irrelevant leg. However, it is unclear whether this ‘global suppression’ could apply across wider motor modalities. Here we tested whether stopping speech leads to suppression of the task-irrelevant hand. We used Transcranial Magnetic Stimulation over the primary motor cortex with concurrent electromyography from the hand. We found that when speech was successfully stopped, the motor evoked potential from the task-irrelevant hand was significantly reduced compared to trials on which the participant failed to stop speaking, to trials without a stop signal, and to baseline. This shows that when speech is quickly stopped, there is a broad suppression across the motor system. This has implications for the neural basis of speech control and stuttering.
inhibitory control; speech; motor evoked potential; stop signal task