Cogn Neurodyn. 2009 December; 3(4): 295–296.
Published online 2009 September 4. doi: 10.1007/s11571-009-9091-3
PMCID: PMC2777193

Editorial

In order, then, that it [i.e. the human’s head] should not go rolling upon the earth, which has all manner of heights and hollows, and be at a loss how to climb over the one and climb out of the other, they bestowed upon it the body as a vehicle and means of transport. [...]. (Plato, Timaios, 44e)

By introducing this special issue of Cognitive Neurodynamics on “Language Dynamics” with a seemingly facetious quotation from the ancient Greek philosopher Plato (428–347 B.C.; Plato 1925), we would like to draw the reader’s attention to the dynamical systems account of cognition (Beer 2000; van Gelder 1998) that became very popular during the last decade of interdisciplinary research. In this approach, which nowadays covers connectionist models, dynamic field theory, dynamic cognitive modeling, and also symbolic dynamical systems such as dynamic syntax, Bayesian update semantics, and symbolic dynamics, the mind is regarded as a dynamical system that performs cognitive computations by transiently exploring an abstract state space. From this point of view, a cognitive task corresponds to a set of initial conditions in state space that is prepared before a computation is carried out. Such a computation involves one or more intermediate steps whose provisional symbolic results are uniquely represented by states in the system’s state space. Eventually, the result of a cognitive process is obtained when the system reaches one or more distinguished final states. To a large extent, classical connectionism dealt with the case in which systems settle down into attractors. Cognitive dynamics could then be identified with gradient dynamics over “energy” or “harmony” landscapes in which particular cost functions are optimized.1
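
To give this picture a concrete, classical flavour, consider a minimal sketch (ours, purely illustrative, with arbitrary parameters): a continuous Hopfield-type network whose state relaxes on an energy landscape until it settles into an attractor that encodes the result of the computation.

```python
import numpy as np

# Minimal sketch of gradient-like relaxation on a Hopfield-style "energy
# landscape". The stored pattern xi acts as an attractor: a "task" is an
# initial condition, the "result" is the final state the dynamics reach.
# Network size, gain, and step size are illustrative choices.
rng = np.random.default_rng(0)
N = 16
xi = np.sign(rng.standard_normal(N))      # one stored +1/-1 pattern
W = 1.5 * np.outer(xi, xi) / N            # Hebbian weights, gain > 1
np.fill_diagonal(W, 0.0)

def energy(y):
    # Simplified energy; the full Lyapunov function of continuous Hopfield
    # networks carries an additional integral term.
    return -0.5 * y @ W @ y

x = xi + 0.8 * rng.standard_normal(N)     # noisy initial condition (the "task")
for _ in range(300):
    x += 0.1 * (W @ np.tanh(x) - x)       # Euler step of dx/dt = -x + W tanh(x)

print("overlap with stored pattern:", round(float(np.tanh(x) @ xi / N), 3))
print("final energy:", round(float(energy(np.tanh(x))), 3))
```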

Taking the introductory quotation seriously (i.e. metaphorically), Plato’s Timaeus does not present an outdated cosmogony in which human beings, as mere heads, are kept from going “rolling upon the earth, which has all manner of heights and hollows”, by being endowed with bodies and limbs for goal-directed locomotion. Rather, human minds explore cognitive cost landscapes in mental state space, as assumed by the dynamical systems account of cognition.

Clearly, language understanding and communication are among the most intriguing aspects of cognitive neurodynamics. Bearing this in mind, the University of Reading, which was awarded a Bridging the Gaps grant for Cognitive Systems Sciences by the British Engineering and Physical Sciences Research Council (EPSRC), hosted a workshop on Dynamical Systems in Language on September 8–9, 2008, where internationally renowned experts in the field were invited to present and discuss their cutting-edge research. The present special issue of Cognitive Neurodynamics on “Language Dynamics” collects some of those results, augmented by several other invited papers.

The contribution by Gerth and beim Graben employs fractal tensor product representations of minimalist grammars to train continuous-state Hopfield networks as syntactic language processors. They demonstrate that the first principal component of the neural activation dynamics provides an interesting macroscopic description of the ongoing processing of ambiguous sentences.
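
The general idea of such a macroscopic description can be illustrated generically (the recurrent dynamics below are an arbitrary stand-in of ours, not the authors’ model): record a trajectory of high-dimensional network states and project it onto its first principal component.

```python
import numpy as np

# Generic sketch: project a high-dimensional activation trajectory onto its
# first principal component to obtain a one-dimensional macroscopic signal.
rng = np.random.default_rng(1)
T, N = 300, 50
A = 1.5 * rng.standard_normal((N, N)) / np.sqrt(N)  # random recurrent weights

states = np.zeros((T, N))
x = rng.standard_normal(N)
for t in range(T):
    x = np.tanh(A @ x)                    # one update step of the network
    states[t] = x

X = states - states.mean(axis=0)          # center the trajectory
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]                           # macroscopic signal over time
print("variance explained by PC1:", round(float(S[0]**2 / (S**2).sum()), 3))
```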

Huyck discusses a neural network model for the processing of prepositional phrase attachment ambiguity by means of cell assemblies of fatiguing leaky integrate-and-fire neurons. The network generates semantic interpretations as frames that can be passed on to an embodied robot for further processing.
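
For readers unfamiliar with fatiguing neurons, a toy unit conveys the idea (the update rule and parameters are our own simplification, not Huyck’s implementation): each spike raises a fatigue term that lifts the effective threshold, so persistently driven cells eventually pause.

```python
# Toy fatiguing leaky integrate-and-fire (fLIF) unit: activation leaks over
# time, and every spike increases a fatigue term that raises the effective
# threshold; with constant input the unit fires in interrupted bursts.
def simulate_flif(inputs, decay=0.5, theta=1.0, f_plus=0.3, f_minus=0.1):
    a, fatigue, spikes = 0.0, 0.0, []
    for i in inputs:
        a = a / (1.0 + decay) + i          # leaky integration of input
        if a > theta + fatigue:            # fatigue raises the threshold
            spikes.append(1)
            a = 0.0                        # reset after firing
            fatigue += f_plus              # firing builds up fatigue
        else:
            spikes.append(0)
            fatigue = max(0.0, fatigue - f_minus)  # fatigue recovers
    return spikes

print(simulate_flif([0.8] * 20))
```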

Semantic representations in dialogue situations are considered by Kempson, Gargett, Gregoromichelaki, Purver, and Sato. They use a purely symbolic approach, called dynamic syntax, in which trees for semantic constituency are generated incrementally, to describe the use of context information in elliptical dialogue.

Lipinski, Sandamirskaya, and Schöner present a dynamic field approach to situated language, in which an embodied robot replies to questions about its visual field. Answers are generated through the attractor dynamics of mutually coupled dynamic fields representing different kinds of semantic information.
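
A one-dimensional Amari-type field conveys the flavour of such attractor dynamics (a generic sketch with illustrative parameters, not the authors’ architecture): a localized input, together with local excitation and broader inhibition, lets a self-stabilized activation peak form.

```python
import numpy as np

# Sketch of a one-dimensional dynamic field: tau du/dt = -u + h + w*f(u) + S,
# with a "Mexican hat" style kernel (local excitation, global inhibition).
n = 101
xs = np.linspace(-10, 10, n)
dx = xs[1] - xs[0]
u = -1.0 * np.ones(n)                                  # resting level h = -1

kernel = 2.0 * np.exp(-0.5 * xs**2) - 0.5              # excitation - inhibition
stim = 2.5 * np.exp(-0.5 * (xs - 2.0)**2)              # localized input at x = 2

def f(u):                                              # sigmoid output rate
    return 1.0 / (1.0 + np.exp(-4.0 * u))

for _ in range(400):                                   # Euler integration
    conv = dx * np.convolve(f(u), kernel, mode="same") # lateral interaction
    u += 0.05 * (-u - 1.0 + conv + stim)
print("peak forms near x =", round(float(xs[np.argmax(u)]), 2))
```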

Tensor product representations of linguistic contexts are also a substantial ingredient for Mizraji, Pomi, and Valle-Lisboa, who construct a dynamic search engine as a neuromimetic device. Their multimodular networks are circuits of associators that can be deployed for decision making or word-sense disambiguation.
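
In their simplest form, tensor product representations bind filler vectors (words) to role vectors (structural positions) by outer products; a small sketch with toy vectors of our own:

```python
import numpy as np

# Tensor product binding: state = sum_i filler_i (outer) role_i. With
# orthonormal roles, contracting the state with a role vector recovers
# the filler bound to it.
rng = np.random.default_rng(2)
roles = np.eye(3)                          # orthonormal role vectors
fillers = {w: rng.standard_normal(8) for w in ("the", "dog", "barks")}

state = sum(np.outer(f, r) for f, r in zip(fillers.values(), roles))

recovered = state @ roles[1]               # unbind role 1
best = max(fillers, key=lambda w: float(fillers[w] @ recovered))
print("filler bound to role 1:", best)     # -> "dog"
```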

Tabor continues a long-standing debate about whether or not symbolic language processing is compatible with its neurodynamic realization. To this end, he discusses the symbolic dynamics of dynamical automata in the light of Turing versus super-Turing computability.
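
One standard way of seeing how symbolic computation can ride on continuous state spaces (a textbook construction in the spirit of dynamical automata, not Tabor’s specific model) is to Gödel-encode a symbol stack as a number in the unit interval: pushing becomes a contracting affine map, popping an expanding shift map.

```python
# A binary stack Gödel-encoded as x = sum_k s_k * 2**-(k+1): the top symbol
# is the most significant bit, push contracts the interval, pop expands it.
def push(x, s):                  # s in {0, 1}
    return 0.5 * x + 0.5 * s

def pop(x):                      # shift map: returns (top symbol, rest)
    s = 1 if x >= 0.5 else 0
    return s, 2.0 * x - s

x = 0.0
for s in (1, 0, 1):              # push 1, then 0, then 1 (1 ends up on top)
    x = push(x, s)
print("encoded stack:", x)       # binary 0.101 = 0.625
for _ in range(3):
    s, x = pop(x)
    print("popped:", s)          # 1, 0, 1
```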

Tu, Cooper, and Siegelmann suggest a model of linguistic inference dynamics in terms of extended semantic networks as working memory models and Bayesian networks as predictors. The model is able to infer implicit meanings from natural language sentences and to obtain refined information from past experience.

A localist connectionist language processor is the SINUS model proposed by Vosse and Kempen. It exploits a performance grammar in unification space through binding dynamics between lexicalized tree fragments. The underlying syntactic representation thereby resembles the so-called tree-adjoining grammars, which, like minimalist grammars, are suitable approaches for describing natural languages. SINUS allows predictive parsing and exhibits graceful degradation in cases of incomplete or ambiguous input.

The last contribution, by Wennekers and Palm, completes this special issue on language dynamics. They use an extension of Hebbian cell assemblies, called operational cell assemblies, in neural networks to study the generation of pattern sequences obeying syntactic rules. Here, unspecific external input switches between state space attractors, yielding the learned sequences. The model also describes disambiguation driven by noise or by additional contextual information.
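
The mechanism of input-gated transitions between attractors can be sketched generically (a toy of our own, not the authors’ operational cell assemblies): an unspecific “go” signal weakens the auto-associative hold on the current pattern and gates a hetero-association pointing to the next one.

```python
import numpy as np

# Two Hopfield patterns stored auto-associatively; a directed hetero-
# association maps pattern 0 to pattern 1. An unspecific "go" input
# transiently weakens the auto-association and gates the hetero term,
# switching the network from attractor 0 to attractor 1.
rng = np.random.default_rng(3)
N = 64
p = np.sign(rng.standard_normal((2, N)))            # two stored +1/-1 patterns
W_auto = (np.outer(p[0], p[0]) + np.outer(p[1], p[1])) / N
W_het = np.outer(p[1], p[0]) / N                    # directed link: 0 -> 1

x = p[0].astype(float)                              # start in attractor 0
for t in range(60):
    go = 1.0 if 20 <= t < 30 else 0.0               # unspecific switch signal
    drive = (1 - 0.8 * go) * (W_auto @ np.tanh(x)) + 2.0 * go * (W_het @ np.tanh(x))
    x += 0.3 * (1.5 * drive - x)                    # relaxation with gain 1.5
    if t % 10 == 9:
        print(t + 1, "overlaps with p0, p1:", (np.tanh(x) @ p.T / N).round(2))
```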

Surveying this exciting collection of original research articles, we express the hope that this special issue of Cognitive Neurodynamics may serve as a guide for further progress in dynamical systems models of language.

Footnotes

1. Note that recent accounts of cognitive computation, such as dynamic cognitive modeling, do not necessarily refer to state space attractors any more. Here, for example, the concept of heteroclinic sequences has become very useful. For a review, see beim Graben and Potthast (2009).

References

Beer RD (2000) Dynamical approaches to cognitive science. Trends Cogn Sci 4(3):91–99

beim Graben P, Potthast R (2009) Inverse problems in dynamic cognitive modeling. Chaos 19:015103

Plato (1925) Timaeus. In: Plato in twelve volumes, vol 9 (trans: Lamb WRM). Harvard University Press, Cambridge, MA

van Gelder T (1998) The dynamical hypothesis in cognitive science. Behav Brain Sci 21(5):615–628