The names of genes are central to describing their function and relationships. However, gene symbols are often a subject of controversy. Moreover, the discovery of mammalian genes is now so rapid that proper use of gene symbol nomenclature rules tends to be overlooked. This is currently the situation in the rat, and there is a need for a cohesive, unifying overview of all rat gene symbols in use. Based on the experience in rat gene symbol curation that we have gained from running the "RatMap" rat genome database, we have developed a database that unifies the different rat gene naming attempts under the accepted rat gene symbol nomenclature rules.
This paper presents a newly developed database known as RGST (Rat Gene Symbol Tracker). The database contains rat gene symbols from three major sources: the Rat Genome Database (RGD), Ensembl, and NCBI-Gene. All rat symbols are compared with official symbols from orthologous human genes as specified by the Human Gene Nomenclature Committee (HGNC). Based on the outcome of the comparisons, a rat gene symbol may be selected. Rat symbols that do not match a human ortholog undergo a strict procedure of comparisons between the different rat gene sources as well as with the Mouse Genome Database (MGD). For each rat gene this procedure results in an unambiguous gene designation. The designation is presented as a status level that accompanies every rat gene symbol suggested in the database. The status level describes both how a rat symbol was selected, and its validity.
This database meets the important need to unify rat gene symbols within an automatic, cohesive nomenclature system. The RGST database is available directly from the RatMap home page.
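The selection procedure described above can be sketched as a simple decision cascade. The function below is an illustrative reconstruction, not RGST's actual implementation; in particular, the status labels are invented stand-ins for the database's real status levels.

```python
def select_rat_symbol(rgd, ensembl, ncbi, hgnc_ortholog=None, mgd=None):
    """Pick one rat gene symbol and a status level describing how it was chosen.

    Arguments are the candidate symbols from each source (None if absent).
    The status labels here are illustrative, not RGST's actual levels.
    """
    candidates = [s for s in (rgd, ensembl, ncbi) if s]
    # Best case: a rat source already matches the official human ortholog symbol.
    if hgnc_ortholog and hgnc_ortholog in candidates:
        return hgnc_ortholog, "matches-human-ortholog"
    # Next: all rat sources that supply a symbol agree with each other.
    if candidates and len(set(candidates)) == 1:
        return candidates[0], "rat-sources-agree"
    # Next: fall back to agreement with the mouse database (MGD).
    if mgd and mgd in candidates:
        return mgd, "matches-mouse"
    # Otherwise flag the gene for manual curation.
    return (candidates[0] if candidates else None), "unresolved"
```

Each returned tuple pairs an unambiguous symbol suggestion with a status describing both how it was selected and how much trust to place in it, mirroring the status levels described in the abstract.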
Biological signals may carry specific characteristics that reflect the basic dynamics of the body. In particular, heartbeat signals carry signatures related to human physiologic mechanisms. In recent years, many researchers have shown that representations using non-linear symbolic sequences can often reveal hidden dynamic information. This kind of symbolization has proved useful for predicting life-threatening cardiac diseases.
This paper presents an improved method called the “Adaptive Interbeat Interval Analysis (AIIA) method”. The AIIA method uses the Simple K-Means algorithm for symbolization, which offers a new way to represent subtle variations between two interbeat intervals without human intervention. After symbolization, the method uses the n-gram algorithm to generate different kinds of symbolic sequences; each symbolic sequence stands for a variation phase. Finally, the symbolic sequences are categorized by classic classifiers.
In the experiments presented in this paper, the AIIA method achieved 91% accuracy (3-gram, 26 clusters) in classifying patients with Atrial Fibrillation (AF), patients with Congestive Heart Failure (CHF), and healthy people. It also achieved 87% accuracy (3-gram, 26 clusters) in classifying patients with apnea.
The two experiments presented in this paper demonstrate that the AIIA method can distinguish between different heart diseases. Both experiments achieved their best classification results with the Bayesian network classifier. In future work, the concept of the AIIA method can be extended to the categorization of other physiological signals, and more features can be added to improve accuracy.
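The symbolization pipeline described above can be sketched in miniature: cluster the interbeat intervals, replace each interval with its cluster index, and slide an n-gram window over the resulting symbol string. The tiny 1-D k-means below is an illustrative stand-in for the Simple K-Means implementation the authors used, and the classifier stage is omitted.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Tiny 1-D k-means, standing in for the paper's Simple K-Means step."""
    rng = random.Random(seed)
    centers = rng.sample(sorted(set(values)), k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # Recompute each centre; keep the old one if its cluster emptied.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

def symbolize(intervals, centers):
    """Map each interbeat interval to the index of its nearest cluster centre."""
    return [min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            for v in intervals]

def ngrams(symbols, n):
    """Overlapping n-gram symbolic sequences, each one a 'variation phase'."""
    return [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]

# Toy interbeat intervals in seconds (illustrative data, not from the paper).
intervals = [0.8, 0.82, 1.1, 0.81, 1.12, 0.79]
centers = kmeans_1d(intervals, k=2)
symbols = symbolize(intervals, centers)
phases = ngrams(symbols, n=3)
```

The n-gram frequency counts over `phases` would then serve as features for the classic classifiers (e.g., a Bayesian network) mentioned above.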
Neural network models of attention can provide a unifying approach to the study of human cognitive and emotional development (Posner & Rothbart, 2007). In this paper we argue that a neural networks approach to the infant development of joint attention can inform our understanding of the nature of human social learning, symbolic thought processes and social cognition. At its most basic, joint attention involves the capacity to coordinate one’s own visual attention with that of another person. We propose that joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one’s own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of the development of a distributed and integrated brain network involving frontal and parietal cortical systems. This executive distributed network first serves to regulate the capacity of infants to respond to and direct the overt behavior of other people in order to share experience with others through the social coordination of visual attention. In this paper we describe this parallel and distributed neural network model of joint attention development and discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. We also propose that with development joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human symbolic thinking and social cognition.
In this paper aesthetic experience is defined as an experience qualitatively different from everyday experience and similar to other exceptional states of mind. Three crucial characteristics of aesthetic experience are discussed: fascination with an aesthetic object (high arousal and attention), appraisal of the symbolic reality of an object (high cognitive engagement), and a strong feeling of unity with the object of aesthetic fascination and aesthetic appraisal. In the proposed model, two parallel levels of aesthetic information processing are distinguished. On the first level, two sub-levels of narrative are processed: story (theme) and symbolism (deeper meanings). The second level includes two sub-levels: perceptual associations (implicit meanings of an object's physical features) and detection of compositional regularities. Two sub-levels are defined as crucial for aesthetic experience: appraisal of symbolism and detection of compositional regularities. These sub-levels require specific cognitive and personality dispositions, such as expertise, creative thinking, and openness to experience. Finally, feedback from emotional processing is included in our model: appraisals of everyday emotions are specified as a matter of narrative content (e.g., empathy with characters), whereas aesthetic emotion is defined as an affective evaluation arising in the appraisal of symbolism or the detection of compositional regularities.
aesthetic experience; fascination; appraisal; emotion; narrative; composition
Using a naturalistic video database, we examined whether gestures scaffold the symbolic development of a language-enculturated chimpanzee, a language-enculturated bonobo, and a human child during the second year of life. These three species constitute a complete clade: species possessing a common immediate ancestor. A basic finding was the functional and formal similarity of many gestures between chimpanzee, bonobo, and human child. The child’s symbols were spoken words; the apes’ symbols were lexigrams – non-iconic visual signifiers. A developmental pattern in which gestural representation of a referent preceded symbolic representation of the same referent appeared in all three species (but was statistically significant only for the child). Nonetheless, across species, the ratio of symbol to gesture increased significantly with age. But even though their symbol production increased, the apes continued to communicate more frequently by gesture than by symbol. In contrast, by 15–18 months of age, the child used symbols more frequently than gestures. This ontogenetic sequence from gesture to symbol, present across the clade but more pronounced in child than ape, provides support for the role of gesture in language evolution. In all three species, the overwhelming majority of gestures were communicative (i.e., paired with eye contact, vocalization, and/or persistence). However, vocalization was rare for the apes, but accompanied the majority of the child’s communicative gestures. This species difference suggests the co-evolution of speech and gesture after the evolutionary divergence of the hominid line. Multimodal expressions of communicative intent (e.g., vocalization plus persistence) were normative for the child, but less common for the apes. This species difference suggests that multimodal expression of communicative intent was also strengthened after hominids diverged from apes.
gestural theory of language evolution; language-enculturated apes; symbolic development; cross-species comparisons; gesture; communication development; language development
This article analyzes some analogies running from Artificial Life questions about the symbol–matter connection to Artificial Intelligence questions about symbol grounding. It focuses on the notion of the interpretability of syntax and on how symbols are integrated into a unity (the "binding problem"). Using the DNA code as a model, the paper discusses how syntactic features could be defined as high-grade characteristics of non-syntactic relations in a material-dynamic structure, adopting an emergentist approach. This provides grounds for a refutation of J. Searle’s claim, made in his book "Mind: A Brief Introduction", that syntax is observer-relative. The discussion also modifies the classic symbol-processing doctrine of the mind, which Searle attacks, as well as the strong AL argument that life could be implemented computationally. Lastly, the paper offers new support for the autonomous-systems thesis in Artificial Life and Artificial Intelligence, drawing, inter alia, on the "adaptive resonance theory" (ART).
Analogy-making; Connectionism; Theories of syntax; Genetic code; Artificial life; Cognitive robotics & AI; Binding problem
Uncontrolled high blood pressure leads clinicians to question the degree of adherence among hypertensive patients. In this context, our study aims to describe and analyze patients' experience of antihypertensive drugs in order to shed light on the multiple social and symbolic logics that form part of the cultural factors shaping personal medication practices.
We implemented an inductive, comprehensive medical-anthropological approach based on an ethnographic survey (observations of consultations and interviews). Semi-structured interviews were conducted with 68 hypertensive patients (39 women and 29 men, aged between 40 and 95, of whom 52 were over 60) who had been receiving treatment for over a year.
Antihypertensive drugs are reinterpreted when filtered through the cultural model of physiopathology (the body as an engine). This symbolic dimension facilitates acceptance of therapy but leads to a hierarchization of other prescribed drugs and of certain therapeutic classes (diuretics). Prescription compliance does not solely depend on the patient's perception of cardiovascular risk, but also on how the patient fully accepts the treatment and integrates it into his or her daily life; this requires identification with the product, building commitment and self-regulation of the treatment (experience, managing treatment and control of side effects, intake and treatment continuity). Following the prescription requires a relationship based on trust between the doctor and patient, which we have identified in three forms: reasoned trust, emotional trust and conceded trust.
Consideration and understanding of these pragmatic and symbolic issues by the treating physician should aid practitioners in carrying out their role as medical educators in the management of hypertension.
This paper was originally published in French, in the journal Pratiques et organisation des soins 39(1): 3-12.
Brain-machine interfaces are a growing field of research and application. The increasing possibilities to connect the human brain to electronic devices and computer software can be put to use in medicine, the military, and entertainment. Concrete technologies include cochlear implants, Deep Brain Stimulation, neurofeedback and neuroprostheses. The expectations for the near and further future are high, though it is difficult to separate hope from hype. The focus in this paper is on the effects that these new technologies may have on our ‘symbolic order’—on the ways in which popular categories and concepts may change or be reinterpreted. First, the blurring distinction between man and machine and the idea of the cyborg are discussed. It is argued that the morally relevant difference is that between persons and non-persons, which does not necessarily coincide with the distinction between man and machine. The concept of the person remains useful. It may, however, become more difficult to assess the limits of the human body. Next, the distinction between body and mind is discussed. The mind is increasingly seen as a function of the brain, and thus understood in bodily and mechanical terms. This raises questions concerning concepts of free will and moral responsibility that may have far-reaching consequences in the field of law, where some have argued for a revision of our criminal justice system, from retributivist to consequentialist. Even without such an (unlikely and unwarranted) revision occurring, brain-machine interactions raise many interesting questions regarding distribution and attribution of responsibility.
Brain-machine interaction; Brain-computer interfaces; Converging technologies; Cyborg; Deep brain stimulation; Moral responsibility; Neuroethics
Human multimodal communication can be said to serve two main purposes: information transfer and social influence. In this paper, I argue that different components of multimodal signals play different roles in the processes of information transfer and social influence. Although the symbolic components of communication (e.g., verbal and denotative signals) are well suited to transfer conceptual information, emotional components (e.g., non-verbal signals that are difficult to manipulate voluntarily) likely serve a function closer to social influence. I suggest that emotion should be considered a property of communicative signals, rather than an entity that is transferred as content by non-verbal signals. In this view, the effect of emotional processes on communication serves to change the quality of social signals to make them more efficient at producing responses in perceivers, whereas symbolic components increase the signals’ efficiency at interacting with the cognitive processes dedicated to the assessment of relevance. The interaction between symbolic and emotional components will be discussed in relation to the need for perceivers to evaluate the reliability of multimodal signals.
emotional communication; multimodal communication; social signals; ethology; non-verbal communication; pragmatics
Recently, there has been a growing emphasis on basic number processing competencies (such as the ability to judge which of two numbers is larger) and their role in predicting individual differences in school-relevant math achievement. Children’s ability to compare both symbolic (e.g. Arabic numerals) and nonsymbolic (e.g. dot arrays) magnitudes has been found to correlate with their math achievement. The available evidence, however, has focused on computerized paradigms, which may not always be suitable for universal, quick application in the classroom. Furthermore, it is currently unclear whether both symbolic and nonsymbolic magnitude comparison are related to children’s performance on tests of arithmetic competence and whether either of these factors relates to arithmetic achievement over and above other factors such as working memory and reading ability. In order to address these outstanding issues, we designed a quick (2-minute) paper-and-pencil tool to assess children’s ability to compare symbolic and nonsymbolic numerical magnitudes and assessed the degree to which performance on this measure explains individual differences in achievement. Children were required to cross out the larger of two single-digit numerical magnitudes under time constraints. Results from a group of 160 children from grades 1–3 revealed that both symbolic and nonsymbolic number comparison accuracy were related to individual differences in arithmetic achievement. However, only symbolic number comparison performance accounted for unique variance in arithmetic achievement. The theoretical and practical implications of these findings are discussed, including the use of this measure as a possible tool for identifying students at risk for future difficulties in mathematics.
The annual James Arthur lecture series on the Evolution of the Human Brain was inaugurated at the American Museum of Natural History in 1932, through a bequest from a successful manufacturer with a particular interest in mechanisms. Karl Pribram's thirty-ninth lecture of the series, delivered in 1970, was a seminal event that heralded much of the research agenda, since pursued by representatives of diverse disciplines, that touches on the evolution of human uniqueness.
In his James Arthur lecture Pribram raised questions about the coding of information in the brain and about the complex association between language, symbol, and the unique human cognitive system. These questions are as pertinent today as in 1970. The emergence of modern human symbolic cognition is often viewed as a gradual, incremental process, governed by inexorable natural selection and propelled by the apparent advantages of increasing intelligence. However, there are numerous theoretical considerations that render such a scenario implausible, and an examination of the pattern of acquisition of behavioral and anatomical novelties in human evolution indicates that, throughout, major change was both sporadic and rare. What is more, modern bony anatomy and brain size were apparently both achieved well before we have any evidence for symbolic behavior patterns. This suggests that the biological substrate underlying the symbolic thought that is so distinctive of Homo sapiens today was exaptively achieved, long before its potential was actually put to use. If so, we need to look for the agent, perforce a cultural one, that stimulated the adoption of symbolic thought patterns. That stimulus may well have been the spontaneous invention of articulate language.
Fossil and genetic evidence suggests the emergence of anatomically modern humans (Homo sapiens) in sub-Saharan Africa some time between 200 and 100 thousand years (ka) ago. But the first traces of symbolic behavior—a trait unique to our species—are not found until many tens of millennia later, and include items such as engraved ochres and eggshells, tools made from bone, and personal ornaments made of shell beads. These behavioral indicators appear in concert with two innovative phases of Middle Stone Age technology, known as the Still Bay (SB) and Howieson's Poort (HP) industries, across a range of climatic and ecological zones in southern Africa. The SB and HP have recently been dated to about 72-71 ka and 65-60 ka, respectively, at sufficiently high resolution to investigate the possible causes and effects. A remarkable feature of these two industries is the spatial synchroneity of their start and end dates at archaeological sites spread across a region of two million square kilometers. What were the catalysts for the SB and HP, and what were the consequences? Both industries flourished at a time when tropical Africa had just entered a period of wetter and more stable conditions, and populations of hunter-gatherers were expanding rapidly throughout sub-Saharan Africa before contracting into geographically and genetically isolated communities. The SB and HP also immediately preceded the likely exit time of modern humans from Africa into southern Asia and across to Australia, which marked the beginning of the worldwide dispersal of our species. In this paper, we argue that environmental factors alone are insufficient to explain these two bursts of technological and behavioral innovation. 
Instead, we propose that the formation of social networks across southern Africa during periods of population expansion, and the disintegration of these networks during periods of population contraction, can explain the abrupt appearance and disappearance of the SB and HP, as well as the hiatus between them. But it will take improved chronologies for the key demographic events to determine if the emergence of innovative technology and symbolic behavior provided the stimulus for the expansion of hunter-gatherer populations (and their subsequent global dispersal), or if these Middle Stone Age innovations came into existence only after populations had expanded and geographically extensive social networks had developed.
Middle Stone Age; southern Africa; Still Bay; Howieson's Poort; technological innovation; symbolic behavior; human dispersal; demographic history; social networks
Adult humans, infants, pre-school children, and non-human animals appear to share a system of approximate numerical processing for non-symbolic stimuli such as arrays of dots or sequences of tones. Behavioral studies of adult humans implicate a link between these non-symbolic numerical abilities and symbolic numerical processing (e.g., similar distance effects in accuracy and reaction-time for arrays of dots and Arabic numerals). However, neuroimaging studies have remained inconclusive on the neural basis of this link. The intraparietal sulcus (IPS) is known to respond selectively to symbolic numerical stimuli such as Arabic numerals. Recent studies, however, have arrived at conflicting conclusions regarding the role of the IPS in processing non-symbolic numerosity arrays in adulthood, and very little is known about the brain basis of numerical processing early in development. Addressing the question of whether there is an early-developing neural basis for abstract numerical processing is essential for understanding the cognitive origins of our uniquely human capacity for math and science. Using functional magnetic resonance imaging (fMRI) at 4-Tesla and an event-related fMRI adaptation paradigm, we found that adults showed a greater IPS response to visual arrays that deviated from standard stimuli in their number of elements, than to stimuli that deviated in local element shape. These results support previous claims that there is a neurophysiological link between non-symbolic and symbolic numerical processing in adulthood. In parallel, we tested 4-y-old children with the same fMRI adaptation paradigm as adults to determine whether the neural locus of non-symbolic numerical activity in adults shows continuity in function over development. We found that the IPS responded to numerical deviants similarly in 4-y-old children and adults.
To our knowledge, this is the first evidence that the neural locus of adult numerical cognition takes form early in development, prior to sophisticated symbolic numerical experience. More broadly, this is also, to our knowledge, the first cognitive fMRI study to test healthy children as young as 4 y, providing new insights into the neurophysiology of human cognitive development.
This functional imaging study provides evidence for a neurobiological link between early non-symbolic numerical abilities of 4 year-old children and the more symbolic numerical processing of adults.
The movements we make with our hands both reflect our mental processes and help to shape them. Our actions and gestures can affect our mental representations of actions and objects. In this paper, we explore the relationship between action, gesture and thought in both humans and non-human primates and discuss its role in the evolution of language. Human gesture (specifically representational gesture) may provide a unique link between action and mental representation. It is kinaesthetically close to action and is, at the same time, symbolic. Non-human primates use gesture frequently to communicate, and do so flexibly. However, their gestures mainly resemble incomplete actions and lack the representational elements that characterize much of human gesture. Differences in the mirror neuron system provide a potential explanation for non-human primates' lack of representational gestures; the monkey mirror system does not respond to representational gestures, while the human system does. In humans, gesture grounds mental representation in action, but there is no evidence for this link in other primates. We argue that gesture played an important role in the transition to symbolic thought and language in human evolution, following a cognitive leap that allowed gesture to incorporate representational elements.
gesture; mental representation; evolution of language; embodied cognition; primates; mirror neurons
Massive text mining of the biological literature holds great promise of relating disparate information and discovering new knowledge. However, disambiguation of gene symbols is a major bottleneck.
We developed a simple thesaurus-based disambiguation algorithm that can operate with very little training data. The thesaurus comprises the information from five human genetic databases and MeSH. The extent of the homonym problem for human gene symbols is shown to be substantial (33% of the genes in our combined thesaurus had one or more ambiguous symbols), not only because one symbol can refer to multiple genes, but also because a gene symbol can have many non-gene meanings. A test set of 52,529 Medline abstracts, containing 690 ambiguous human gene symbols taken from OMIM, was automatically generated. Overall accuracy of the disambiguation algorithm was up to 92.7% on the test set.
The ambiguity of human gene symbols is substantial, not only because one symbol may denote multiple genes but particularly because many symbols have other, non-gene meanings. The proposed disambiguation approach resolves most ambiguities in our test set with high accuracy, including the important gene/not a gene decisions. The algorithm is fast and scalable, enabling gene-symbol disambiguation in massive text mining applications.
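One common thesaurus-based scheme for this kind of disambiguation scores each candidate sense of a symbol by the overlap between the words of the abstract and the context words the thesaurus associates with that sense. The sketch below illustrates that general idea only; it is not a reconstruction of the authors' exact scoring procedure, and the example thesaurus entries are invented.

```python
def disambiguate(symbol, abstract_words, thesaurus):
    """Resolve an ambiguous gene symbol by context-word overlap.

    `thesaurus` maps a symbol to its candidate senses (gene IDs or
    non-gene meanings), each with a set of associated context words.
    Illustrative stand-in for the paper's actual procedure.
    """
    words = {w.lower() for w in abstract_words}
    scores = {sense: len(words & {w.lower() for w in ctx})
              for sense, ctx in thesaurus[symbol].items()}
    # Return the sense whose context words best match the abstract,
    # which also covers the gene/not-a-gene decision.
    return max(scores, key=scores.get)

# Invented example: "PSA" as the gene KLK3 vs. a chemistry term.
thesaurus = {"PSA": {
    "KLK3": {"prostate", "antigen", "serum", "cancer"},
    "not-a-gene": {"pressure", "swing", "adsorption"},
}}
```

A production system would add training data, weighting of context words, and a rejection threshold, but the core decision reduces to this kind of sense-scoring per occurrence.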
Communication about feelings is a core element of human interaction. Aided augmentative and alternative communication systems must therefore include symbols representing these concepts. The symbols must be readily distinguishable in order for users to communicate effectively. However, emotions are represented within most systems by schematic faces in which subtle distinctions are difficult to represent. We examined whether background color cuing and spatial arrangement might help children identify symbols for different emotions.
Thirty nondisabled children searched for symbols representing emotions within an 8-choice array. On some trials, a color cue signaled the valence of the emotion (positive vs. negative). Additionally, symbols were either organized with the negatively-valenced symbols at the top and the positive symbols on the bottom of the display, or the symbols were distributed randomly throughout. Dependent variables were accuracy and speed of responses.
The speed with which children could locate a target was significantly faster for displays in which symbols were clustered by valence, but only when the symbols had white backgrounds. Addition of a background color cue did not facilitate responses.
Rapid search was facilitated by a spatial organization cue, but not by the addition of background color. Further examination of the situations in which color cues may be useful is warranted.
Aided AAC; Color Cuing; Display Construction
The effect of regular exercise on cognitive functioning and personality was investigated in 32 subjects representing 4 discrete groups based on sex and age. Before and after a 10-week exercise programme of jogging, calisthenics, and recreational activities, a test battery was administered to assess functioning in a number of domains: intelligence (WAIS Digit Symbol and Block Design); brain function (Trail-Making); speed of performance (Crossing-Off); memory and learning (WMS Visual Reproduction and Associate Learning); morale and life satisfaction (Life Satisfaction and Control Ratings); anxiety (MAACL); and depression (MAACL). Improvement was observed on several physiological parameters. ANOVA revealed significant sex and age differences on Digit Symbol and Block Design and age differences on Trail-Making, Crossing-Off, Associate Learning, and anxiety. Regardless of sex and age, significant improvement in performance was observed from pre- to post-test on Digit Symbol, Block Design, Trail-Making, Crossing-Off, and Associate Learning. In addition, an increase in health status rating (p < .01) and a decrease in anxiety were observed from pre- to post-test. These data illustrate beneficial effects of exercise on certain measures of cognitive functioning and personality.
Automatic speech recognition (ASR) systems rely almost exclusively on short-term segment-level features (MFCCs), while ignoring higher-level suprasegmental cues that are characteristic of human speech. Recent experiments have shown that categorical representations of prosody, such as those based on the Tones and Break Indices (ToBI) annotation standard, can be used to enhance speech recognizers. However, categorical prosody models are severely limited in scope and coverage due to the lack of large corpora annotated with the relevant prosodic symbols (such as pitch accent, word prominence, and boundary tone labels). In this paper, we first present an architecture for augmenting a standard ASR with symbolic prosody. We then discuss two novel, unsupervised adaptation techniques for improving, respectively, the quality of the linguistic and acoustic components of our categorical prosody models. Finally, we implement the augmented ASR by enriching ASR lattices with the adapted categorical prosody models. Our experiments show that the proposed unsupervised adaptation techniques significantly improve the quality of the prosody models; the adapted prosodic language and acoustic models reduce binary pitch accent (presence versus absence) classification error rate by 13.8% and 4.3%, respectively (relative to the seed models) on the Boston University Radio News Corpus, while the prosody-enriched ASR exhibits a 3.1% relative reduction in word error rate (WER) over the baseline system.
Categorical prosody models; lattice enrichment; speech recognition; unsupervised adaptation
Developmental dyscalculia is a heterogeneous disorder with largely dissociable performance profiles. Though our current understanding of the neurofunctional foundations of (adult) numerical cognition has increased considerably during the past two decades, there are still many unanswered questions regarding the developmental pathways of numerical cognition. Most studies on developmental dyscalculia are based upon adult calculation models which may not provide an adequate theoretical framework for understanding and investigating developing calculation systems. Furthermore, the applicability of neuroscience research to pedagogy has, so far, been limited.
After providing an overview of current conceptualisations of numerical cognition and developmental dyscalculia, the present paper (1) reviews recent research findings that are suggestive of a neurofunctional link between fingers (finger gnosis, finger-based counting and calculation) and number processing, and (2) takes the latter findings as an example to discuss how neuroscience findings may impact on educational understanding and classroom interventions.
Sources of evidence
Finger-based number representations and finger-based calculation have deep roots in human ontogeny and phylogeny. Recently, accumulating empirical evidence supporting the hypothesis of a neurofunctional link between fingers and numbers has emerged from both behavioural and brain imaging studies.
Preliminary but converging research supports the notion that finger gnosis and finger use seem to be related to calculation proficiency in elementary school children. Finger-based counting and calculation may facilitate the establishment of mental number representations (possibly by fostering the mapping from concrete non-symbolic to abstract symbolic number magnitudes), which in turn seem to be the foundations for successful arithmetic achievement.
Based on the findings illustrated here, it is plausible to assume that finger use might be an important and complementary aid (to more traditional pedagogical methods) to establish mental number representations and/or to facilitate learning to count and calculate. Clearly, future prospective studies are needed to investigate whether the explicit use of fingers in early mathematics teaching might prove to be beneficial for typically developing children and/or might support the mapping from concrete to abstract number representations in children with and without developmental dyscalculia.
dyscalculia; functional brain imaging; neuroscience; finger-based calculation; mental number representations
A key issue in cooperation research is to determine the conditions under which individuals invest in a public good. Here, we tested whether cues of being watched increase investments in an anonymous public good situation in real life. We examined whether individuals would invest more by removing experimentally placed garbage (paper and plastic bottles) from bus stop benches in Geneva in the presence of images of eyes compared to controls (images of flowers). We provided separate bins for each of the two types of garbage to investigate whether individuals would deposit more items into the appropriate bin in the presence of eyes. The treatment had no effect on the likelihood that individuals present at the bus stop would remove garbage. However, those individuals who engaged in garbage clearing, and were thus likely affected by the treatment, invested more time to do so in the presence of eyes. Images of eyes had a direct effect on behaviour, rather than merely enhancing attention towards a symbolic sign requesting removal of garbage. These findings show that simple images of eyes can trigger reputational effects that significantly enhance non-monetary investments in anonymous public goods under real-life conditions. We discuss our results in the light of previous findings and suggest that human social behaviour may often be shaped by relatively simple and potentially unconscious mechanisms rather than by very complex cognitive capacities.
Sound symbolism, or the nonarbitrary link between linguistic sound and meaning, has often been discussed in connection with language evolution, where the oral imitation of external events links phonetic forms with their referents (e.g., Ramachandran & Hubbard, 2001). In this research, we explore whether sound symbolism may also facilitate synchronic language learning in human infants. Sound symbolism may be a useful cue particularly at the earliest developmental stages of word learning, because it potentially provides a way of bootstrapping word meaning from perceptual information. Using an associative word learning paradigm, we demonstrated that 14-month-old infants could detect Köhler-type (1947) shape-sound symbolism, and could use this sensitivity in their effort to establish a word-referent association.
A number of studies from the 1960s to 1990s assessed the symbolic competence of great apes and other animals. These studies provided varying forms of evidence that some species were capable of symbolically representing their worlds, both through productive symbol use and comprehension of symbolic stimuli. One such project at the Language Research Center involved training chimpanzees (Pan troglodytes) to use lexigram symbols (geometric visual stimuli that represented objects, actions, locations, and individuals). Those studies are now more than 40 years old, and only a few of the apes involved in them are still alive. Three of these chimpanzees (and a fourth, control chimpanzee) were assessed across a 10-year period from 1999 to 2008 for their continued knowledge of lexigram symbols and, in the case of one chimpanzee, the continued ability to comprehend human speech. This article describes that longitudinal assessment and outlines the degree to which symbol competence was retained by these chimpanzees across that decade-long period. All chimpanzees showed retention of lexigram vocabularies, although there were differences in the number of words that were retained across individuals. One chimpanzee also showed continued retention of human speech perception. These retained vocabularies largely consisted of food item names, but also included names of inedible objects, locations, individuals, and some actions. Many of these retained words were for things that are not common in the daily lives of the chimpanzees and for things that are rarely requested by them. Thus, the early experiences of these chimpanzees in symbol-rich environments have produced long-lasting memories for symbol meaning, and those competencies have benefited research on a variety of topics in comparative cognition.
A dominant view in numerical cognition is that numerical comparisons operate on a notation independent representation (Dehaene, 1992). Although previous human neurophysiological studies using scalp-recorded event-related potentials (ERPs) on the numerical distance effect have been interpreted as supporting this idea, differences in the electrophysiological correlates of the numerical distance effect in symbolic notations (e.g. Arabic numerals) and non-symbolic notations (e.g. a set of visually presented dots of a certain number) are not entirely consistent with this view.
Methods and results
Two experiments were conducted to resolve these discrepancies. In Experiment 1, participants performed a symbolic and a non-symbolic numerical comparison task ("smaller or larger than 5?") with numerical values 1–4 and 6–9 while ERPs were recorded. Consistent with a previous report (Temple & Posner, 1998), in the symbolic condition the amplitude of the P2p ERP component (210–250 ms post-stimulus) was larger for values near the standard than for values far from the standard, whereas this pattern was reversed in the non-symbolic condition. However, closer analysis indicated that the reversal in polarity was likely due to the presence of a confounding stimulus effect on the early sensory ERP components for small versus large numerical values in the non-symbolic condition. In Experiment 2, exclusively large numerosities (8–30) were used, thereby rendering sensory differences negligible, and with this control in place the numerical distance effect in the non-symbolic condition mirrored the symbolic condition of Experiment 1.
Collectively, the results support the claim of an abstract semantic processing stage for numerical comparisons that is independent of input notation.
To increase the symbol rate of electroencephalography (EEG)-based brain-computer interface (BCI) typing systems by utilizing context information.
Event-related potentials (ERPs) corresponding to a stimulus in EEG can be used to detect the intended target of a person for BCI. This paradigm is widely utilized to build letter-by-letter BCI typing systems. Nevertheless, currently available BCI typing systems still require improvement due to low typing speeds, mainly because of the reliance on multiple repetitions before making a decision in order to achieve higher typing accuracy. One approach to increasing the speed of typing without significantly reducing its accuracy is to use additional context information. In this paper, we study the effect of using a language model as additional evidence for intent detection. Bayesian fusion of an n-gram symbol model with the EEG features is proposed; specifically, a regularized discriminant analysis (RDA) ERP discriminant is used to obtain EEG-based features. The target detection accuracies are rigorously evaluated for varying language model orders, as well as for varying numbers of ERP-inducing repetitions.
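The Bayesian fusion described above amounts to multiplying the ERP classifier's per-letter likelihood by the language-model prior for the current context. The following sketch illustrates this under stated assumptions; the probabilities and the two-letter context are invented for the example and do not come from the paper.

```python
import math

# Hedged sketch of Bayesian evidence fusion for BCI typing:
#   P(letter | EEG, context) ∝ P(EEG | letter) * P(letter | context)
# All numbers below are illustrative, not measured values.

def fuse(erp_likelihood, lm_prior):
    """Combine ERP evidence with an n-gram prior in the log domain, normalize."""
    scores = {
        c: math.log(erp_likelihood[c]) + math.log(lm_prior[c])
        for c in erp_likelihood
    }
    z = math.log(sum(math.exp(s) for s in scores.values()))
    return {c: math.exp(s - z) for c, s in scores.items()}

# Toy example: the ERP classifier is ambiguous between 'e' and 'o', but after
# typing "th" a (hypothetical) n-gram model strongly favours 'e'.
erp = {"e": 0.40, "o": 0.38, "x": 0.22}  # P(EEG | letter), illustrative
lm  = {"e": 0.70, "o": 0.25, "x": 0.05}  # P(letter | "th"), illustrative

posterior = fuse(erp, lm)
print(max(posterior, key=posterior.get))  # prints "e"
```

This is how a weak single-trial ERP decision can be firmed up by context: the language-model prior resolves ambiguity that would otherwise require additional stimulus repetitions.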
The results demonstrate that the language models contribute significantly to letter classification accuracy. For instance, we find that a single-trial ERP detection supported by a 4-gram language model may achieve the same performance as using 3-trial ERP classification for the non-initial letters of words.
Overall, fusion of evidence from EEG and language models yields a significant opportunity to increase the symbol rate of a BCI typing system.
Fall-related injuries in nursing homes have a major impact on the quality of life in later adulthood, and there is a dearth of studies on falling and fall prevention from the older person's perspective. The aim of the study was to identify how older persons perceive falling, fall prevention, and fall accidents. Six in-depth interviews were carried out, and a hermeneutic phenomenological method was used to describe and interpret the older persons’ accounts. Interpretations of Levinasian and Heideggerian philosophy related to dwelling and mobility helped cultivate important insights. Symbolic and physical environments are important for the participants’ well-being. The older persons in the study did not wish to dwell on the subject of falling and spoke of past and present coping strategies and the importance of staying on their feet. The women spoke about endurance in their daily lives. The men's narrations were more dramatic; they became animated when they spoke of their active past lives. As the scope of the study is small, these gender differences require further investigation. However, their stories give specific knowledge about the individual and their symbolic environmental circumstances and universal knowledge about the importance of integrating cultural environmental knowledge in health promotion and care work. Traditional fall prevention interventions are often risk oriented and based on generalized knowledge applied to particular cases. The findings indicate a need for contextual life-world knowledge and an understanding of fall prevention as a piece in a larger puzzle within a broader framework of culture, health, and well-being. Showing an interest in the older persons’ stories can help safeguard their integrity and promote their well-being. This can ignite a spark that kindles their desire to participate in meaningful exercises and activities.
Fall prevention; Heidegger; Levinas; older persons; narratives; well-being; nursing care; health promotion