Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed vs. habitual or, more recently and based on statistical arguments, as model-free vs. model-based reinforcement learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between the literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation.
decision-making; reward; cognitive abilities; model-based and model-free learning; fluid intelligence; habitual and goal-directed system
Transcranial direct current stimulation (tDCS) is a non-invasive brain stimulation method with many putative applications that has been reported to modulate behaviour effectively. However, its effects have yet to be considered at a computational level. To address this, we modelled the tuning curves underlying the behavioural effects of stimulation in a perceptual task. Participants judged which of two serially presented images contained more items (numerosity judgement task) or was presented longer (duration judgement task). During presentation of the second image, their posterior parietal cortices (PPCs) were stimulated bilaterally with opposite polarities for 1.6 s. We examined the impact of three stimulation conditions on behaviour: anodal right-PPC with cathodal left-PPC (rA-lC), the reverse montage (lA-rC), and a no-stimulation condition. Behavioural results showed that participants were more accurate in the numerosity and duration judgement tasks when stimulated with the lA-rC and rA-lC conditions, respectively. Conversely, performance on the numerosity and duration judgement tasks decreased when the stimulation condition favoured the other task. Thus, our results revealed a double dissociation of laterality and task. Importantly, we were able to model the effects of stimulation on behaviour: our computational modelling showed that participants' superior performance was attributable to a narrower tuning curve, i.e. a smaller standard deviation of detection noise. We believe that this approach may prove useful in understanding the impact of brain stimulation on other cognitive domains.
•Behavioural effects of transcranial electrical stimulation were modelled.
•Computational modelling was based on tuning curves found in humans and primates.
•Superior performance was attributable to a narrower tuning curve and vice versa.
•Results revealed a double dissociation of laterality of stimulation and task.
•While stimulation improved performance in one task, it impaired performance in the other.
Receptive field; Neuronal tuning curve; Magnitude judgement; Numerosity; Duration; Time; Computational modelling
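The tuning-curve account summarised above can be illustrated with a standard signal-detection sketch: if each magnitude estimate is perturbed by Gaussian detection noise, a narrower tuning curve (smaller standard deviation) predicts more accurate two-interval judgements. The function and parameter values below are illustrative assumptions, not the authors' exact model.

```python
import math

def p_correct(m1: float, m2: float, sigma: float) -> float:
    """Probability of correctly judging that magnitude m2 exceeds m1 when
    each estimate is corrupted by independent Gaussian noise of width sigma.
    This is a generic signal-detection sketch, not the study's fitted model."""
    # The difference of two noisy estimates has standard deviation sigma*sqrt(2);
    # the standard normal CDF of the scaled difference gives P(correct).
    d = (m2 - m1) / (sigma * math.sqrt(2))
    return 0.5 * (1.0 + math.erf(d / math.sqrt(2)))

# A narrower tuning curve (smaller sigma) yields higher discrimination accuracy:
narrow = p_correct(10, 12, sigma=1.0)   # sharper tuning
broad = p_correct(10, 12, sigma=3.0)    # broader tuning
```

On this account, stimulation that sharpens the relevant tuning curves improves judgement accuracy, and stimulation that broadens them impairs it, matching the reported double dissociation.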
Febrile seizures are induced by fever and are the most common type of seizures in children. Although numerous studies have been performed on febrile seizures, their pathophysiology remains unclear. Recent studies have shown that cytokines may play a role in the pathogenesis of febrile seizures. The present study was conducted to identify potential links between serum interleukin-1beta (IL-1β), tumor necrosis factor-alpha (TNF-α), and febrile seizures.
Ninety-two patients with simple or complex febrile seizures (46 patients per seizure type) and 46 controls of comparable age, sex, and degree of fever were enrolled.
The median concentrations of serum IL-1β in the simple, complex febrile seizure, and control groups were 0.05, 0.1, and 0.67 pg/mL, respectively (P=0.001). Moreover, the median concentrations of TNF-α in the simple, complex febrile seizure, and control groups were 2.5, 1, and 61.5 pg/mL, respectively (P=0.001). Furthermore, there were significant differences between the case groups in serum IL-1β and TNF-α levels (P<0.05).
Unlike previous studies, our study does not support the hypothesis that increased IL-1β and TNF-α production is involved in the pathogenesis of febrile seizures.
Interleukin-1beta; Tumor necrosis factor-alpha; Febrile seizures
In this study, Fe(III)-doped TiO2 nanoparticles were synthesized by the sol–gel method at two Fe/Ti atomic ratios, 0.006% and 0.034%. Their photoactivity was then investigated for the degradation of phenol under UV (<380 nm) and visible (>380 nm) light. Results showed that at the appropriate Fe/Ti atomic ratio (0.034%), the photoactivity of the Fe(III)-doped TiO2 nanoparticles increased. In addition, the effects of various operational parameters on photocatalytic degradation, such as pH, initial phenol concentration, and photocatalyst loading, were examined and optimized. At all initial concentrations, the highest degradation efficiency occurred at pH 3 with a 0.5 g/L Fe(III)-doped TiO2 dosage, and degradation efficiency decreased with increasing initial phenol concentration. The photoactivity of Fe(III)-doped TiO2 under UV and visible light at the optimal conditions (pH 3, catalyst dosage 0.5 g/L) was compared with that of P25 TiO2 nanoparticles: Fe(III)-doped TiO2 was more photoactive than P25 TiO2 under visible light, but less photoactive under UV irradiation. The efficiency of UV irradiation alone and the extent of phenol adsorption on Fe(III)-doped TiO2 in the dark were also investigated.
Aqueous solution; Phenol; Fe (III)-doped TiO2; P25 TiO2; Sol–gel method
Despite decades of research on spatial memory, we know surprisingly little about how the brain guides navigation to goals. While some models argue that vectors are represented for navigational guidance, other models postulate that the future path is computed. Although the hippocampal formation has been implicated in processing spatial goal information, it remains unclear whether this region processes path- or vector-related information.
We report neuroimaging data collected from subjects navigating London’s Soho district; these data reveal that both the path distance and the Euclidean distance to the goal are encoded by the medial temporal lobe during navigation. While activity in the posterior hippocampus was sensitive to the distance along the path, activity in the entorhinal cortex was correlated with the Euclidean distance component of a vector to the goal. During travel periods, posterior hippocampal activity increased as the path to the goal became longer, but at decision points, activity in this region increased as the path to the goal became closer and more direct. Importantly, sensitivity to the distance was abolished in these brain areas when travel was guided by external cues.
The results indicate that the hippocampal formation contains representations of both the Euclidean distance and the path distance to goals during navigation. These findings argue that the hippocampal formation houses a flexible guidance system that changes how it represents distance to the goal depending on the fluctuating demands of navigation.
•The hippocampus represents both the path and the Euclidean distances to goals
•Entorhinal activity reflects the change in the Euclidean distance when the goal is set
•The posterior hippocampus represents the future path at different stages en route
•Significant correlations are abolished when travel is guided by external cues
Howard et al. reveal that during the navigation of a simulated real-world environment, hippocampal and entorhinal activity is correlated with the distance to the goal. The posterior hippocampus encodes the path during travel, decision making, and detours. The entorhinal cortex encodes the distance along the vector when the goal is set.
It is widely accepted that brain maturation from adolescence to adulthood contributes to substantial behavioural changes. Despite this, however, knowledge of the precise mechanisms is still sparse. We used fMRI to investigate developmental differences between healthy adolescents (age range 14–15) and adults (age range 20–39) in feedback-related decision making using a probabilistic reversal learning task. Conventionally, groups are compared on the basis of continuous values of blood oxygen level dependent (BOLD) percentage signal change. In contrast, we transformed these values into discrete states and compared groups on the pattern of these states. We focused our analysis on the anterior cingulate cortex (ACC), ventral striatum (VS) and ventromedial prefrontal cortex (vmPFC), as their functions have been shown to be critical in feedback-related decision making. Discretisation of the continuous BOLD values revealed differential patterns of activity compared with conventional statistical methods. Results showed differential representation of feedback and decision in the ACC and vmPFC between adolescents and adults, but no difference in the VS. We argue that the pattern of activity in the ACC, vmPFC and VS of adolescents entails several drawbacks for decision making, such as a redundant and imprecise representation of the decision, and consequently poorer performance in terms of the number of system changes (changes of contingencies). This method can be used effectively to infer group differences from within-group analyses rather than from direct between-group comparisons.
•ACC activity in adults represented solely the subsequent decision.
•ACC activity in adolescents reflected both feedback and decision.
•vmPFC activity in adults reflected both feedback and decision.
•vmPFC activity in adolescents represented feedback only.
•VS represented feedback and did not differ between adolescents and adults.
Developmental; Uncertainty; Anterior cingulate cortex (ACC); Ventral striatum (VS); Ventromedial prefrontal cortex (vmPFC); Reward processing; Decision-making; Probabilistic reversal learning
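The discretisation step described in the abstract above can be sketched minimally: continuous BOLD percent-signal-change values are mapped onto a small set of discrete states, and groups are then compared on the resulting state patterns. The equal-width binning and the example values below are illustrative assumptions; the study's actual discretisation scheme may differ.

```python
def discretise(values, n_states=3):
    """Map continuous BOLD percent-signal-change values onto discrete
    states by equal-width binning (illustrative sketch only)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_states or 1.0  # guard against a flat signal
    # Assign each value to a bin; clamp the maximum into the top state.
    return [min(int((v - lo) / width), n_states - 1) for v in values]

# Hypothetical percent-signal-change values for one region:
bold = [-0.4, 0.1, 0.8, 1.2, -0.1]
states = discretise(bold)  # pattern of discrete states rather than raw values
```

Comparing such state patterns (e.g. whether a region occupies distinct states after positive vs. negative feedback) is what allows group differences to be inferred from within-group analyses, as the abstract argues.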
How do humans perceive the passage of time and the duration of events without a dedicated sensory system for timing? Previous studies have demonstrated that when a stimulus changes over time, its duration is subjectively dilated, indicating that duration judgments are based on the number of changes within an interval. In this study, we tested predictions derived from three different accounts describing the relation between a changing stimulus and its subjective duration as either based on (1) the objective rate of changes of the stimulus, (2) the perceived saliency of the changes, or (3) the neural energy expended in processing the stimulus. We used visual stimuli flickering at different frequencies (4–166 Hz) to study how the number of changes affects subjective duration. To this end, we assessed the subjective duration of these stimuli and measured participants' behavioral flicker fusion threshold (the highest frequency perceived as flicker), as well as their threshold for a frequency-specific neural response to the flicker using EEG. We found that only consciously perceived flicker dilated perceived duration, such that a 2 s long stimulus flickering at 4 Hz was perceived as lasting as long as a 2.7 s steady stimulus. This effect was most pronounced at the slowest flicker frequencies, at which participants reported the most consistent flicker perception. Flicker frequencies higher than the flicker fusion threshold did not affect perceived duration at all, even if they evoked a significant frequency-specific neural response. In sum, our findings indicate that time perception in the peri-second range is driven by the subjective saliency of the stimulus' temporal features rather than the objective rate of stimulus changes or the neural response to the changes.
Adaptation is an automatic neural mechanism supporting the optimization of visual processing on the basis of previous experiences. While the short-term effects of adaptation on behaviour and physiology have been studied extensively, perceptual long-term changes associated with adaptation are still poorly understood. Here, we show that the integration of adaptation-dependent long-term shifts in neural function is facilitated by sleep. Perceptual shifts induced by adaptation to a distorted image of a famous person were larger in a group of participants who had slept (experiment 1) or merely napped for 90 min (experiment 2) during the interval between adaptation and test compared with controls who stayed awake. Participants' individual rapid eye movement sleep duration predicted the size of post-sleep behavioural adaptation effects. Our data suggest that sleep prevented decay of adaptation in a way that is qualitatively different from the effects of reduced visual interference known as ‘storage’. In the light of the well-established link between sleep and memory consolidation, our findings link the perceptual mechanisms of sensory adaptation—which are usually not considered to play a relevant role in mnemonic processes—with learning and memory, and at the same time reveal a new function of sleep in cognition.
adaptation; sleep; learning; faces; figural after-effects; plasticity
Cutaneous leishmaniasis (CL) is a major health problem in many parts of Iran. Although diagnosis of CL, especially in endemic areas, is easy, treatment and management of the disease remain a global dilemma, and diagnosis of CL in non-endemic areas is not as simple as in endemic foci. In this study, the status and proportions of CL induced by Leishmania major and L. tropica among suspected CL patients referred to the Center for Research and Training in Skin Diseases and Leprosy (CRTSDL) during 2008 to 2011 are described.
Patients with lesions suspected of CL were clinically examined. History of travel to zoonotic CL and/or anthroponotic CL endemic areas and the characteristics of their lesion(s) were recorded. Diagnosis was made using direct smear microscopy, culture, and conventional polymerase chain reaction (PCR).
A total of 404 patients (256 male, 148 female) with 776 lesions were recruited and examined parasitologically. CL was confirmed in 255 of the patients, with 613 lesions: 147 patients had lesion(s) induced by L. major (63 male, 43%; 84 female, 57%) and 108 had lesion(s) induced by L. tropica (35 male, 32%; 73 female, 68%). History of travel to an endemic area did not always correlate with the isolated Leishmania species.
Although travel history to endemic areas is an important factor to consider in diagnosis, parasitological confirmation is necessary before initiation of treatment.
Cutaneous leishmaniasis; Diagnosis; PCR; Leishmania species
Adhesion; Gloves, Surgical; Rat
Background: Early diagnosis of albuminuria and the prevention of its progression to macroalbuminuria and diabetic nephropathy are crucial. Angiotensin converting enzyme inhibitors (ACEIs) and angiotensin II type I receptor antagonists (ARBs) are currently used as first-line treatment for albuminuria in diabetic patients. The present study was conducted to assess the efficacy of adding spironolactone to an ACEI or ARB in the prevention of diabetic nephropathy.
Methods: In this randomized clinical trial, sixty patients were selected from those referred to a diabetes clinic. The control group received enalapril, and the case group received additive therapy with spironolactone for 12 weeks. Blood pressure, serum and urinary creatinine and albumin concentrations, the urinary albumin/creatinine ratio, and serum potassium were determined for each patient at the beginning of the study and every 4-6 weeks until its end. The trial was registered in the Iranian Registry of Clinical Trials (www.irct.ir) under registration number IRCT201105084849N2.
Results: There was a statistically significant difference in the albumin/creatinine ratio between the two groups (p<0.001): albuminuria was reduced more in the case group than in the control group, measuring 66.6±26.8 mg/mmol in the control group and 45.7±19 mg/mmol in the case group. The patients did not develop any significant adverse effects, including reduction in GFR, hyperkalemia, or hypotension.
Conclusion: Low to moderate doses of spironolactone can augment the effect of ACEIs in the prevention of diabetic nephropathy.
Diabetic nephropathy; Albuminuria; Spironolactone; Angiotensin Converting Enzyme Inhibitors (ACEIs)
The interference of magnitudes in different dimensions has been demonstrated previously, but the effect of training in one dimension on judgment of another has yet to be examined. The present study aimed to investigate the effect of training in numerosity judgment on judgment of duration. Thirty-two participants took part in two sessions, 12 days apart, and had to judge which of two successive sets of items was presented longer. Half of the participants (training group) were additionally trained in 11 sessions to judge which of the two successive sets of items was more numerous. Compared with the control group, the participants in the training group became more prone to the interference of numerosity on judging duration after training. Thus, being trained to perceive more easily the difference in the number of items in the two sets affected the perception of duration. At the 3-month follow-up session, no effect was found with 20 participants (n = 10 per group). These findings indicate that the interference of magnitudes in different dimensions can be modulated by training. We discuss that this modulatory effect might be due to neural changes in brain regions shared between the interfering magnitudes and/or might be mediated by higher levels of perception.
Vertical root fracture (VRF) is a complication that is chiefly diagnosed radiographically. Film-based radiography has recently been replaced by digital radiography, and a wide range of monitors is now available for viewing digital images. The present study aimed to compare the diagnostic accuracy, sensitivity, and specificity of medical-grade and conventional monitors in the detection of vertical root fractures.
Material and Methods
In this in vitro study, 228 extracted single-rooted human teeth were endodontically treated, and vertical root fractures were induced in 114 samples. The teeth were imaged with a charge-coupled device digital radiography system using the paralleling technique. The images were evaluated twice by a radiologist and an endodontist on medical-grade and conventional liquid-crystal display (LCD) monitors. The Z-test was used to analyze the sensitivity, accuracy, and specificity of each monitor, with the significance level set at 0.05. Inter- and intra-observer agreement was calculated with Cohen's kappa.
Accuracy, specificity, and sensitivity for the conventional monitor were 67.5%, 72%, and 62.5%, respectively; the corresponding values for the medical-grade monitor were 67.5%, 66.5%, and 68%. Statistical analysis showed no significant difference between the two monitors in detecting VRF. Inter-observer agreement was 0.47 for the conventional monitor and 0.55 for the medical-grade monitor (moderate). Intra-observer agreement was 0.78 for the medical-grade monitor and 0.87 for the conventional one (substantial).
The type of monitor does not influence diagnosis of vertical root fractures.
Diagnostic Imaging; Diagnosis; Digital Dental Radiology; Monitor; Tooth Fractures; Radiography
Adaptation aftereffects have been found for low-level visual features such as colour, motion and shape perception, as well as higher-level features such as gender, race and identity in domains such as faces and biological motion. It is not yet clear whether adaptation effects in humans extend beyond this set of higher-order features. The aim of this study was to investigate whether objects highly associated with one gender (e.g. high heels for females or electric shavers for males) can modulate gender perception of a face. In two separate experiments, we adapted participants to a series of objects highly associated with one gender and subsequently asked them to judge the gender of an ambiguous face. Results showed that participants were more likely to perceive an ambiguous face as male after being exposed to objects highly associated with females, and vice versa. A gender adaptation aftereffect was obtained despite the adaptor and test stimuli being from different global categories (objects and faces, respectively). These findings show that our perception of gender from faces is strongly affected by our environment and recent experience. This suggests two possible mechanisms: (a) perception of the gender associated with an object shares at least some brain areas with those responsible for gender perception of faces, and (b) adaptation to gender, which is a high-level concept, can modulate brain areas involved in facial gender perception through top-down processes.
There is strong evidence that magnitudes in different dimensions can interfere. A majority of previous studies on the interaction of temporal magnitudes on numerosity showed no interfering effect, while many studies have reported the interference of numerosity on judgement of temporal magnitudes. We speculated that this one-way interference is confounded by the magnitudes used in the studies. We used a methodology that allowed us to study this interaction reciprocally. Moreover, we selected magnitudes for two dimensions that enabled us to detect their interfering effects. Participants had to either judge which of two successive sets of items was more numerous (numerosity judgement task), or which set of items was presented longer (duration judgement task). We hypothesised that a longer presentation of a set will be judged as being more numerous, and vice versa, a more numerous set will be judged as being presented longer. Results confirmed our hypothesis. A positive correlation between duration of presentation and judged numerosity as well as a positive correlation between the number of items and judged duration of presentation was found. This observation supports the idea that duration and numerosity judgements are not completely independent and implies the existence of (partly) generalised and abstract components in the magnitude representations.
To detect nephropathy, measurement of total (24-hour) urinary albumin or of the albumin/creatinine ratio in random urine samples is recommended. However, albumin assays are not available in all laboratories and cost about six times more than urinary total protein measurement.
This study was performed to determine appropriate cut-off points of 24-hour urinary total protein for the diagnosis of micro- and macroalbuminuria in patients with diabetes mellitus.
Patients and Methods
In this study, 204 patients with type I or type II diabetes mellitus were selected. Protein and albumin in the patients' 24-hour urine collections were measured using the pyrogallol and immunoturbidimetry methods, respectively.
Normoalbuminuria (albumin < 30 mg/24-h urine), microalbuminuria (albumin 30-300 mg/24-h urine), and macroalbuminuria (albumin > 300 mg/24-h urine) were detected in 130, 51, and 23 patients, respectively. The amounts of protein and albumin in the 24-hour urine collections were compared to calculate cut-off points of excreted protein for the diagnosis of nephropathy. A cut-off point of 73 mg/day for urinary total protein had appropriate sensitivity (94.5%, CI = 91.4%-97.6%) and specificity (77.9%, CI = 72.8%-82.9%) for microalbuminuria, while a cut-off point of 514 mg/day (sensitivity 95.7%; specificity 98.9%) was determined for the diagnosis of macroalbuminuria. A urinary protein excretion of 150 mg/day, currently considered a normal value in most laboratory kits, had a sensitivity of only 73.1%, by which 30% of microalbuminuric cases remained undiagnosed.
Urinary total protein cut-off points of 73 mg/day and 514 mg/day were diagnostic for micro- and macroalbuminuria, respectively.
Urinary Albumin; Urinary Protein; Diabetic Nephropathy
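The cut-off evaluation described above rests on computing sensitivity and specificity at a candidate threshold. A minimal sketch of that computation is shown below; the function name and the protein values and labels are hypothetical illustrations, not the study's data.

```python
def sens_spec(values, labels, cutoff):
    """Sensitivity and specificity of calling a sample positive when its
    measured value exceeds `cutoff`.
    labels: 1 = condition present (e.g. microalbuminuria), 0 = absent.
    Illustrative sketch; not the study's statistical procedure."""
    tp = sum(1 for v, y in zip(values, labels) if v > cutoff and y == 1)
    fn = sum(1 for v, y in zip(values, labels) if v <= cutoff and y == 1)
    tn = sum(1 for v, y in zip(values, labels) if v <= cutoff and y == 0)
    fp = sum(1 for v, y in zip(values, labels) if v > cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 24-h urinary total protein values (mg/day) and albuminuria labels:
protein = [50, 60, 80, 120, 200, 40, 90, 65]
label   = [0,  0,  1,  1,   1,   0,  0,  1]
sens, spec = sens_spec(protein, label, cutoff=73)
```

Sweeping `cutoff` over the observed range and picking the value that best balances the two rates is the usual way such diagnostic thresholds (here 73 and 514 mg/day) are selected.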
Background: A rise in serum homocysteine level may be associated with the higher prevalence of cardiovascular diseases in hypothyroidism. Levothyroxine can partly diminish the serum homocysteine level, and folic acid participates in the homocysteine metabolic cycle in the human body. The present study evaluated the effect of concomitant administration of folic acid and levothyroxine on serum homocysteine levels.
Methods: Sixty patients with hypothyroidism participated in this double-blind clinical trial. They were divided into two equal groups: group A received oral levothyroxine 50-100 µg daily, and group B took oral folic acid 1 mg daily in addition to levothyroxine on the same schedule as group A. The patients were followed up for two months, and the serum homocysteine levels of both groups were measured before and after the study. The study was registered in the Iranian Registry of Clinical Trials (IRCT number: 201112077723N1).
Results: Mean serum homocysteine level fell from 11.5±4.2 to 9.9±3.5 µmol/L in group A and from 11.2±3.1 to 6.9±1.9 µmol/L in group B (p<0.001). The mean reductions in serum homocysteine level were 1.6±1.2 µmol/L and 4.3±1.4 µmol/L in groups A and B, respectively (p<0.001).
Conclusion: Levothyroxine alone can partly decrease the serum homocysteine level, but its combination with folic acid strengthens this effect: combination therapy lowers serum homocysteine more successfully.
Hypothyroidism; Homocysteine; Folic acid; Levothyroxine
The control of postoperative pain is important in children, as poor pain control leads to organ dysfunction and behavioral problems.
We compared the analgesic effects of suppository acetaminophen, bupivacaine wound infiltration, and caudal block with bupivacaine on postoperative pain in pediatric inguinal herniorrhaphy.
Patients and Methods:
In this double-blind, randomized controlled clinical trial, 90 children of American Society of Anesthesiologists (ASA) grade I-II, aged between 3 months and 7 years, and scheduled for elective unilateral inguinal herniorrhaphy under general anesthesia were assigned to three equal groups. Patients in the first group received 20 mg/kg of suppository acetaminophen. In the second group, 2 mg/kg of 0.5% bupivacaine was infiltrated at the incisional site, and in the third group, a caudal block was performed with 0.75 mL/kg of 0.25% bupivacaine. The Face, Legs, Activity, Cry, Consolability (FLACC) pain scale was applied 30 minutes after the operation, and the FLACC score was then obtained every hour during the next 6 hours. If the FLACC score was 4 or over, we administered 0.5 mg/kg of intravenous meperidine. The data were analyzed in SPSS 10 with chi-square and analysis-of-variance tests; P < 0.05 was considered significant.
The mean analgesic duration in the acetaminophen, bupivacaine infiltration, and caudal block groups was 4.07, 5.40, and 5.37 hours, respectively. Significant differences were not observed between the bupivacaine infiltration and caudal block groups (P = 0.9), but the differences between the bupivacaine infiltration and acetaminophen groups (P = 0.034) and the caudal block and acetaminophen groups (P = 0.039) were significant. With regard to meperidine administration, significant differences were not observed between the bupivacaine infiltration and caudal block groups (P = 0.848), but significant differences were observed between these two groups and the acetaminophen group (P < 0.05).
Patients in the bupivacaine infiltration and caudal block groups had less postoperative pain than those in the acetaminophen group and received lower amounts of meperidine. We conclude that in children, bupivacaine infiltration and caudal block with bupivacaine produce better analgesia than suppository acetaminophen. Bupivacaine infiltration may be preferable to caudal block because of its simplicity and its lower incidence of complications and failure.
Bupivacaine; Anesthesia, Caudal; Pediatrics; Analgesia; Suppositories; Acetaminophen
Some reports have highlighted zinc deficiency and its associated outcomes, together with changes in serum copper concentration, among thalassemic patients. The aim of this study was to determine the serum zinc and copper levels in children with beta-thalassemia major.
In this cross-sectional study, all children under 12 years of age with beta-thalassemia major (40 patients) at the Qazvin thalassemia center (Qazvin, Iran) were evaluated for serum zinc and copper levels in 2007. Serum zinc and copper were measured by atomic absorption spectrophotometry.
The mean serum zinc and copper concentrations were 67.35±20.38 and 152.42±24.17 µg/dl, respectively. Twenty-six (65%) of the thalassemic patients had a zinc concentration under 70 µg/dl (hypozincemia), whereas none of the thalassemic children had copper deficiency. No significant correlation was observed between serum zinc level and age, weight, height, body mass index, duration of blood transfusion, desferrioxamine dose, or ferritin level (P=0.3).
This study revealed that hypozincemia is common in thalassemic patients, whereas copper deficiency is not. Further evaluation in this regard is recommended.
Beta-thalassemia; Zinc; Copper; Children
Objective. Febrile seizures are the most common type of convulsion in children, and identifying the factors that influence the incidence of first febrile seizures is a high priority. The aim of this study was to identify the risk factors for first febrile seizures in Iranian children. Methods. In this case-control study, conducted in 2007, 80 children aged 9 months to 5 years presenting with their first febrile seizure were compared with 80 children with fever but without seizure with respect to various risk factors. Results. There were significant differences between the two groups in gender, family history of febrile seizures, breast-feeding duration, and body temperature (P < .05). Conclusion. Our study showed that gender, family history of febrile seizures, breast-feeding duration, and body temperature are risk factors for the occurrence of a first febrile seizure. Preventive measures addressing such risk factors could lower the incidence of febrile seizures.