Immune privilege is used by the eye, brain, reproductive organs and gut to preserve structural and functional integrity in the face of inflammation. The eye is arguably the most vulnerable, and therefore also the most “privileged” of tissues, but paradoxically, remains subject to destructive autoimmunity. It has been proposed, although never proven in vivo, that the eye can induce T regulatory cells (Tregs) locally. Using FoxP3-GFP reporter mice expressing a retina-specific T cell receptor, we now show that uncommitted T cells rapidly convert in the living eye to FoxP3+ Tregs in a process involving retinal antigen recognition, de novo FoxP3 induction and proliferation. This takes place within the ocular tissue and is supported by retinoic acid, which is normally present in the eye due to its function in the chemistry of vision. Non-converted T cells showed evidence of priming, but appeared restricted from expressing effector function in the eye. Preexisting ocular inflammation impeded conversion of uncommitted T cells into Tregs. Importantly, retina-specific T cells primed in vivo before introduction into the eye were resistant to Treg conversion in the ocular environment, and instead caused severe uveitis. Thus, uncommitted T cells can be disarmed, but immune privilege is unable to protect from uveitogenic T cells that have acquired effector function prior to entering the eye. These findings shed new light on the phenomenon of immune privilege and on its role, as well as its limitations, in actively controlling immune responses in the tissue.
Most information about the lifetime prevalence of mental disorders comes from retrospective surveys, but how much these surveys have undercounted due to recall failure is unknown. We compared results from a prospective study with those from retrospective studies.
The representative 1972–1973 Dunedin New Zealand birth cohort (n=1037) was followed to age 32 years with 96% retention, and compared to the national New Zealand Mental Health Survey (NZMHS) and two US National Comorbidity Surveys (NCS and NCS-R). Measures were research diagnoses of anxiety, depression, alcohol dependence and cannabis dependence from ages 18 to 32 years.
The prevalence of lifetime disorder to age 32 was approximately doubled in prospective as compared to retrospective data for all four disorder types. Moreover, across disorders, prospective measurement yielded a mean past-year-to-lifetime ratio of 38% whereas retrospective measurement yielded higher mean past-year-to-lifetime ratios of 57% (NZMHS, NCS-R) and 65% (NCS).
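The past-year-to-lifetime ratio used here is simply past-year prevalence divided by lifetime prevalence, expressed as a percentage. A minimal sketch, using hypothetical prevalence figures rather than the study's data:

```python
def past_year_to_lifetime_ratio(past_year_prev, lifetime_prev):
    """Past-year prevalence as a percentage of lifetime prevalence."""
    return 100.0 * past_year_prev / lifetime_prev

# Hypothetical illustration (not the study's figures): if 15% of a cohort
# meets diagnostic criteria in the past year and 40% has ever met them,
# the past-year-to-lifetime ratio is 37.5%.
print(round(past_year_to_lifetime_ratio(0.15, 0.40), 1))  # → 37.5
```

A lower prospective ratio, as reported above, reflects a larger lifetime denominator relative to past-year cases, consistent with retrospective undercounting of lifetime disorder.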
Prospective longitudinal studies complement retrospective surveys by providing unique information about lifetime prevalence. The experience of at least one episode of DSM-defined disorder during a lifetime may be far more common in the population than previously thought. Research should ask what this means for etiological theory, construct validity of the DSM approach, public perception of stigma, estimates of the burden of disease and public health policy.
Epidemiology; longitudinal; prevalence; psychiatry; retrospective
To understand why children exposed to adverse psychosocial experiences are at elevated risk for age-related disease, such as cardiovascular disease, by testing whether adverse childhood experiences predict enduring abnormalities in stress-sensitive biological systems, namely, the nervous, immune, and endocrine/metabolic systems.
A 32-year prospective longitudinal study of a representative birth cohort.
A total of 1037 members of the Dunedin Multidisciplinary Health and Development Study.
During their first decade of life, study members were assessed for exposure to 3 adverse psychosocial experiences: socioeconomic disadvantage, maltreatment, and social isolation.
Main Outcome Measures
At age 32 years, study members were assessed for the presence of 3 age-related-disease risks: major depression, high inflammation levels (high-sensitivity C-reactive protein level >3 mg/L), and the clustering of metabolic risk biomarkers (overweight, high blood pressure, high total cholesterol, low high-density lipoprotein cholesterol, high glycated hemoglobin, and low maximum oxygen consumption levels).
Children exposed to adverse psychosocial experiences were at elevated risk of depression, high inflammation levels, and clustering of metabolic risk markers. Children who had experienced socioeconomic disadvantage (incidence rate ratio, 1.89; 95% confidence interval, 1.36–2.62), maltreatment (1.81; 1.38–2.38), or social isolation (1.87; 1.38–2.51) had elevated age-related-disease risks in adulthood. The effects of adverse childhood experiences on age-related-disease risks in adulthood were nonredundant, cumulative, and independent of the influence of established developmental and concurrent risk factors.
Children exposed to adverse psychosocial experiences have enduring emotional, immune, and metabolic abnormalities that contribute to explaining their elevated risk for age-related disease. The promotion of healthy psychosocial experiences for children is a necessary and potentially cost-effective target for the prevention of age-related disease.
Using data from the large, 30-year prospective Dunedin cohort study, we examined whether preexisting individual differences in childhood temperament predicted adulthood disordered gambling (a diagnosis covering the full continuum of gambling-related problems). A 90-min observational assessment at age 3 was used to categorize children into five temperament groups, including one primarily characterized by behavioral and emotional undercontrol. The children with undercontrolled temperament at 3 years of age were more than twice as likely to evidence disordered gambling at ages 21 and 32 as were children who were well-adjusted at age 3. These associations could not be explained by differences in childhood IQ or family socioeconomic status. Cleanly demonstrating the temporal relation between behavioral undercontrol and adult disordered gambling is an important step toward building more developmentally sensitive theories of disordered gambling and may put researchers in a better position to begin considering potential routes to disordered-gambling prevention through enhancing self-control and emotional regulation.
disordered gambling; undercontrol; personality; temperament; prediction; psychopathology; self-control
Like many metazoans, the freshwater prawn Macrobrachium rosenbergii begins its post-embryonic life with a set of morphologically distinct planktonic larval stages, followed by a benthic post-larval stage during which the maturing organism differs from the larvae both ecologically and physiologically. Understanding of the molecular basis underlying morphogenesis in crustaceans is limited to the observation that methyl farnesoate, the non-epoxidated form of the insect juvenile hormone, acts as the active crustacean juvenoid. Molt steroids were also linked to morphogenesis, and several other molecular pathways, such as Hedgehog and Wnt, are known to underlie morphogenesis in all metazoans examined and, as such, are thought to do the same in crustaceans. Using next-generation sequencing, we deep-sequenced the transcriptomes of several larval and post-larval stages. De novo assembly, followed by bioinformatics analysis, revealed that many novel transcripts are over-expressed in either larvae- or post-larvae-stage prawn, shedding light on the molecular basis underlying M. rosenbergii metamorphosis. Fast larval molting rates and periodic morphological changes were reflected in over-expression of transcripts annotated to the cell cycle, DNA replication and morphogenic pathways (i.e., Hedgehog and Wnt). Further characterization of transcripts assigned to morphogenic pathways by real-time RT-PCR reconfirmed their over-expression in larvae, albeit with a more complex expression pattern when examined in the individual developmental stages. The expression level of an orthologue of cytochrome P450 15A1 (CYP15A1), known to epoxidize methyl farnesoate in insects, was increased in the late larval and early post-larval stages, in accordance with the role of methyl farnesoate in crustacean metamorphosis.
This study exemplifies the applicability of a high-throughput sequencing approach for studying complex traits, including metamorphosis, providing new insight into this unexplored area of crustacean research.
Premorbid cognitive deficits in schizophrenia are well documented and have been interpreted as supporting a neurodevelopmental etiological model. The authors investigated the following three unresolved questions about premorbid cognitive deficits: What is their developmental course? Do all premorbid cognitive deficits follow the same course? Are premorbid cognitive deficits specific to schizophrenia or shared by other psychiatric disorders?
Participants were members of a representative cohort of 1,037 males and females born between 1972 and 1973 in Dunedin, New Zealand. Cohort members underwent follow-up evaluations at specific intervals from age 3 to 32 years, with a 96% retention rate. Cognitive development was analyzed and compared in children who later developed schizophrenia or recurrent depression as well as in healthy comparison subjects.
Children who developed adult schizophrenia exhibited developmental deficits (i.e., static cognitive impairments that emerge early and remain stable) on tests indexing verbal and visual knowledge acquisition, reasoning, and conceptualization. In addition, these children exhibited developmental lags (i.e., growth that is slower relative to healthy comparison subjects) on tests indexing processing speed, attention, visual-spatial problem solving ability, and working memory. These two premorbid cognitive patterns were not observed in children who later developed recurrent depression.
These findings suggest that the origins of schizophrenia include two interrelated developmental processes evident from childhood to early adolescence (ages 7–13 years). Children who will grow up to develop adult schizophrenia enter primary school struggling with verbal reasoning and lag further behind their peers in working memory, attention, and processing speed as they get older.
In this article we report a graded relationship between neighborhood socioeconomic status (SES) and children’s antisocial behavior that (1) can be observed at school entry, (2) widens across childhood, (3) remains after controlling for family-level SES and risk, and (4) is completely mediated by maternal warmth and parental monitoring (defined throughout as supportive parenting). Children were participants in the Environmental Risk (E-Risk) Longitudinal Twin Study (n=2232), which prospectively tracked the development of children and their neighborhoods across childhood. Direct and independent effects of neighborhood-level SES on children’s antisocial behavior were observed as early as age 5 and the gap between children living in deprived versus more affluent neighborhoods widened as children approached adolescence. By age 12, the effect of neighborhood socioeconomic status on children’s antisocial behavior was as large as the effect observed for our most robust predictor of antisocial behavior – sex! (Cohen’s d = .51 when comparing children growing up in deprived versus more affluent neighborhoods in comparison to Cohen’s d = .53 when comparing antisocial behavior among boys versus girls). However, differences in children’s levels and rate of change in antisocial behavior across deprived versus more affluent neighborhoods were completely mediated by supportive parenting practices. Implications of our findings for studying and reducing socioeconomic disparities in antisocial behavior among children are discussed.
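The effect sizes quoted above are Cohen's d values: the difference between group means divided by the pooled standard deviation. A minimal sketch on made-up scores (not study data):

```python
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference between two independent groups."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    # Sample variances (n - 1 denominator), then the pooled SD.
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

# Made-up antisocial-behavior scores for two hypothetical groups.
print(cohens_d([2.0, 3.0, 4.0], [0.0, 1.0, 2.0]))  # → 2.0
```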
childhood antisocial behavior; neighborhood poverty; socioeconomic status; social inequalities; maternal warmth; parental monitoring; supportive parenting
It has been reported that borderline personality related characteristics can be observed in children, and that these characteristics are associated with increased risk for the development of borderline personality disorder. It is not clear whether borderline personality related characteristics in children share etiological features with adult borderline personality disorder. We investigated the etiology of borderline personality related characteristics in a longitudinal cohort study of 1,116 pairs of same-sex twins followed from birth through age 12 years. Borderline personality related characteristics measured at age 12 years were highly heritable, were more common in children who had exhibited poor cognitive function, impulsivity, and more behavioral and emotional problems at age 5 years, and co-occurred with symptoms of conduct disorder, depression, anxiety, and psychosis. Exposure to harsh treatment in the family environment through age 10 years predicted borderline personality related characteristics at age 12 years. This association showed evidence of environmental mediation and was stronger among children with a family history of psychiatric illness, consistent with diathesis–stress models of borderline etiology. Results indicate that borderline personality related characteristics in children share etiological features with borderline personality disorder in adults and suggest that inherited and environmental risk factors make independent and interactive contributions to borderline etiology.
Teaching children requires effort and some children naturally require more effort than others. This study tests whether teacher effort devoted to individual children varies as a function of children’s personal characteristics. Using a nation-wide longitudinal study of twins followed between ages 5-12 years, we asked teachers about the effort they invested in each child enrolled in our study. We found that teacher effort was a function of heritable child characteristics; that children’s challenging behavior assessed at age 5 predicted teacher effort at age 12; and that challenging child behavior and teacher effort share common etiology in children’s genes. While child effects accounted for a significant proportion of variance in teacher effort, we also found variation that could not be attributed to children’s behavior. Treating children with challenging behavior and enhancing teachers’ skills in behavior management could increase the time and energy teachers have to deliver curriculum in their classrooms.
Children growing up in poor versus affluent neighborhoods are more likely to spend time in prison, develop health problems and die at an early age. The question of how neighborhood conditions influence our behavior and health has attracted the attention of public health officials and scholars for generations. Online tools are now providing new opportunities to measure neighborhood features and may provide a cost effective way to advance our understanding of neighborhood effects on child health.
A virtual systematic social observation (SSO) study was conducted to test whether Google Street View could be used to reliably capture the neighborhood conditions of families participating in the Environmental-Risk (E-Risk) Longitudinal Twin Study. Multiple raters coded a subsample of 120 neighborhoods, and convergent and discriminant validity were evaluated on the full sample of over 1,000 neighborhoods by linking virtual SSO measures to: (a) consumer based geo-demographic classifications of deprivation and health, (b) local resident surveys of disorder and safety, and (c) parent and teacher assessments of children’s antisocial behavior, prosocial behavior, and body mass index.
High levels of observed agreement were documented for signs of physical disorder, physical decay, dangerousness and street safety. Inter-rater agreement estimates fell within the moderate to substantial range for all of the scales (ICCs ranged from .48 to .91). Negative neighborhood features, including SSO-rated disorder, decay, and dangerousness, corresponded with local resident reports, demonstrated a graded relationship with census-defined indices of socioeconomic status, and predicted higher levels of antisocial behavior among local children. In addition, positive neighborhood features, including SSO-rated street safety and the percentage of green space, were associated with higher prosocial behavior and healthy weight status among children.
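The ICCs reported here are intraclass correlation coefficients quantifying inter-rater agreement. As one illustrative variant (the abstract does not specify which ICC form or software was used), a one-way random-effects ICC(1,1) can be computed from an analysis-of-variance decomposition:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).
    ratings: one list per subject, each containing k raters' scores."""
    n = len(ratings)        # number of subjects (e.g., neighborhoods)
    k = len(ratings[0])     # raters per subject
    subject_means = [sum(r) / k for r in ratings]
    grand_mean = sum(subject_means) / n
    # Between-subjects and within-subjects mean squares.
    msb = k * sum((m - grand_mean) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(ratings, subject_means)
              for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect agreement gives ICC = 1.0; rater disagreement lowers it.
print(icc_oneway([[1, 1], [2, 2], [3, 3]]))            # → 1.0
print(round(icc_oneway([[1, 2], [2, 3], [3, 4]]), 2))  # → 0.6
```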
Our results support the use of Google Street View as a reliable and cost effective tool for measuring both negative and positive features of local neighborhoods.
Systematic social observation; Google Street View; neighborhood disorder; neighborhood deprivation; antisocial behavior; body mass index
Using longitudinal and prospective measures of trauma during childhood, the authors assessed the risk of developing psychotic symptoms associated with maltreatment, bullying, and accidents in a nationally representative U.K. cohort of young twins.
Data were from the Environmental Risk Longitudinal Twin Study, which follows 2,232 twin children and their families. Mothers were interviewed during home visits when children were ages 5, 7, 10, and 12 on whether the children had experienced maltreatment by an adult, bullying by peers, or involvement in an accident. At age 12, children were asked about bullying experiences and psychotic symptoms. Children’s reports of psychotic symptoms were verified by clinicians.
Children who experienced maltreatment by an adult (relative risk=3.16, 95% CI=1.92–5.19) or bullying by peers (relative risk=2.47, 95% CI=1.74–3.52) were more likely to report psychotic symptoms at age 12 than were children who did not experience such traumatic events. The higher risk for psychotic symptoms was observed whether these events occurred early in life or later in childhood. The risk associated with childhood trauma remained significant in analyses controlling for children’s gender, socioeconomic deprivation, and IQ; for children’s early symptoms of internalizing or externalizing problems; and for children’s genetic liability to developing psychosis. In contrast, the risk associated with accidents was small (relative risk=1.47, 95% CI=1.02–2.13) and inconsistent across ages.
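A relative risk, as reported above, compares the probability of the outcome between exposed and unexposed groups. A minimal sketch with hypothetical counts (not the study's data):

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Risk of the outcome among the exposed divided by risk among the unexposed."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical counts: 50 of 200 exposed vs 25 of 200 unexposed children
# report the outcome, giving a relative risk of 2.0.
print(relative_risk(50, 200, 25, 200))  # → 2.0
```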
Trauma characterized by intention to harm is associated with children’s reports of psychotic symptoms. Clinicians working with children who report early symptoms of psychosis should inquire about traumatic events such as maltreatment and bullying.
To test how genomic loci identified in genome-wide association studies influence the development of obesity.
A 38-year prospective longitudinal study of a representative birth cohort.
The Dunedin Multidisciplinary Health and Development Study, Dunedin, New Zealand.
One thousand thirty-seven male and female study members.
We assessed genetic risk with a multilocus genetic risk score. The genetic risk score was composed of single-nucleotide polymorphisms identified in genome-wide association studies of obesity-related phenotypes. We assessed family history from parent body mass index data collected when study members were 11 years of age.
Main Outcome Measures
Body mass index growth curves, developmental phenotypes of obesity, and adult obesity outcomes were defined from anthropometric assessments at birth and at 12 subsequent in-person interviews through 38 years of age.
Individuals with higher genetic risk scores were more likely to be chronically obese in adulthood. Genetic risk first manifested as rapid growth during early childhood. Genetic risk was unrelated to birth weight. After birth, children at higher genetic risk gained weight more rapidly and reached adiposity rebound earlier and at a higher body mass index. In turn, these developmental phenotypes predicted adult obesity, mediating about half the genetic effect on adult obesity risk. Genetic associations with growth and obesity risk were independent of family history, indicating that the genetic risk score could provide novel information to clinicians.
Genetic variation linked with obesity risk operates, in part, through accelerating growth in the early childhood years after birth. Etiological research and prevention strategies should target early childhood to address the obesity epidemic.
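The multilocus genetic risk score described in this abstract aggregates risk alleles across GWAS-identified SNPs. A minimal sketch (the SNP identifiers and weights below are hypothetical, and the study's exact scoring scheme is not restated here):

```python
def genetic_risk_score(risk_allele_counts, weights=None):
    """Multilocus genetic risk score.

    risk_allele_counts: dict mapping SNP id -> risk-allele count (0, 1, or 2).
    weights: optional dict of per-SNP effect sizes; plain allele-count sum if None.
    """
    if weights is None:
        return sum(risk_allele_counts.values())
    return sum(weights[snp] * count for snp, count in risk_allele_counts.items())

# Hypothetical genotypes at three illustrative loci (not the study's SNP panel).
genotypes = {"snpA": 2, "snpB": 1, "snpC": 0}
print(genetic_risk_score(genotypes))  # unweighted: → 3
print(round(genetic_risk_score(genotypes,
                               {"snpA": 0.3, "snpB": 0.1, "snpC": 0.2}), 2))
```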
We noted an unexpected inheritance pattern of lesions in several strains of gene-manipulated mice with ocular phenotypes. The lesions, which appeared at various stages of backcross to C57BL/6, bore resemblance to the rd8 retinal degeneration phenotype. We set out to examine the prevalence of this mutation in induced mutant mouse lines, vendor C57BL/6 mice and in widely used embryonic stem cells.
Ocular lesions were evaluated by fundus examination and histopathology. Detection of the rd8 mutation at the genetic level was performed by PCR with appropriate primers. Data were confirmed by DNA sequencing in selected cases.
Analysis of several induced mutant mouse lines with ocular disease phenotypes revealed that the disease was associated 100% with the presence of the rd8 mutation in the Crb1 gene rather than with the gene of interest. DNA analysis of C57BL/6 mice from common commercial vendors demonstrated the presence of the rd8 mutation in homozygous form in all C57BL/6N substrains, but not in the C57BL/6J substrain. A series of commercially available embryonic stem cells of C57BL/6N origin and C57BL/6N mouse lines used to generate ES cells also contained the rd8 mutation. Affected mice displayed ocular lesions typical of rd8, which were detectable by funduscopy and histopathology as early as 6 weeks of age.
These findings identify the presence of the rd8 mutation in the C57BL/6N mouse substrain used widely to produce transgenic and knockout mice. The results have grave implications for the vision research community who develop mouse lines to study eye disease, as presence of rd8 can produce significant disease phenotypes unrelated to the gene or genes of interest. It is suggested that researchers screen for rd8 if their mouse lines were generated on the C57BL/6N background, bear resemblance to the rd8 phenotype, or are of indeterminate origin.
The rd8 mutation of the Crb1 gene, which can cause significant retinal degeneration, was found in homozygous form in the C57BL/6N mouse substrain. It is present in stocks from most major mouse vendors and in ES cells derived from the C57BL/6N substrain. Its presence can confound interpretation of ocular mutant phenotypes.
The eye is an immunologically privileged and profoundly immunosuppressive environment. Early studies reported inhibition of T cell proliferation, IFN-γ production and generation of T regulatory cells (Treg) by aqueous humor (AH), and identified TGF-β as a critical factor. However, T cell subsets including FoxP3+ Treg and Th17 were unknown at that time, as was the role of retinoic acid (RA) in Treg induction. Consequently, the effect of the ocular microenvironment on T cell lineage commitment and function, and the role of RA in this process, had not been explored. We now use gene-manipulated mice and highly purified T cell populations to demonstrate that AH suppresses lineage commitment and acquisition of Th1 and Th17 effector function of naïve T cells, manifested as reduction of lineage-specific transcription factors and cytokines. Instead, AH promoted their massive conversion to FoxP3+ Treg that expressed CD25, GITR, CTLA-4 and CD103 and were functionally suppressive. TGF-β and RA were both needed and synergized for Treg conversion by AH, with TGF-β enhancing T cell expression of RARα. Newly converted FoxP3+ Tregs were unstable, but were stabilized upon continued exposure to AH or by the DNA demethylating agent 5-AZA. In contrast, T cells already committed to effector function were resistant to the suppressive and Treg-inducing effects of AH. We conclude that RA in the eye plays a dual role: in vision and in immune privilege. Nevertheless, primed effector T cells are relatively insensitive to AH, helping to explain their ability to induce uveitis despite an inhibitory ocular microenvironment.
Autoimmune uveitis is a complex group of sight-threatening diseases that arise without a known infectious trigger. The disorder is often associated with immunological responses to retinal proteins. Experimental models of autoimmune uveitis targeting retinal proteins have led to a better understanding of the basic immunological mechanisms involved in the pathogenesis of uveitis and have provided a template for the development of novel therapies. The disease in humans is believed to be T cell-dependent, as clinical uveitis is ameliorated by T cell-targeting therapies. The roles of T helper 1 (Th1) and Th17 cells have been major topics of interest in the past decade. Studies in uveitis patients and experiments in animal models have revealed that Th1 and Th17 cells can both be pathogenic effectors, although, paradoxically, some cytokines produced by these subsets can also be protective, depending on when and where they are produced. The major proinflammatory as well as regulatory cytokines in uveitis, the therapeutic approaches, and benefits of targeting these cytokines will be discussed in this review.
Evidence from multiple avenues of pathogen recognition is accumulating that the mitochondria form an integral platform from which innate signaling takes place. Recent studies revealed that the mitochondria are shaping the innate response to intracellular pathogens, and mitochondrial function is modulating and being modulated by innate immune signaling. Further, cell biological analyses have uncovered the dynamic relocalization of key components involved in cytosolic viral recognition and signaling to the mitochondria, and the mobilization of mitochondria to the sites of viral replication. In this review, we provide an integrated view of how cellular stress and signals following cytosolic viral recognition are intimately linked and coordinated at the mitochondria. We incorporate recent findings into our current understanding of the role of mitochondrial function in antiviral immunity and suggest the existence of a ‘mitoxosome’, a mitochondrial oxidative signalosome where multiple pathways of viral recognition and cellular stress signals converge on the surface of the mitochondria to facilitate a coordinated antiviral response.
viral; signaling proteins; Toll-like receptors/pattern recognition receptors; apoptosis/autophagy; inflammation
Thanks to the confluence of genome sequencing and bioinformatics, the number of metabolic databases has expanded from a handful in the mid-1990s to several thousand today. These databases lie within distinct families that have common ancestry and common attributes. The main families are the MetaCyc, KEGG, Reactome, Model SEED, and BiGG families. We survey these database families, as well as important individual metabolic databases, including multiple human metabolic databases. The MetaCyc family is described in particular detail. It contains well over 1,000 databases, including highly curated databases for Escherichia coli, Saccharomyces cerevisiae, Mus musculus, and Arabidopsis thaliana. These databases are available through a number of web sites that offer a range of software tools for querying and visualizing metabolic networks. These web sites also provide multiple tools for analysis of gene expression and metabolomics data, including visualization of those datasets on metabolic network diagrams, and overrepresentation analysis of gene sets and metabolite sets.
Noninfectious uveitis is a leading cause of blindness and is thought to involve autoimmune T cell responses to retinal proteins, e.g., retinal arrestin (S-Ag). There are no known biomarkers for the disease. Susceptibility is associated with HLA, but little is known about susceptible class II alleles or the potentially pathogenic epitopes that they present. Using a ‘humanized’ HLA-transgenic mouse model of S-Ag-induced autoimmune uveitis, we identified several susceptible and resistant alleles of HLA-DR and -DQ genes and defined pathogenic epitopes of S-Ag presented by the susceptible alleles. The sequences of these epitopes overlap with some previously identified peptides of S-Ag (“M” and “N”), known to elicit memory responses in lymphocytes of uveitis patients. HLA-DR-restricted, S-Ag-specific CD4+ T cells could be detected in blood and draining lymph nodes of uveitic mice with HLA class II tetramers and transferred the disease to healthy mice. Importantly, tetramer-positive cells were detected in peripheral blood of a uveitis patient. These findings provide the first tangible evidence that an autoimmune response to retina is causally involved in pathogenesis of human uveitis, demonstrate the feasibility of identifying and isolating retinal antigen-specific T cells from uveitis patients and may facilitate their development as biomarkers for the disease.
HLA class II; Uveitis; S-antigen; Arrestin; Autoimmune biomarker
Mesenchymal stem cells inhibit experimental autoimmune uveitis, and their immunomodulatory function is due at least in part to the induction of antigen-specific Treg in a paracrine fashion by the secretion of TGFβ.
Mesenchymal stem/progenitor cells (MSCs) have regenerative and immunomodulatory properties, exerted by cell-cell contact and in a paracrine fashion. Part of their immunosuppressive activity has been ascribed to their ability to promote the induction of CD4+CD25+FoxP3+ T lymphocytes with regulatory functions (Treg). Here the authors studied the effect of MSCs on the induction of Treg and on the development of autoimmunity, and they examined the possibility that MSC-mediated Treg induction could be attributed to the secretion of soluble factors.
The authors induced experimental autoimmune uveitis (EAU) in mice by immunization with the 1–20 peptide of interphotoreceptor retinoid-binding protein. At the same time, some of the animals were treated intraperitoneally with syngeneic MSCs. The authors checked T-cell responses and in vitro Treg conversion by cell proliferation and blocking assays, in cell-cell contact and transwell settings. TGFβ and TGFβ receptor gene expression analyses were performed by real-time PCR.
The authors found that a single intraperitoneal injection of MSCs was able to significantly attenuate EAU and that a significantly higher percentage of adaptive Treg was present in MSC-treated mice than in MSC-untreated animals. In vitro blocking of antigen presentation by major histocompatibility complex class II precluded priming and clonal expansion of antigen-specific Treg, whereas blockade of TGFβ impaired the expression of FoxP3, preventing the conversion of CD4+ T cells into functionally active Treg.
The authors demonstrated that MSCs can inhibit EAU and that their immunomodulatory function is due at least in part to the induction of antigen-specific Treg in a paracrine fashion by secreting TGFβ.
The authors show that RPE cells induce MDSC differentiation, which could be another mechanism by which RPE cells regulate immune responses in the retina.
To test whether retinal pigment epithelial (RPE) cells are able to induce myeloid-derived suppressor cell (MDSC) differentiation from bone marrow (BM) progenitors.
BM cells were cocultured with or without RPE cells in the presence of GM-CSF and IL-4. Numbers of resultant MDSCs were assessed by flow cytometry after 6 days of incubation. The ability of the RPE cell–induced MDSCs to inhibit T cells was evaluated by a CFSE-based T-cell proliferation assay. To explore the mechanism by which RPE cells induce MDSC differentiation, PD-L1–deficient RPE cells and blocking antibodies against TGF-β, CTLA-2α, and IL-6 were used. RPE cell–induced MDSCs were adoptively transferred into mice immunized with interphotoreceptor retinoid-binding protein (IRBP) in complete Freund's adjuvant to test their efficacy in suppressing autoreactive T-cell responses in experimental autoimmune uveitis (EAU).
RPE cells induced the differentiation of MDSCs. These RPE cell–induced MDSCs significantly inhibited T-cell proliferation in a dose-dependent manner. PD-L1–deficient RPE cells induced MDSC differentiation as efficiently as wild-type RPE cells, and neutralizing TGF-β or CTLA-2α did not alter the numbers of induced MDSCs. However, blocking IL-6 reduced the efficacy of RPE cell–induced MDSC differentiation. Finally, adoptive transfer of RPE cell–induced MDSCs suppressed IRBP-specific T-cell responses that led to EAU.
RPE cells induce the differentiation of MDSCs from bone marrow progenitors. Both cell surface molecules and soluble factors are important in inducing MDSC differentiation. PD-L1, TGF-β, and CTLA-2α were not measurably involved in RPE cell–induced MDSC differentiation, whereas IL-6 was important in the process. The induction of MDSCs could be another mechanism by which RPE cells control immune reactions in the retina, and RPE cell–induced MDSCs should be further investigated as a potential approach to therapy for autoimmune posterior uveitis.
To determine the extent to which subjects implanted with the Argus II retinal prosthesis improve performance on a spatial-motor task relative to their residual native vision.
High-contrast square stimuli (5.85 cm sides) were displayed in random locations on a 19″ (48.3 cm) touch screen monitor located 12″ (30.5 cm) in front of the subject. Subjects were instructed to locate and touch the square centre with the system on and then off (40 trials each). The coordinates of the square centre and location touched were recorded.
Ninety-six percent (26/27) of subjects showed a significant improvement in accuracy and 93% (25/27) showed a significant improvement in repeatability with the system on compared with off (p<0.05, Student t test). A group of five subjects who had both accuracy and repeatability values <250 pixels (7.4 cm) with the system off (ie, using only their residual vision) was significantly more accurate and repeatable than the remainder of the cohort (p<0.01). Of this group, four subjects showed a significant improvement in both accuracy and repeatability with the system on.
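Accuracy and repeatability in a touch-screen localization task of this kind can be computed directly from the recorded coordinates: accuracy as the mean distance of touches from the target centre, repeatability as the spread of touches around their own centroid, with a two-sample t statistic comparing the system-on and system-off error distances per subject. A sketch under those assumptions (Welch's form of the t statistic is used here; the study's exact test variant is not specified):

```python
import math

def metrics(touches, target):
    """Per-subject metrics for a square-localization task.

    accuracy      = mean Euclidean distance from each touch to the target centre
    repeatability = mean Euclidean distance from each touch to the centroid
                    of all touches (spread of responses)
    """
    errs = [math.dist(t, target) for t in touches]
    cx = sum(x for x, _ in touches) / len(touches)
    cy = sum(y for _, y in touches) / len(touches)
    spread = [math.dist(t, (cx, cy)) for t in touches]
    return sum(errs) / len(errs), sum(spread) / len(spread)

def welch_t(a, b):
    """Welch's t statistic for two independent samples, e.g. one subject's
    40 system-on vs 40 system-off error distances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))
```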
In a study on the largest cohort of visual prosthesis recipients to date, we found that artificial vision augments information from existing vision in a spatial-motor task.
Clinical trials registry no
The Argus™ II 60-channel epiretinal prosthesis has been developed to provide partial restoration of vision to subjects blinded by outer retinal degenerative disease. To date, the device has been implanted in 21 subjects as part of a feasibility study. In orientation and mobility testing (door finding and line tracking) at 6 months post-implantation, subjects showed improvements of 86% and 73%, respectively, for system on vs. system off. In high-contrast Square Localization tests using a touch screen monitor, 87% of tested subjects performed significantly better with the system on compared with off. These preliminary results show that the Argus II system provides some functional vision to blind subjects.
The present study examines the relation between psychopathy assessed at age 13 using the mother-reported Childhood Psychopathy Scale (Lynam, 1997) and psychopathy assessed at age 24 using the interviewer-rated Psychopathy Checklist: Screening Version (PCL:SV; Hart, Cox, and Hare, 1995). Data from over 250 participants of the middle sample of the Pittsburgh Youth Study were used to examine this relation; approximately 9% of the sample met criteria for a possible PCL:SV diagnosis. Despite the long time-lag, different sources, and different methods, psychopathy from early adolescence into young adulthood was moderately stable, r = 0.31. The relation was present for the PCL:SV total and facet scores, was not moderated by initial risk status or initial psychopathy level, and held even after controlling for other age-13 variables. “Diagnostic” stability was somewhat lower. Specificity and negative predictive power were both good, sensitivity was adequate, but positive predictive power was poor. This constitutes the first demonstration of the relative stability of psychopathy from adolescence into adulthood and provides evidence for the incremental utility of the adolescent psychopathy construct. Implications and future directions are discussed.
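The diagnostic-stability metrics reported (sensitivity, specificity, positive and negative predictive power) all derive from a single 2×2 cross-classification of age-13 risk status against age-24 PCL:SV diagnosis. A minimal sketch with hypothetical counts chosen only to mirror a rare outcome like the ~9% base rate here (not the study's data):

```python
def diagnostic_stability(tp, fp, fn, tn):
    """Agreement metrics from a 2x2 table predicting an age-24 'diagnosis'
    from age-13 status:
        tp: high at 13, diagnosed at 24    fp: high at 13, not diagnosed
        fn: low at 13, diagnosed at 24     tn: low at 13, not diagnosed
    """
    return {
        "sensitivity": tp / (tp + fn),  # diagnosed cases flagged early
        "specificity": tn / (tn + fp),  # non-cases correctly passed over
        "ppv": tp / (tp + fp),          # positive predictive power
        "npv": tn / (tn + fn),          # negative predictive power
    }

# Hypothetical counts: with a rare outcome, specificity and NPV come out
# good while PPV is poor -- the qualitative pattern the study reports.
print(diagnostic_stability(tp=10, fp=30, fn=12, tn=198))
```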
adolescent psychopathy; Childhood Psychopathy Scale; chronic offending; longitudinal
To determine whether parental periodontal disease history is a risk factor for periodontal disease in adult offspring.
Proband periodontal examination (combined attachment loss (CAL) at age 32, and incidence of CAL from ages 26–32) and interview data were collected during the age-32 assessments in the Dunedin Study. Parental data were also collected. The sample was divided into two familial-risk groups for periodontal disease (high- and low-risk) based on parents’ self-reported periodontal disease.
Periodontal risk analysis involved 625 proband–parent(s) groups. After controlling for confounding factors, the high-familial-risk periodontal group was more likely than the low-familial-risk group to have 1+ sites with 4+ mm CAL (RR 1.45; 95% CI 1.11–1.88), 2+ sites with 4+ mm CAL (RR 1.45; 95% CI 1.03–2.05), 1+ sites with 5+ mm CAL (RR 1.60; 95% CI 1.02–2.50) and 1+ sites with 3+ mm incident CAL (RR 1.64; 95% CI 1.01–2.66). Predictive validity was enhanced when information was available from both parents.
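Relative risks with 95% confidence intervals of this form are conventionally computed with a Wald interval on the log scale. A sketch assuming simple two-group outcome counts (hypothetical numbers, not the study's raw data, which also involved covariate adjustment):

```python
import math

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk of an outcome in an exposed vs unexposed group,
    with a Wald 95% CI computed on the log scale.

    a/n1: outcomes / total in the high-familial-risk group
    c/n2: outcomes / total in the low-familial-risk group
    """
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi
```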
Parents with poor periodontal health tend to have offspring with poor periodontal health. Family/parental history of oral health is a valid representation of the shared genetic and environmental factors that contribute to an individual’s periodontal status, and may help predict patient prognosis and preventive treatment need.
periodontal; intergenerational; risk; family history
The human condition known as premature ovarian failure (POF) is characterized by loss of ovarian function before the age of 40. The majority of POF cases are sporadic, but 10–15% are familial, suggesting a genetic origin of the disease. Although several causal mutations have been identified, the etiology of POF remains unknown for about 90% of patients.
We report a genome-wide linkage and homozygosity analysis in one large consanguineous Middle Eastern POF-affected family presenting an autosomal recessive pattern of inheritance. We identified two regions on chromosome 7 (7p21.1–15.3 and 7q21.3–22.2) with a maximum LOD score (LODmax) of 3.26, both supported as candidate regions by homozygosity mapping. Sequencing of the coding exons and known regulatory sequences of three candidate genes included within the largest region (DLX5, DLX6 and DSS1) did not reveal any causal mutations.
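As a point of reference for a LODmax of 3.26: the textbook two-point LOD score for phase-known, fully informative meioses compares the likelihood of linkage at recombination fraction θ against free recombination (θ = 0.5). A simplified sketch of that formula, not the multipoint analysis used in the study:

```python
import math

def lod(recombinants, nonrecombinants, theta):
    """Two-point LOD score for phase-known, fully informative meioses:
    LOD(theta) = log10( theta^R * (1-theta)^NR / 0.5^(R+NR) )."""
    n = recombinants + nonrecombinants
    return (recombinants * math.log10(theta)
            + nonrecombinants * math.log10(1 - theta)
            - n * math.log10(0.5))

# With no recombinants, ~11 informative meioses already exceed the
# classical LOD > 3 linkage threshold as theta approaches 0.
print(lod(0, 11, 1e-6))
```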
We detect two novel POF-associated loci on human chromosome 7, opening the way to the identification of new genes involved in the control of ovarian development and function.