Neurocognitive disorders are emerging as a possible complication in patients infected with HIV. Even in asymptomatic patients, neurocognitive abnormalities are frequently detected with a battery of tests, which supported the creation of asymptomatic neurocognitive impairment (ANI) as a new entity. In a recent article published in BMC Infectious Diseases, Magnus Gisslén and colleagues applied a statistical approach and concluded that the actual problem is overestimated: about 20% of patients are classified as neurocognitively impaired without a clear impact on their daily activities. In the present commentary, we discuss the clinical implications of their findings. Although a cautious approach would indicate a stricter follow-up of patients affected by this disorder, it is premature to consider it a proper disease. Based on a review of the data in the current literature, we conclude that more studies are urgently needed to estimate the overall risk of progression of asymptomatic neurocognitive impairment. Moreover, it is important to understand whether new biomarkers or neuroimaging tools can help to better identify the population most at risk.
Please see related article: http://www.biomedcentral.com/1471-2334/11/356
HIV; asymptomatic neurocognitive impairment; HIV dementia; HAART
The Alvarado score can be used to stratify patients with symptoms of suspected appendicitis; the validity of the score in certain patient groups and at different cut points is still unclear. The aim of this study was to assess the discrimination (diagnostic accuracy) and calibration performance of the Alvarado score.
A systematic search of validation studies in Medline, Embase, DARE and The Cochrane library was performed up to April 2011. We assessed the diagnostic accuracy of the score at the two cut-off points: score of 5 (1 to 4 vs. 5 to 10) and score of 7 (1 to 6 vs. 7 to 10). Calibration was analysed across low (1 to 4), intermediate (5 to 6) and high (7 to 10) risk strata. The analysis focused on three sub-groups: men, women and children.
Forty-two studies were included in the review. In terms of diagnostic accuracy, the cut-point of 5 was good at 'ruling out' admission for appendicitis (sensitivity 99% overall, 96% men, 99% women, 99% children). At the cut-point of 7, recommended for 'ruling in' appendicitis and progression to surgery, the score performed poorly in each subgroup (specificity overall 81%, men 57%, women 73%, children 76%). The Alvarado score is well calibrated in men across all risk strata (low RR 1.06, 95% CI 0.87 to 1.28; intermediate 1.09, 0.86 to 1.37; and high 1.02, 0.97 to 1.08). The score over-predicts the probability of appendicitis in children in the intermediate and high risk groups and in women across all risk strata.
The Alvarado score is a useful diagnostic 'rule out' score at a cut point of 5 for all patient groups. The score is well calibrated in men, inconsistent in children and over-predicts the probability of appendicitis in women across all strata of risk.
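The risk strata used in the review can be summarized in a short sketch. The banding below mirrors the cut-points described above; the function and its name are illustrative, not taken from the paper:

```python
def alvarado_risk(score):
    """Map a total Alvarado score (1-10) to the review's risk strata."""
    if not 1 <= score <= 10:
        raise ValueError("Alvarado score ranges from 1 to 10")
    if score <= 4:
        return "low"           # below the 'rule out' cut-point of 5
    if score <= 6:
        return "intermediate"  # between the two cut-points
    return "high"              # at or above the 'rule in' cut-point of 7
```

For example, a score of 4 falls in the 'low' band used to rule out admission, while a score of 7 falls in the 'high' band recommended for ruling in appendicitis.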
Currently, there is much interest in identifying clinically relevant biomarkers, as they have the potential to be high utility non-invasive tools for early diagnosis and reliable patient monitoring in numerous conditions. Since its discovery almost 15 years ago, research on the ubiquitous antioxidant enzyme peroxiredoxin 4 (Prx4) has culminated in the recognition that Prx4 levels are different in blood drawn from the healthy general population and patients with acute or chronic diseases. In this commentary, the most striking research data from different in vitro approaches, animal models and human observational studies are discussed collectively, highlighting the clinical importance of Prx4 as a multifunctional staging and prognosis biomarker. In this context, the oxidative state of patients may be reflected by intra- and extracellular Prx4 levels, redox state, oligomerization and nitro-oxidative modifications of the enzyme. A consolidated model of the potential role and origin of circulating Prx4 is presented to stimulate further investigations in light of the current biomarker situation.
Bone repair failure is a major complication of open fracture, leading to non-union of broken bone extremities and movement at the fracture site. This results in a serious disability for patients. The role played by the periosteum and bone marrow progenitors in bone repair is now well documented. In contrast, limited information is available on the role played by myogenic progenitor cells in bone repair. In a recent article published in BMC Musculoskeletal Disorders, Liu et al. compared the presence of myogenic progenitors (MyoD lineage cells) in closed and open fractures. They showed that myogenic progenitors are present in open, but not closed fractures, suggesting that muscle satellite cells may colonize the fracture site in the absence of intact periosteum. Interestingly, these progenitors sequentially expressed a chondrogenic and, thereafter, an osteoblastic phenotype, suggestive of a functional role in the repair process. This finding opens up new perspectives for research into orthopedic surgical methods that could maximize myogenic progenitor access and mobilization to augment bone repair.
Please see related article: http://www.biomedcentral.com/1471-2474/12/288
bone repair; fracture; pseudarthrosis; progenitors; muscle
Acute kidney injury (AKI) in hospitalized patients is independently associated with increased morbidity and mortality in pediatric and adult populations. Continued reliance on serum creatinine and urine output to diagnose AKI has resulted in our inability to provide successful therapeutic and supportive interventions to prevent and mitigate AKI and its effects. Research efforts over the last decade have focused on the discovery and validation of novel urinary biomarkers to detect AKI prior to a change in kidney function and to aid in the differential diagnosis of AKI. The aim of this article is to review the AKI biomarker literature, focusing on how these markers can add to the clinical context facing physicians caring for patients with, or at risk for, AKI. The optimal and appropriate utilization of AKI biomarkers will only be realized by understanding their characteristics and placing reasonable expectations on their performance in the clinical arena.
Mental disorders are associated with a considerable burden of disease as well as being risk factors for other health outcomes. The new Global Burden of Disease (GBD) Study will make estimates for both the disability and mortality directly associated with mental disorders, as well as the burden attributable to other health outcomes. Herein we discuss the process by which health outcomes for which mental disorders are risk factors were selected for inclusion in the GBD Study. We make suggestions for future research to strengthen the body of evidence for mental disorders as risk factors.
We identified a list of potential associations between mental disorders and subsequent health outcomes based on a review of the literature and consultation with mental health experts. A two-stage filter was applied to identify mental disorders and health outcomes that meet the criteria for inclusion in the GBD Study. Major limitations in the current literature are discussed and illustrated with examples identified during our review.
Results and discussion
Only two associations are included in the new GBD Study: the increased risk of ischemic heart disease with major depression, and mental disorders as a risk factor for suicide. There is evidence that mental disorders are independent risk factors for cardiovascular disease (CVD), type 2 diabetes and injuries. However, these associations could not be included because of insufficient data. The most common reasons for exclusion were inconsistent identification of 'cases', uncertain validity of health outcomes, lack of generalizability, insufficient control for confounding factors and lack of evidence for temporality.
CVD, type 2 diabetes and injury are important public health policy areas. Prospective community studies of outcomes in patients with mental disorders are required, and their design must address a range of confounding factors.
About half of Americans 50 to 75 years old do not follow recommended colorectal cancer (CRC) screening guidelines, leaving 40 million individuals unscreened. A simple blood test would increase screening compliance, promoting early detection and better patient outcomes. The objective of this study is to demonstrate the performance of an improved sensitivity blood-based Septin 9 (SEPT9) methylated DNA test for colorectal cancer. Study variables include clinical stage, tumor location and histologic grade.
Plasma samples were collected from 50 untreated CRC patients at 3 institutions; 94 control samples were collected at 4 US institutions; samples were collected from 300 colonoscopy patients at 1 US clinic prior to endoscopy. SEPT9 methylated DNA concentration was tested in analytical specimens, plasma of known CRC cases, healthy control subjects, and plasma collected from colonoscopy patients.
The improved SEPT9 methylated DNA test was more sensitive than previously described methods; the test had an overall sensitivity for CRC of 90% (95% CI, 77.4% to 96.3%) and specificity of 88% (95% CI, 79.6% to 93.7%), detecting CRC in patients of all stages. For early stage cancer (I and II) the test was 87% (95% CI, 71.1% to 95.1%) sensitive. The test identified CRC from all regions, including proximal colon (for example, the cecum) and had a 12% false-positive rate. In a small prospective study, the SEPT9 test detected 12% of adenomas with a false-positive rate of 3%.
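As a reminder of how the headline figures are derived, sensitivity and specificity come directly from the 2 x 2 counts. The counts below are back-calculated from the reported 90% and 88% and are illustrative only, not taken from the study tables:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts: 45 of 50 CRC cases test positive,
# 83 of 94 controls test negative (assumed, not from the paper).
sens, spec = sens_spec(tp=45, fn=5, tn=83, fp=11)  # 0.90 and ~0.88
```

The 12% false-positive rate mentioned above is simply the complement of specificity (1 - 0.88).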
A sensitive blood-based CRC screening test using the SEPT9 biomarker specifically detects a majority of CRCs of all stages and colorectal locations. The test could be offered to individuals of average risk for CRC who are unwilling or unable to undergo colonoscopy.
The validation of biomarkers has become a key goal of translational biomedical research. The purpose of this article is to discuss the role of biomarkers in the management of acute lung injury (ALI) and related research. Biomarkers should be sensitive and specific indicators of clinically important processes and should change in a relevant timeframe to affect recruitment to trials or clinical management. We do not believe that they necessarily need to reflect pathogenic processes. We critically examine current strategies used to identify biomarkers, which, owing to expedience, have been dominated by reanalysis of blood-derived markers from large multicenter Phase 3 studies. Combining new and existing validated biomarkers with physiological and other data may add predictive power and facilitate the development of important aids to research and therapy.
Crimean-Congo hemorrhagic fever (CCHF) virus has the widest geographic range of all tick-borne viruses and is endemic in more than 30 countries in Eurasia and Africa. Over the past decade, new foci have emerged or re-emerged in the Balkans and neighboring areas. Here we discuss the factors influencing CCHF incidence and focus on the main issue of the use of ribavirin for treating this infection. Given the dynamics of CCHF emergence in the past decade, development of new anti-viral drugs and a vaccine is urgently needed to treat and prevent this acute, life-threatening disease.
Kawasaki disease is an acute vasculitis of infants and young children that is recognized through a constellation of clinical signs that can mimic other benign conditions of childhood. The etiology remains unknown and there is no specific laboratory-based test to identify patients with Kawasaki disease. Treatment to prevent the complication of coronary artery aneurysms is most effective if administered early in the course of the illness. We sought to develop a diagnostic algorithm to help clinicians distinguish Kawasaki disease patients from febrile controls to allow timely initiation of treatment.
Urine peptidome profiling and whole blood cell type-specific gene expression analyses were integrated with clinical multivariate analysis to improve differentiation of Kawasaki disease subjects from febrile controls.
Comparative analyses of multidimensional protein identification using 23 pooled Kawasaki disease and 23 pooled febrile control urine peptide samples revealed 139 candidate markers, of which 13 were confirmed (area under the receiver operating characteristic curve (ROC AUC) 0.919) in an independent cohort of 30 Kawasaki disease and 30 febrile control urine peptidomes. Cell type-specific analysis of microarrays (csSAM) on 26 Kawasaki disease and 13 febrile control whole blood samples revealed a 32-lymphocyte-specific-gene panel (ROC AUC 0.969). The integration of the urine/blood based biomarker panels and a multivariate analysis of 7 clinical parameters (ROC AUC 0.803) effectively stratified 441 Kawasaki disease and 342 febrile control subjects to diagnose Kawasaki disease.
A hybrid approach using a multi-step diagnostic algorithm integrating both clinical and molecular findings was successful in differentiating children with acute Kawasaki disease from febrile controls.
Currently available pharmacological and non-pharmacological treatments have shown only modest effects in slowing the progression of dementia. Our objective was to assess the impact of a long-term non-pharmacological group intervention on cognitive function in dementia patients and on their ability to carry out activities of daily living compared to a control group receiving the usual care.
A randomized, controlled, single-blind longitudinal trial was conducted with 98 patients (follow-up: n = 61) with primary degenerative dementia in five nursing homes in Bavaria, Germany. The highly standardized intervention consisted of motor stimulation, practice in activities of daily living, and cognitive stimulation (acronym MAKS). It was conducted in groups of ten patients led by two therapists for 2 hours, 6 days a week for 12 months. Control patients received treatment as usual. Cognitive function was assessed using the cognitive subscale of the Alzheimer's Disease Assessment Scale (ADAS-Cog), and the ability to carry out activities of daily living using the Erlangen Test of Activities of Daily Living (E-ADL test) at baseline and after 12 months.
Of the 553 individuals screened, 119 (21.5%) were eligible and 98 (17.7%) were ultimately included in the study. At 12 months, the results of the per protocol analysis (n = 61) showed that cognitive function and the ability to carry out activities of daily living had remained stable in the intervention group but had decreased in the control patients (ADAS-Cog: adjusted mean difference: -7.7, 95% CI -14.0 to -1.4, P = 0.018, Cohen's d = 0.45; E-ADL test: adjusted mean difference: 3.6, 95% CI 0.7 to 6.4, P = 0.015, Cohen's d = 0.50). The effect sizes for the intervention were greater in the subgroup of patients (n = 50) with mild to moderate disease (ADAS-Cog: Cohen's d = 0.67; E-ADL test: Cohen's d = 0.69).
A highly standardized, non-pharmacological, multicomponent group intervention conducted in a nursing-home setting was able to postpone a decline in cognitive function in dementia patients and in their ability to carry out activities of daily living for at least 12 months.
http://www.isrctn.com Identifier: ISRCTN87391496
dementia; non-pharmacological intervention; group therapy; RCT; nursing home
Effective strategies for the primary prevention of low back pain (LBP) remain elusive with few large-scale clinical trials investigating exercise and education approaches. The purpose of this trial was to determine whether core stabilization alone or in combination with psychosocial education prevented incidence of low back pain in comparison to traditional lumbar exercise.
The Prevention of Low Back Pain in the Military study was a cluster randomized clinical study with four intervention arms and a two-year follow-up. Participants were recruited from a military training setting from 2007 to 2008. Soldiers in 20 consecutive companies were considered for eligibility (n = 7,616). Of those, 1,741 were ineligible and 1,550 were eligible but refused participation. For the 4,325 Soldiers enrolled with no previous history of LBP, average age was 22.0 years (SD = 4.2) and there were 3,082 males (71.3%). Companies were randomly assigned to receive traditional lumbar exercise, traditional lumbar exercise with psychosocial education, core stabilization exercise, or core stabilization with psychosocial education. The psychosocial education session occurred during one session and the exercise programs were done daily for 5 minutes over 12 weeks. The primary outcome for this trial was incidence of low back pain resulting in the seeking of health care.
There were no adverse events reported. Evaluable patient analysis (4,147/4,325 provided data) indicated no differences in low back pain incidence resulting in the seeking of health care between those receiving the traditional exercise and core stabilization exercise programs. However, brief psychosocial education prevented low back pain episodes regardless of the assigned exercise approach, resulting in a 3.3% (95% CI: 1.1 to 5.5%) decrease over two years (number needed to treat (NNT) = 30.3, 95% CI = 18.2 to 90.9).
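The reported number needed to treat follows directly from the absolute risk reduction (NNT = 1/ARR), and the interval bounds invert the same way. A quick check against the figures above:

```python
# NNT is the reciprocal of the absolute risk reduction (ARR).
arr = 0.033            # 3.3% absolute reduction over two years
nnt = 1 / arr          # ~30.3, matching the reported NNT
# Inverting the ARR confidence bounds gives the NNT interval:
nnt_upper = 1 / 0.011  # ~90.9 (from the 1.1% bound)
nnt_lower = 1 / 0.055  # ~18.2 (from the 5.5% bound)
```

Note that the wider NNT bound (90.9) corresponds to the smaller ARR bound (1.1%): the less effective the intervention, the more patients must be treated to prevent one episode.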
Core stabilization has been advocated as preventative, but offered no such benefit when compared to traditional lumbar exercise in this trial. Instead, a brief psychosocial education program that reduced fear and threat of low back pain decreased incidence of low back pain resulting in the seeking of health care. Since this trial was conducted in a military setting, future studies are necessary to determine if these findings can be translated into civilian populations.
NCT00373009 at ClinicalTrials.gov - http://clinicaltrials.gov/
primary prevention; core stabilization; patient education; incidence; low back pain
Tuberculosis (TB) is still a serious public health issue, even in large cities in developed countries. Control of this old disease is based on complicated programs that require completion of long treatments and contact tracing. In an accompanying research article published in BMC Public Health, Bothamley and colleagues found that areas with a ratio lower than one nurse per forty notifications had increased rates with respect to TB notifications, smear-positive cases, loss to follow-up and treatment abandonment across the UK. Furthermore, in these areas there was less opportunity for directly observed therapy, assistance with complex needs, educational outreach and new-entrant screening. In this commentary, we discuss the importance of improving organizational aspects and evaluating TB control programs. According to Bothamley and colleagues, a ratio of one nurse per forty notifications is an effective method of reducing the high TB incidences observed in London and in other cities in developed countries, or to maintain the decline in incidence in cities with lower incidences. It is crucial to evaluate TB programs every year to detect gaps early.
See related article: http://www.biomedcentral.com/1471-2458/11/896
Fortunately, radiation accidents are infrequent, but because they can escalate into large-scale events like the nuclear accidents of Chernobyl and Fukushima, preparatory planning of the medical management of radiation accident victims is very important. Radiation accidents can result in different types of radiation exposure, for which the diagnostic and therapeutic measures, as well as the outcomes, differ. The clinical course of acute radiation syndrome depends on the absorbed radiation dose and its distribution. Multi-organ involvement and multi-organ failure need to be taken into account. The organ system most vulnerable to radiation exposure is the hematopoietic system. In addition to hematopoietic syndrome, radiation-induced damage to the skin plays an important role in the diagnostics and treatment of radiation accident victims. The most important therapeutic principles, with special reference to hematopoietic syndrome and cutaneous radiation syndrome, are reviewed.
Endothelial dysfunction has been proposed as the underlying cause of diabetic angiopathy that eventually leads to cardiovascular disease, the major cause of death in diabetes. We recently demonstrated the ameliorating effect of regular vitamin D intake on the glycemic status of patients with type 2 diabetes (T2D). In this study, the effects of improvement of vitamin D status on glycemic status, lipid profile and endothelial biomarkers in T2D subjects were investigated.
Subjects with T2D were randomly allocated to one of the two groups to receive either plain yogurt drink (PYD; containing 170 mg calcium and no vitamin D/250 mL, n1 = 50) or vitamin D3-fortified yogurt drink (FYD; containing 170 mg calcium and 500 IU/250 mL, n2 = 50) twice a day for 12 weeks. Anthropometric measures, glycemic status, lipid profile, body fat mass (FM) and endothelial biomarkers including serum endothelin-1, E-selectin and matrix metalloproteinase (MMP)-9 were evaluated at the beginning and after the 12-week intervention period.
The intervention resulted in a significant improvement in fasting glucose, the Quantitative Insulin Check Index (QUICKI), glycated hemoglobin (HbA1c), triacylglycerols, high-density lipoprotein cholesterol (HDL-C), endothelin-1, E-selectin and MMP-9 in FYD compared to PYD (P < 0.05, for all). Interestingly, the differences in changes of endothelin-1, E-selectin and MMP-9 concentrations between FYD and PYD (-0.35 ± 0.63 versus -0.03 ± 0.55, P = 0.028; -3.8 ± 7.3 versus 0.95 ± 8.3, P = 0.003; and -2.3 ± 3.7 versus 0.44 ± 7.1 ng/mL, P < 0.05 for all, respectively), even after controlling for changes in QUICKI, FM and waist circumference, remained significant for endothelin-1 and MMP-9 (P = 0.009 and P = 0.005, respectively) but disappeared for E-selectin (P = 0.092). In contrast, after controlling for serum 25(OH)D, the differences disappeared for endothelin-1 (P = 0.066) and MMP-9 (P = 0.277) but remained significant for E-selectin (P = 0.011).
Ameliorated vitamin D status was accompanied by improved glycemic status, lipid profile and endothelial biomarkers in T2D subjects. Our findings suggest both direct and indirect ameliorating effects of vitamin D on the endothelial biomarkers.
Long-term care (LTC) in the form of care provided in nursing homes, homes for the aged and home care is considered an appropriate answer to the growing needs of the aging populations of the industrialized world. However, the provision of and expenditures on LTC vary considerably between these industrialized countries. Although one would expect LTC to be subject to many internationally comparative studies, including all European countries, this is not the case. A paper presented by Damiani et al. in BMC Health Services Research contains an internationally comparative model regarding the development of LTC in Europe (2003 to 2007). They achieve an intriguing compromise between depth and breadth in the sparsely populated domain of internationally comparative research on LTC by characterizing countries' LTC and interpreting the large north/south differences found. Their results also show that 'cash for care' schemes form a substantial alternative to traditional LTC provision. An additional time series analysis showed that many countries seem to be engaged in reorganizing the LTC sector. This study widens knowledge in a neglected area of health services research and should serve as a source of inspiration for further studies.
Please see related article: http://www.biomedcentral.com/1472-6963/11/316 
The major metabolic complications of obesity and type 2 diabetes may be prevented and managed with dietary modification. The use of sweeteners that provide little or no calories may help to achieve this objective.
We did a systematic review and network meta-analysis of the comparative effectiveness of sweetener additives using Bayesian techniques. MEDLINE, EMBASE, CENTRAL and CAB Global were searched to January 2011. Randomized trials comparing sweeteners in obese, diabetic, and healthy populations were selected. Outcomes of interest included weight change, energy intake, lipids, glycated hemoglobin, markers of insulin resistance and glycemic response. Evidence-based items potentially indicating risk of bias were assessed.
Of 3,666 citations, we identified 53 eligible randomized controlled trials with 1,126 participants. In diabetic participants, fructose reduced 2-hour blood glucose concentrations by 4.81 mmol/L (95% CI 3.29, 6.34) compared to glucose. Two-hour blood glucose concentration data comparing hypocaloric sweeteners to sucrose or high fructose corn syrup were inconclusive. Based on two ≤10-week trials, we found that non-caloric sweeteners reduced energy intake compared to the sucrose groups by approximately 250-500 kcal/day (95% CI 153, 806). One trial found that participants in the non-caloric sweetener group had a decrease in body mass index compared to an increase in body mass index in the sucrose group (-0.40 vs 0.50 kg/m2, and -1.00 vs 1.60 kg/m2, respectively). No randomized controlled trials showed that high fructose corn syrup or fructose increased levels of cholesterol relative to other sweeteners.
Considering the public health importance of obesity and its consequences; the clearly relevant role of diet in the pathogenesis and maintenance of obesity; and the billions of dollars spent on non-caloric sweeteners, little high-quality clinical research has been done. Studies are needed to determine the role of hypocaloric sweeteners in a wider population health strategy to prevent, reduce and manage obesity and its consequences.
Crohn's disease (CD) and ulcerative colitis (UC), the main forms of inflammatory bowel diseases (IBD) in man, are thought to be caused by an excessive and poorly controlled immune response that is directed against components of the normal microflora. The exact sequence of events by which this pathological process is triggered and maintained is not fully understood, but studies in experimental models of IBD and data emerging from recent clinical trials indicate that T cell-derived cytokines are crucial mediators of the tissue damage. Although CD and UC have been traditionally considered two typical examples of T helper (Th)1 or Th2-associated disease respectively, it is now known that CD- and UC-related inflammation is also marked by enhanced production of cytokines made by a distinct subset of Th cells, termed Th17 cells. Th17 cytokines can have both tissue-protective and inflammatory effects in the gut and there is evidence that Th17 cells can alter their cytokine program according to the stimuli received and convert into Th1-producing cells. These novel findings have contributed to advancing our understanding of mechanisms of gut tissue damage and open new avenues for development of therapeutic strategies in IBD.
In England, trauma is the leading cause of death across all age groups, with over 16,000 deaths per year. Major trauma implies the presence of multiple, serious injuries that could result in death or serious disability. Successive reports have documented the fact that the current ad hoc, unstructured management of this patient group is associated with considerable avoidable death and disability. The reform of trauma care in England, especially of the severely injured patient, has already begun. Strong clinical leadership is embraced as the way forward. The present article summarises the steps that have been made over the last decade that led to the recent decision to move towards a long anticipated restructure of the National Health Service (NHS) trauma services with the introduction of Regional Trauma Networks (RTNs). While, for the first time, a genuine political will and support exists, the changes required to maintain the momentum for the implementation of the RTNs need to be marshalled against arguments, myths and perceptions from the past. Such an approach may reverse the disinterested attitude of many, and will gradually evolve into a cultural shift among the public, clinicians and policymakers in the fullness of time.
Type 1 diabetes is one of the most common endocrine problems in childhood and adolescence, and remains a serious chronic disorder with increased morbidity and mortality, and reduced quality of life. Technological innovations positively affect the management of type 1 diabetes. Closed-loop insulin delivery (artificial pancreas) is a recent medical innovation, aiming to reduce the risk of hypoglycemia while achieving tight control of glucose. Characterized by real-time glucose-responsive insulin administration, closed-loop systems combine glucose-sensing and insulin-delivery components. In the most viable and researched configuration, a disposable sensor measures interstitial glucose levels, which are fed into a control algorithm controlling delivery of a rapid-acting insulin analog into the subcutaneous tissue by an insulin pump. Research progress builds on an increasing use of insulin pumps and availability of glucose monitors. We review the current status of insulin delivery, focusing on clinical evaluations of closed-loop systems. Future goals are outlined, and benefits and limitations of closed-loop therapy contrasted. The clinical utility of these systems is constrained by inaccuracies in glucose sensing, inter- and intra-patient variability, and delays due to absorption of insulin from the subcutaneous tissue, all of which are being gradually addressed.
Cocaine is a stimulant that leads to the rapid accumulation of catecholamines and serotonin in the brain due to prevention of their re-uptake into the neuron that released the neurotransmitter. Cocaine dependence is a public health concern and cause of significant morbidity and mortality worldwide. At present, there are no approved medications for the treatment of this devastating illness, and behavioral interventions have proven to be of limited use. However, there have been a number of recent trials testing promising agents including dopamine agonists, GABAergic medications and the cocaine vaccine. Here we discuss the most recent human clinical trials of potential medications for treatment of cocaine dependence, as well as pre-clinical studies for another promising agent, levo-tetrahydropalmatine. Examination of these recent findings shows promise for GABAergic medications and the cocaine vaccine, as well as unique medications such as disulfiram, whose mechanism remains to be determined. Future work may also confirm specific subgroups of patients for treatment response based on clinical characteristics, biomarkers and pharmacogenetics. This review highlights the need for further, larger studies in order to determine optimal clinical usage.
Health care disparity is a public health challenge. We compared the prevalence of diabetes, quality of care and outcomes between mental health clients (MHCs) and non-MHCs.
This was a population-based longitudinal study of 139,208 MHCs and 294,180 matched non-MHCs in Western Australia (WA) from 1990 to 2006, using linked data of mental health registry, electoral roll registrations, hospital admissions, emergency department attendances, deaths, and Medicare and pharmaceutical benefits claims. Diabetes was identified from hospital diagnoses, prescriptions and diabetes-specific primary care claims (17,045 MHCs, 26,626 non-MHCs). Both univariate and multivariate analyses adjusted for socio-demographic factors and case mix were performed to compare the outcome measures among MHCs, category of mental disorders and non-MHCs.
The prevalence of diabetes was significantly higher in MHCs than in non-MHCs (crude age-sex-standardised point-prevalence of diabetes on 30 June 2006 in those aged ≥20 years, 9.3% vs 6.1%, respectively, P < 0.001; adjusted odds ratio (OR) 1.40, 95% CI 1.36 to 1.43). Receipt of recommended pathology tests (HbA1c, microalbuminuria, blood lipids) was suboptimal in both groups, but was lower in MHCs (for all tests combined; adjusted OR 0.81, 95% CI 0.78 to 0.85, at one year; and adjusted rate ratio (RR) 0.86, 95% CI 0.84 to 0.88, during the study period). MHCs also had increased risks of hospitalisation for diabetes complications (adjusted RR 1.20, 95% CI 1.17 to 1.24), diabetes-related mortality (1.43, 1.35 to 1.52) and all-cause mortality (1.47, 1.42 to 1.53). The disparities were most marked for alcohol/drug disorders, schizophrenia, affective disorders, other psychoses and personality disorders.
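For orientation, the crude odds ratio implied by the two prevalence figures can be computed directly; it is larger than the adjusted OR of 1.40 because the latter controls for socio-demographic factors and case mix. The function below is a generic sketch, not the study's model:

```python
def odds_ratio(p_exposed, p_unexposed):
    """Crude odds ratio from two prevalence proportions."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed

# Point prevalences reported above: 9.3% in MHCs vs 6.1% in non-MHCs.
crude_or = odds_ratio(0.093, 0.061)  # ~1.58 before adjustment
```

The gap between the crude value (~1.58) and the adjusted 1.40 illustrates how much of the apparent excess is attributable to differences in age, sex and other measured factors.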
MHCs warrant special attention for primary and secondary prevention of diabetes, especially at the primary care level.
Atrazine (2-chloro-4-ethylamino-6-isopropylamino-1,3,5-triazine; ATR) is the most commonly applied broad-spectrum herbicide in the world. Unintentional overspray of ATR poses an immune function health hazard. The biomolecular mechanisms responsible for ATR-induced immunotoxicity, however, are little understood. This study presents our investigation into the apoptosis of splenocytes in mice exposed to ATR as we explore possible immunotoxic mechanisms.
Oral doses of ATR were administered to BALB/C mice for 21 days. The histopathology, lymphocyte apoptosis and the expression of apoptosis-related proteins from the Fas/Fas ligand (FasL) apoptotic pathway were examined from spleen samples.
Mice administered ATR exhibited a significant decrease in spleen and thymus weight. Electron microscope histology of ultrathin sections of spleen revealed degenerative micromorphology indicative of apoptosis of splenocytes. Flow cytometry revealed that the percentage of apoptotic lymphocytes increased in a dose-dependent manner after ATR treatment. Western blots identified increased expression of Fas, FasL and active caspase-3 proteins in the treatment groups.
ATR is capable of inducing splenocytic apoptosis mediated by the Fas/FasL pathway in mice, which could be the potential mechanism underlying the immunotoxicity of ATR.
Migraine is a highly prevalent neurological disorder imparting a major burden on health care around the world. The primary pathology may be a state of hyperresponsiveness of the nervous system, but the molecular mechanisms are yet to be fully elucidated. We could now be at a watershed moment in this respect, as the genetic loci associated with typical forms of migraine are being revealed. The genetic discoveries are the latest step in the evolution of our understanding of migraine, which was initially considered a cerebrovascular condition, then a neuroinflammatory process and now primarily a neurogenic disorder. Indeed, the genetic findings, which have revealed ion channels and transporter mutations as causative of migraine, are a powerful argument for the neurogenic basis of migraine. Modulations of ion channels leading to amelioration of the migraine 'hyperresponsive' brain represent attractive targets for drug discovery. There lies ahead an exciting and rapidly progressing phase of migraine translational research, and in this review we highlight recent genetic findings and consider how these may affect the future of migraine neurobiology and therapy.