Pharmacy-based minor ailment schemes (PMASs) have been introduced throughout the UK to reduce the burden of minor ailments on high-cost settings, including general practice and emergency departments.
This study aimed to explore the effect of PMASs on patient health- and cost-related outcomes, and their impact on general practices.
Design and setting
Systematic review of UK community pharmacy-based schemes.
Standard systematic review methods were used, including searches of electronic databases and grey literature from 2001 to 2011, with no restrictions on language or study design. Reporting followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement and checklist.
Thirty-one evaluations were included from 3308 titles identified. Reconsultation rates in general practice, following an index consultation with a PMAS, ranged from 2.4% to 23.4%. The proportion of patients reporting complete resolution of symptoms after an index PMAS consultation ranged from 68% to 94%. No study included a full economic evaluation. The mean cost per PMAS consultation ranged from £1.44 to £15.90. The total number of consultations and prescribing for minor ailments at general practices often declined following the introduction of PMAS.
Low reconsultation and high symptom-resolution rates suggest that minor ailments are being dealt with appropriately by PMASs. PMAS consultations are less expensive than consultations with GPs. The extent to which these schemes shift demand for management of minor ailments away from high-cost settings has not been fully determined. This evidence suggests that PMASs provide a suitable alternative to general practice consultations. Evidence from economic evaluations is needed to inform the future delivery of PMASs.
community pharmacy services; general practice; pharmacy; primary health care; self care
During the first phase of the FOTO-ED Study, 13% (44/350; 95% CI 9–17%) of patients had an ocular fundus finding relevant to their emergency department (ED) management, such as papilledema, detected by non-mydriatic ocular fundus photography reviewed by neuro-ophthalmologists. All of these findings were missed by ED physicians (EPs), who examined only 14% of enrolled patients by direct ophthalmoscopy. In the present study, we evaluated the sensitivity of non-mydriatic ocular fundus photography, an alternative to direct ophthalmoscopy, for relevant findings when photographs were made available to EPs during routine clinical care.
354 patients presenting to our ED with headache, focal neurologic deficit, visual change, or diastolic blood pressure ≥120 mmHg underwent non-mydriatic fundus photography (Kowa nonmyd-alpha-D). Photographs were placed in the electronic medical record for EP review. Identification of relevant findings on photographs by EPs was compared with a reference standard of neuro-ophthalmologist review.
EPs reviewed photographs of 239 patients (68%). Thirty-five patients (10%; 95% CI 7–13%) had relevant findings identified by neuro-ophthalmologist review (6 disc edema, 6 grade III/IV hypertensive retinopathy, 7 isolated hemorrhages, 15 optic disc pallor, and 1 retinal vascular occlusion). EPs identified 16/35 relevant findings (sensitivity 46%; 95% CI 29–63%) and correctly classified 289/319 normal findings (specificity 91%; 95% CI 87–94%). EPs reported that photographs were helpful for 125 patients (35%).
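The sensitivity and specificity above are simple binomial proportions. A minimal sketch of how such estimates and their intervals can be reproduced, using a normal-approximation confidence interval (studies often report exact or Wilson intervals instead, so the bounds may differ slightly in the last digit):

```python
import math

def prop_ci(k, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion k/n."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# EP photograph review vs. the neuro-ophthalmologist reference standard
sens, s_lo, s_hi = prop_ci(16, 35)    # relevant findings detected by EPs
spec, p_lo, p_hi = prop_ci(289, 319)  # normal findings correctly classified

print(f"sensitivity {sens:.0%} (95% CI {s_lo:.0%}-{s_hi:.0%})")
print(f"specificity {spec:.0%} (95% CI {p_lo:.0%}-{p_hi:.0%})")
```

Note that 289/319 evaluates to roughly 91%, consistent with the reported 87–94% interval.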
EPs used non-mydriatic fundus photographs more frequently than they performed direct ophthalmoscopy, and their detection of relevant abnormalities improved. Ocular fundus photography often assisted ED care even when findings were normal. Non-mydriatic ocular fundus photography offers a promising alternative to direct ophthalmoscopy.
Post-traumatic epilepsy (PTE) occurs in a proportion of traumatic brain injury (TBI) cases, significantly compounding the disability, risk of injury, and death for sufferers. To date, predictive biomarkers for PTE have not been identified. This study used the lateral fluid percussion injury (LFPI) rat model of TBI to investigate whether structural, functional, and behavioral changes post-TBI relate to the later development of PTE.
Adult male Wistar rats underwent LFPI or sham-injury. Serial MR and PET imaging, and behavioral analyses were performed over six months post-injury. Rats were then implanted with recording electrodes and monitored for two consecutive weeks using video-EEG to assess for PTE. Of the LFPI rats, 52% (n=12) displayed spontaneous recurring seizures and/or epileptic discharges on the video-EEG recordings.
MRI volumetric and signal analysis of changes in cortex, hippocampus, thalamus, and amygdala; 18F-FDG PET analysis of metabolic function; and behavioral analysis of cognitive and emotional changes at one week, one month, three months, and six months post-LFPI all failed to identify significant differences on univariate analysis between the epileptic and non-epileptic groups. However, hippocampal surface shape analysis using large-deformation high-dimensional mapping identified significant changes in the ipsilateral hippocampus at one week post-injury, relative to baseline, that differed between rats that would go on to become epileptic and those that would not. Furthermore, a multivariate logistic regression model incorporating the one-week, one-month, and three-month 18F-FDG PET parameters from the ipsilateral hippocampus correctly predicted the epileptic outcome in all of the LFPI cases. As such, these subtle changes in the ipsilateral hippocampus at acute phases after LFPI may be related to PTE and warrant further examination.
These findings suggest that PTE may be independent of the major structural, functional, and behavioral changes induced by TBI, and that more subtle abnormalities are likely involved. However, there are limitations associated with studying acquired epilepsies in animal models that must be considered when interpreting these results; in particular, the failure to detect differences between the groups may reflect the difficulty of correctly identifying and separating the epileptic and non-epileptic animals.
Post-traumatic epilepsy; Lateral fluid percussion injury; MRI; PET; Epileptogenesis
Numerous studies have investigated the effects of isolated CLA supplementation on glucose homeostasis in humans and rodents. However, both the amount and the relative abundance of CLA isomers in supplemental form are not representative of what is consumed from natural sources. No study to date has examined the effects of altered CLA isomer content within a natural food source. Our goal was to increase the content of the insulin-desensitizing CLA t10,c12 isomer relative to the CLA c9,t11 isomer in cow's milk by inducing subacute ruminal acidosis (SARA), and subsequently to investigate the effects of this milk fat on parameters related to glucose and insulin tolerance in rats.
We fed female rats (~2.5 to 3 months of age) CLA t10,c12-enriched (SARA) butter or non-SARA butter based diets for 4 weeks at either low (10% of kcal from fat; 0.18% total CLA by weight) or high (60% of kcal from fat; 0.55% total CLA by weight) amounts. To extend these findings, we then fed rats high (60% kcal) amounts of SARA or non-SARA butter for a longer duration (8 weeks) and assessed changes in whole-body glucose, insulin and pyruvate tolerance in comparison with low-fat and 60% lard conditions.
There was a main effect for increased fasting blood glucose and insulin in SARA vs. non-SARA butter groups after 4 weeks of feeding (p < 0.05). However, blood glucose and insulin concentration, and maximal insulin-stimulated glucose uptake in skeletal muscle were similar in all groups. Following 8 weeks of feeding, insulin tolerance was impaired by the SARA butter, but not glucose or pyruvate tolerance. The non-SARA butter did not impair tolerance to glucose, insulin or pyruvate.
This study suggests that increasing the consumption of a naturally enriched CLAt10,c12 source, at least in rats, has minimal impact on whole body glucose tolerance or muscle specific insulin response.
Conjugated linoleic acid; Butter; Rats; Glucose tolerance; Insulin tolerance; Insulin-stimulated glucose uptake
Studies analyzing the factors that motivate blood donation have found altruism to be the primary motivator; however, social capital has not been analyzed in this context. Our study examines the association between motivation factors (altruism, self-interest and response to direct appeal) and social capital (cognitive and structural) across three large blood centers in Brazil.
Study Design and Methods
We conducted a cross-sectional survey of 7,635 donor candidates from October 15 through November 20, 2009. Participants completed self-administered questionnaires on demographics, previous blood donation, HIV testing and knowledge, social capital and donor motivations. Enrollment was determined prior to the donor screening process.
Among participants, 43.5% and 41.7% expressed high levels of altruism and response to direct appeal, respectively, while only 26.9% expressed high levels of self-interest. High self-interest was most common at Hemope-Recife (41.7%). Of participants, 37.4% expressed high levels of cognitive social capital while 19.2% expressed high levels of structural social capital. High cognitive and structural social capital were also most common at Hemope-Recife (47.3% and 21.3%, respectively). High cognitive social capital was associated with high levels of altruism, self-interest and response to direct appeal. Philanthropic and high social altruism were associated with high levels of altruism and response to direct appeal.
Cognitive and structural social capital and social altruism are associated with altruism and response to direct appeal, while only cognitive social capital is associated with self-interest. Designing marketing campaigns with these aspects in mind may help blood banks attract potential blood donors more efficiently.
blood donation; Brazil; social capital; motivation
There are few data on HIV prevalence, incidence or the residual risk of transfusion-transmitted HIV infection among Chinese blood donors.
Donations from five Chinese blood centers in 2008–2010 were screened using two rounds of ELISA testing for anti-HIV-1/2. A reactive result in either or both rounds led to Western Blot confirmatory testing. HIV prevalence and demographic correlates among first time donors, incidence rate and demographic correlates among repeat donors were examined. Weighted multivariable logistic regression analysis examined correlates of HIV confirmatory status among first time donors. Residual risks for transfusion transmitted HIV infection were evaluated based on incidence among repeat donors.
Among 821,320 donations, 40% came from repeat donors. 1,837 (0.34%) first-time and 577 (0.17%) repeat donations screened reactive for anti-HIV-1/2, among which 1,310 and 419, respectively, were tested by Western Blot. 233 (17.7%) first-time and 44 (10.5%) repeat donations were confirmed positive. Estimated prevalence was 66 infections per 100,000 (95% CI: 59–74) first-time donors. Estimated incidence was 9 per 100,000 (95% CI: 7–12) person-years among repeat donors. Weighted multivariable logistic regression indicated that first-time donors aged 26–45 years were 1.6–1.8 times as likely to be HIV positive as those aged 25 years and younger. Donors with some college or higher education were less likely to be HIV positive than those with middle-school education, with ORs ranging from 0.35 to 0.60. Minority donors were 1.6 times as likely to be HIV positive as Han majority donors (OR: 1.6; CI: 1.2–2.1). No difference in prevalence was found between genders. The current residual risk of transfusion-transmitted HIV infection was 5.4 (95% CI: 1.2–12.5) infections per million whole blood donations.
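Residual-risk figures of this kind typically come from the incidence/window-period model: the repeat-donor incidence rate multiplied by the length of the serological window during which an infected donation escapes detection. A minimal sketch, assuming a ~22-day anti-HIV window (an illustrative value; the study's actual window-period assumption may differ):

```python
# Incidence/window-period model for residual transfusion-transmission risk:
# risk ≈ incidence among repeat donors × infectious window period.
incidence_per_100k_py = 9.0   # repeat-donor HIV incidence (per 100,000 person-years)
window_days = 22.0            # assumed serological window period (illustrative)

# Convert: (per 100,000 py) × (years of window) → per 100,000, then ×10 → per million
risk_per_million = incidence_per_100k_py * (window_days / 365.0) * 10
print(f"residual risk ≈ {risk_per_million:.1f} per million donations")
```

With these inputs the model reproduces a figure close to the reported 5.4 per million.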
Despite the declining HIV epidemic in China, the estimated residual risk of transfusion-transmitted HIV infection is still high, highlighting the potential blood safety yield of implementing NAT in donation screening.
HIV infection; blood donors; China; Prevalence; Incidence; Residual Risks
The increasing prevalence of bovine tuberculosis (bTB) in the UK and the limitations of the currently available diagnostic and control methods require the development of complementary approaches to assist in the sustainable control of the disease. One potential approach is the identification of animals that are genetically more resistant to bTB, to enable breeding of animals with enhanced resistance. This paper focuses on prediction of resistance to bTB. We explore estimation of direct genomic estimated breeding values (DGVs) for bTB resistance in UK dairy cattle, using dense SNP chip data, and test these genomic predictions for situations when disease phenotypes are not available on selection candidates.
We estimated DGVs using genomic best linear unbiased prediction methodology, and assessed their predictive accuracies with a cross validation procedure and receiver operating characteristic (ROC) curves. Furthermore, these results were compared with theoretical expectations for prediction accuracy and area-under-the-ROC-curve (AUC). The dataset comprised 1151 Holstein-Friesian cows (bTB cases or controls). All individuals (592 cases and 559 controls) were genotyped for 727,252 loci (Illumina Bead Chip). The estimated observed heritability of bTB resistance was 0.23±0.06 (0.34 on the liability scale) and five-fold cross validation, replicated six times, provided a prediction accuracy of 0.33 (95% C.I.: 0.26, 0.40). ROC curves, and the resulting AUC, gave a probability of 0.58, averaged across six replicates, of correctly classifying cows as diseased or healthy based on SNP chip genotype alone using these data.
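The reported AUC of 0.58 has a direct probabilistic reading: it is the chance that a randomly chosen case receives a higher genomic prediction than a randomly chosen control. A minimal rank-based (Mann-Whitney) sketch, using hypothetical toy scores (not the study's DGVs) chosen so the weak case/control separation yields an AUC of 0.58:

```python
def auc(case_scores, control_scores):
    """Mann-Whitney estimate of AUC: the probability that a randomly chosen
    case outranks a randomly chosen control (ties count one half)."""
    wins = 0.0
    for x in case_scores:
        for y in control_scores:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical toy predictions with modest case/control separation
cases    = [0.9, 0.4, 0.6, 0.2, 0.3]
controls = [0.5, 0.3, 0.1, 0.6, 0.4]
print(f"AUC = {auc(cases, controls):.2f}")
```

An AUC near 0.5 corresponds to chance-level discrimination, which is why the authors conclude a larger training population is needed.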
These results provide a first step in the investigation of the potential feasibility of genomic selection for bTB resistance using SNP data. Specifically, they demonstrate that genomic selection is possible, even in populations with no pedigree data and on animals lacking bTB phenotypes. However, a larger training population will be required to improve prediction accuracies.
We evaluated the current prevalence of serological markers for HBV and HCV among blood donors and estimated HCV incidence and the residual transfusion-transmitted risk at three large Brazilian blood centers.
Material and Methods
Data on whole blood and platelet donations were collected from January through December 2007 and analyzed by center, donor type (replacement vs. community), age, sex, donation status (first-time vs. repeat), and serological results for HBsAg, anti-HBc and anti-HCV. HBV (HBsAg+/anti-HBc+) and HCV (anti-HCV) prevalence rates were calculated for all first time donations. HCV incidence was derived including inter-donation intervals that preceded first repeat donations given during the study and HCV residual risk was estimated for transfusions derived from repeat donors.
There were 307,354 donations from January through December 2007. Overall prevalence of concordant HBsAg and anti-HBc reactivity was 289 per 100,000 donations and of anti-HCV confirmed reactivity 191 per 100,000 donations. There were significant associations between older age and hepatitis markers, especially for HCV. HCV incidence was 3.11 (95% CI 0.77-7.03) per 100,000 person-years, and residual risk of HCV window-phase infections was estimated at 5.0 per million units transfused.
Improvement in blood donor selection, socioeconomic conditions and preventive measures, implemented over time, may have helped to decrease prevalence of hepatitis B and C viruses, relative to previous reports. Incidence and residual risk of HCV are also diminishing. Ongoing monitoring of hepatitis B and C viral markers among Brazilian blood donors should help guide improved recruitment procedures, donor selection, laboratory screening methods and counseling strategies.
Blood donors; Brazil; Residual Risk; Hepatitis B; Hepatitis C; Prevalence; Incidence
The use of molecular simulation to estimate the strength of macromolecular binding free energies is becoming increasingly widespread, with goals ranging from lead optimization and enrichment in drug discovery to personalizing or stratifying treatment regimes. In order to realize the potential of such approaches to predict new results, not merely to explain previous experimental findings, it is necessary that the methods used are reliable and accurate, and that their limitations are thoroughly understood. However, the computational cost of atomistic simulation techniques such as molecular dynamics (MD) has meant that until recently little work has focused on validating and verifying the available free energy methodologies, with the consequence that many of the results published in the literature are not reproducible. Here, we present a detailed analysis of two of the most popular approximate methods for calculating binding free energies from molecular simulations, molecular mechanics Poisson–Boltzmann surface area (MMPBSA) and molecular mechanics generalized Born surface area (MMGBSA), applied to the nine FDA-approved HIV-1 protease inhibitors. Our results show that the values obtained from replica simulations of the same protease–drug complex, differing only in initially assigned atom velocities, can vary by as much as 10 kcal mol–1, which is greater than the difference between the best and worst binding inhibitors under investigation. Despite this, analysis of ensembles of simulations producing 50 trajectories of 4 ns duration leads to well converged free energy estimates. 
For seven inhibitors, we find that with correctly converged normal mode estimates of the configurational entropy, we can correctly distinguish inhibitors in agreement with experimental data for both the MMPBSA and MMGBSA methods and thus have the ability to rank the efficacy of binding of this selection of drugs to the protease (no account is taken of free energy penalties associated with protein distortion, leading to overestimation of the binding strength of the two largest inhibitors, ritonavir and atazanavir). We obtain improved rankings and estimates of the relative binding strengths of the drugs by using a novel combination of MMPBSA/MMGBSA with normal mode entropy estimates and the free energy of association calculated directly from simulation trajectories. Our work provides a thorough assessment of what is required to produce converged and hence reliable free energies for protein–ligand binding.
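The ensemble approach described above replaces a single trajectory's free energy with an average over many replicas. A minimal sketch of that aggregation step, using hypothetical per-replica values (not the study's data) to show how the replica-to-replica spread and the standard error of the ensemble mean are computed:

```python
import statistics

# Hypothetical per-replica MMPBSA binding free energies (kcal/mol) for one
# protease-inhibitor complex; replicas differ only in initial velocities.
replica_dG = [-12.1, -14.8, -9.7, -13.5, -11.2, -15.0, -10.4, -12.9]

mean_dG = statistics.mean(replica_dG)
sem = statistics.stdev(replica_dG) / len(replica_dG) ** 0.5
spread = max(replica_dG) - min(replica_dG)  # single replicas can differ widely

print(f"ensemble mean = {mean_dG:.2f} kcal/mol, SEM = {sem:.2f}")
print(f"replica spread = {spread:.1f} kcal/mol")
```

The spread across replicas illustrates why a one-trajectory estimate can be misleading, while the ensemble mean converges as replicas are added.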
Very few studies have measured disease penetrance and prognostic factors of Chagas cardiomyopathy among asymptomatic Trypanosoma cruzi–infected persons.
Methods and Results
We performed a retrospective cohort study among initially healthy blood donors with an index T cruzi–seropositive donation and age-, sex-, and period-matched seronegatives in 1996 to 2002 in the Brazilian cities of São Paulo and Montes Claros. In 2008 to 2010, all subjects underwent medical history, physical examination, ECGs, and echocardiograms. ECG and echocardiogram results were classified by blinded core laboratories, and records with abnormal results were reviewed by a blinded panel of 3 cardiologists who adjudicated the outcome of Chagas cardiomyopathy. Associations with Chagas cardiomyopathy were tested with multivariate logistic regression. Mean follow-up time between index donation and outcome assessment was 10.5 years for the seropositives and 11.1 years for the seronegatives. Among 499 T cruzi seropositives, 120 (24%) had definite Chagas cardiomyopathy, and among 488 T cruzi seronegatives, 24 (5%) had cardiomyopathy, for an incidence difference of 1.85 per 100 person-years attributable to T cruzi infection. Of the 120 seropositives classified as having Chagas cardiomyopathy, only 31 (26%) presented with ejection fraction <50%, and only 11 (9%) were classified as New York Heart Association class II or higher. Chagas cardiomyopathy was associated (P<0.01) with male sex, a history of abnormal ECG, and the presence of an S3 heart sound.
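The reported incidence difference can be reconstructed from the case counts and mean follow-up times given above. A minimal sketch, assuming total person-time is approximated by group size times mean follow-up (a simplification of the study's person-time accounting):

```python
# Incidence difference attributable to T cruzi infection, from cohort counts.
def incidence_per_100py(events, n, mean_followup_years):
    """Crude incidence rate per 100 person-years, approximating person-time
    as n × mean follow-up."""
    return 100.0 * events / (n * mean_followup_years)

seropos = incidence_per_100py(120, 499, 10.5)  # cardiomyopathy in seropositives
seroneg = incidence_per_100py(24, 488, 11.1)   # cardiomyopathy in seronegatives

print(f"incidence difference = {seropos - seroneg:.2f} per 100 person-years")
```

This reproduces the stated difference of about 1.85 per 100 person-years.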
There is a substantial annual incidence of Chagas cardiomyopathy among initially asymptomatic T cruzi–seropositive blood donors, although disease was mild at diagnosis.
blood donors; Brazil; Chagas cardiomyopathy; Chagas disease; incidence
The efficacy of pegloticase, a polyethylene glycol (PEG)-conjugated mammalian recombinant uricase, approved for chronic refractory gout, can be limited by the development of antibodies (Ab). Analyses from 2 replicate, 6-month, randomized controlled trials were performed to characterize Ab responses to pegloticase.
Anti-pegloticase, anti-PEG, and anti-uricase Ab were determined by validated enzyme-linked immunosorbent assays. Ab titers were analyzed for possible relationships with serum pegloticase concentrations, serum uric acid (sUA) lowering, and risk of infusion reactions (IRs).
Sixty-nine (41%) of 169 patients receiving pegloticase developed high titer anti-pegloticase Ab (> 1:2430) and 40% (67/169) developed anti-PEG Ab; 1 patient receiving placebo developed high titer anti-pegloticase Ab. Only 14% (24/169) of patients developed anti-uricase Ab, usually at low titer. In responders (patients showing sustained sUA lowering), mean anti-pegloticase titers at week 25 (1:837 ± 1,687 with biweekly and 1:2,025 ± 4,506 with monthly dosing) were markedly lower than in nonresponders (1:34,528 ± 42,228 and 1:89,658 ± 297,797, respectively). Nonresponder status was associated with reduced serum pegloticase concentrations. Baseline anti-pegloticase Ab, evident in 15% (31/212) of patients, did not predict subsequent loss of urate-lowering response. Loss of sUA response preceded IRs in 44 of 56 (79%) pegloticase-treated patients.
Loss of responsiveness to pegloticase is associated with the development of high titer anti-pegloticase Ab that increase clearance of pegloticase and are associated with a loss of the sUA lowering effect and increased IR risk. Pre-infusion sUA can be used as a surrogate for the presence of deleterious anti-pegloticase Ab.
NCT00325195, registered 10 May 2006; NCT01356498, registered 27 October 2008.
Lysophosphatidic acid (LPA) is a bioactive phospholipid with a potentially causative role in neurotrauma. Blocking LPA signaling with the LPA-directed monoclonal antibody B3/Lpathomab is neuroprotective in the mouse spinal cord following injury.
Here we investigated the use of this agent in treatment of secondary brain damage consequent to traumatic brain injury (TBI). LPA was elevated in cerebrospinal fluid (CSF) of patients with TBI compared to controls. LPA levels were also elevated in a mouse controlled cortical impact (CCI) model of TBI and B3 significantly reduced lesion volume by both histological and MRI assessments. Diminished tissue damage coincided with lower brain IL-6 levels and improvement in functional outcomes.
This study presents a novel therapeutic approach for the treatment of TBI by blocking extracellular LPA signaling to minimize secondary brain damage and neurological dysfunction.
Lysophosphatidic acid; Traumatic brain injury; Human cerebrospinal fluid; Controlled cortical impact; Magnetic resonance imaging; Anti-LPA antibody; IL-6
Hemozoin (Hz) is a heme crystal produced by some blood-feeding organisms as an efficient way to detoxify heme derived from hemoglobin digestion. In the triatomine insect Rhodnius prolixus, Hz is essentially produced by midgut extracellular phospholipid membranes known as perimicrovillar membranes (PMVM). Here, we investigated the role of commercial glycerophospholipids containing serine, choline and ethanolamine as headgroups and of R. prolixus midgut lipids (RML) in heme crystallization. All commercial unsaturated forms of phospholipids, as well as RML, mediated fast and efficient β-hematin formation by means of two kinetically distinct mechanisms: an early and fast component, followed by a late and slow one. The fastest reactions observed were induced by unsaturated forms of phosphatidylethanolamine (uPE) and phosphatidylcholine (uPC), with half-lives of 0.04 and 0.7 minutes, respectively. β-hematin crystal morphologies were strikingly distinct among groups, with uPE producing homogeneous regular brick-shaped crystals. Interestingly, uPC-mediated reactions resulted in two morphologically distinct crystal populations: one less representative group of regular crystals, resembling those induced by uPE, and the other largely represented by crystals with numerous sharp edges and tapered ends. Heme crystallization reactions induced by RML were efficient, with a heme to β-hematin conversion rate higher than 70%, but clearly slower (t1/2 of 9.9–17.7 minutes) than those induced by uPC and uPE. Interestingly, crystals produced by RML were homogeneous in shape and quite similar to those mediated by uPE. Thus, β-hematin formation can be rapidly and efficiently induced by unsaturated glycerophospholipids, particularly uPE and uPC, and may play a role in biological heme crystallization in the R. prolixus midgut.
Somites are formed progressively from the presomitic mesoderm (PSM) in a highly regulated process according to a strict periodicity driven by an oscillatory mechanism. The Notch and Wnt pathways are key components in the regulation of this somitic oscillator and data from Xenopus and zebrafish embryos indicate that the Notch-downstream target Nrarp participates in the regulation of both activities. We have analyzed Nrarp/nrarp-a expression in the PSM of chick, mouse and zebrafish embryos, and we show that it cycles in synchrony with other Notch regulated cyclic genes. In the mouse its transcription is both Wnt- and Notch-dependent, whereas in the chick and fish embryo it is simply Notch-dependent. Despite oscillating mRNA levels, Nrarp protein does not oscillate in the PSM. Finally, neither gain nor loss of Nrarp function interferes with the normal expression of Notch-related cyclic genes.
Nrarp; Notch pathway; embryo; somitic oscillator; PSM; cyclic gene
Using surrogate biomarkers for disease progression as endpoints in neuroprotective clinical trials may help differentiate symptomatic effects of potential neuroprotective agents from true slowing of the neurodegenerative process. A systematic review was undertaken to determine what biomarkers for disease progression in Alzheimer's disease exist and how well they perform.
MEDLINE and Embase (1950–2011) were searched using five search strategies. Abstracts were assessed to identify papers meriting review in full. Studies of participants with probable Alzheimer's disease diagnosed by formal criteria were included. We made no restriction on age, disease duration, or drug treatment. We only included studies with a longitudinal design, in which the putative biomarker and clinical measure were both measured at least twice, as this is the only appropriate study design to use when developing a disease progression biomarker. We included studies which attempted to draw associations between the changes over time in the biomarker used to investigate disease progression and a clinical measure of disease progression.
Fifty-nine studies were finally included. The commonest biomarker modality examined was brain MRI (17/59, 29% of included studies). Median follow-up in included studies was only 1.0 (IQR 0.8–1.7) year and most studies only measured the putative biomarker and clinical measure twice. Included studies were generally of poor quality, with small numbers of participants (median 31, IQR 17 to 64); they applied excessively restrictive study entry criteria, had flawed methodologies, and conducted overly simplistic statistical analyses without adjusting for confounding factors.
We found insufficient evidence to recommend the use of any biomarker as an outcome measure for disease progression in Alzheimer's disease trials. However, further investigation into the efficacy of using MRI measurements of ventricular volume and whole brain volume appeared to be merited. A provisional ‘roadmap’ to improve the quality of future disease progression biomarker studies is presented.
The human histone deacetylase 8 (HDAC8) is a key hydrolase in gene regulation and has been identified as a drug target for the treatment of several cancers. Previously, the HDAC8 enzyme has been extensively studied using biochemical techniques, X-ray crystallography, and computational methods. Those investigations have yielded detailed information about the active site and have demonstrated that the substrate entrance surface is highly dynamic. Yet it has remained unclear how the dynamics of the entrance surface tune and influence the catalytic activity of HDAC8. Using long time scale all-atom molecular dynamics simulations, we have found a mechanism whereby the interactions and dynamics of two loops tune the configuration of functionally important residues of HDAC8 and could therefore influence the activity of the enzyme. We subsequently investigated this hypothesis using a well-established fluorescence activity assay and a noninvasive real-time progression assay, in which deacetylation of a p53-based peptide was observed by nuclear magnetic resonance spectroscopy. Our work delivers detailed insight into the dynamic loop network of HDAC8 and provides an explanation for a number of experimental observations.
Background and Objectives
Although the incidence of TRALI is unknown in Brazil, some blood centers have adopted strategies to prevent TRALI. We evaluated the impact on the supply of blood products of three policies to mitigate TRALI: diverting the production of whole blood-derived plasma from female donors; deferring all female donors from apheresis platelet collections; and deferring only multiparous female donors from apheresis platelet collections.
Materials and Methods
Data from allogeneic whole blood and apheresis platelet donations from April 2008 to December 2009 were collected in three Brazilian blood centers and the impact of the aforementioned strategies was evaluated.
Of 544,814 allogeneic blood donations, whole blood-derived plasma production would fall by 30.8% if only male donor plasma were issued for transfusion, and apheresis platelet donations would fall by 24.1% if all female donors were deferred from apheresis donation. If only multiparous donors were deferred from apheresis donation, apheresis platelet collections would decrease by 5%.
Restricting the use of whole blood derived plasma to male-only donors and deferring all female apheresis platelet donors would impact two out of three Brazilian blood centers. A deferral policy on multiparous apheresis platelet donors may be acceptable as a temporary measure, but may cause more stress on a system that is already working at its limit.
TRALI; multiparous donors; apheresis platelets; leukocyte antibodies; Brazil; transfusion reactions
Background and Objectives
Higher risk of HIV infection could be associated with test seeking, which is one motivation for donating blood. Cognitive social capital is defined as the social support, trust, and cooperation that guide community behaviour. Structural social capital refers to an individual’s participation in institutions and organizations. The association between social capital and test seeking was assessed.
Materials and Methods
A survey of over 7500 donors in 3 Brazilian blood centres was conducted. Test seeking was classified into 4 non-overlapping categories (non-test seeker, possible, presumed, and self-disclosed test seekers) using 1 direct and 2 indirect questions. Social capital was summarized into cognitive and structural categorizations. Multivariable logistic regression analysis was performed.
Compared with non-test seekers (62% of survey respondents), cognitive social capital was higher for each category of test seeking (OR = 1.1, 7.4 and 7.1, respectively; p<0.05). Male gender, lower education, and lower income were also significantly associated with test seeking.
Conclusion
As test seekers appear to have strong social networks, blood banks may leverage these networks to encourage test seekers to be tested at other locations.
social capital; motivation; blood donors; logistic regression
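Odds ratios like those reported above come from logistic regression, but the underlying arithmetic can be illustrated with a simple 2×2 table. The counts below are hypothetical, chosen only to show how an OR of roughly 7 arises; they are not data from the study.

```python
# Hypothetical 2x2 table (illustrative counts only, not study data):
# rows = high/low cognitive social capital, columns = test seeking status.
a, b = 140, 60   # high social capital: test seekers, non-test seekers
c, d = 100, 300  # low social capital:  test seekers, non-test seekers

# Unadjusted odds ratio: odds of test seeking given high social capital
# divided by the odds given low social capital = (a/b) / (c/d) = a*d / (b*c)
odds_ratio = (a * d) / (b * c)
print(round(odds_ratio, 1))  # → 7.0
```

A multivariable model, as used in the study, adjusts such crude odds ratios for covariates such as gender, education, and income.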
To determine the frequency of, and predictive factors for, abnormal ocular fundus findings among emergency department (ED) patients presenting with headache.
Cross-sectional study of prospectively enrolled adult patients presenting to our ED with a chief complaint of headache. Ocular fundus photographs were obtained using a nonmydriatic fundus camera that does not require pupillary dilation. Demographic and neuroimaging information was collected. Photographs were reviewed independently by 2 neuro-ophthalmologists for findings relevant to acute care. The results were analyzed using univariate statistics and logistic regression modeling.
We included 497 patients (median age: 40 years, 73% women), among whom 42 (8.5%, 95% confidence interval: 6%–11%) had ocular fundus abnormalities. Of these 42 patients, 12 had disc edema, 9 had optic nerve pallor, 6 had grade III/IV hypertensive retinopathy, and 15 had isolated retinal hemorrhages. Body mass index ≥35 kg/m² (odds ratio [OR]: 2.3, p = 0.02), younger age (OR: 0.7 per 10-year increase, p = 0.02), and higher mean arterial blood pressure (OR: 1.3 per 10-mm Hg increase, p = 0.003) were predictive of an abnormal fundus photograph. Patients with an abnormal fundus had a higher rate of hospital admission (21% vs 10%, p = 0.04). Among the 34 patients with abnormal ocular fundi who had brain imaging, 14 (41%) had normal imaging.
Ocular fundus abnormalities were found in 8.5% of patients with headache presenting to our ED. Predictors of abnormal funduscopic findings included higher body mass index, younger age, and higher blood pressure. Our study confirms the importance of funduscopic examination in patients with headache, particularly in the ED, and reaffirms the utility of nonmydriatic fundus photography in this setting.
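The per-unit odds ratios reported above (e.g. OR 1.3 per 10-mm Hg increase in mean arterial pressure) multiply across increments, because logistic regression is linear on the log-odds scale. A short sketch of that arithmetic, using the reported OR; the 30-mm Hg increment is illustrative:

```python
import math

# Reported OR: 1.3 per 10-mm Hg increase in mean arterial pressure.
or_per_10mmHg = 1.3

# Logistic coefficients add on the log-odds scale, so ORs multiply.
beta_per_mmHg = math.log(or_per_10mmHg) / 10

# Predicted OR for a hypothetical 30-mm Hg higher mean arterial pressure,
# equivalent to 1.3 ** 3:
or_30mmHg = math.exp(beta_per_mmHg * 30)
print(round(or_30mmHg, 2))  # → 2.2
```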
Background. The pilot phase IIb VIKING study suggested that dolutegravir (DTG), a human immunodeficiency virus (HIV) integrase inhibitor (INI), would be efficacious in INI-resistant patients at the 50 mg twice daily (BID) dose.
Methods. VIKING-3 is a single-arm, open-label phase III study in which therapy-experienced adults with INI-resistant virus received DTG 50 mg BID while continuing their failing regimen (without raltegravir or elvitegravir) through day 7, after which the regimen was optimized with ≥1 fully active drug and DTG continued. The primary efficacy endpoints were the mean change from baseline in plasma HIV-1 RNA at day 8 and the proportion of subjects with HIV-1 RNA <50 c/mL at week 24.
Results. Mean change in HIV-1 RNA at day 8 was −1.43 log10 c/mL, and 69% of subjects achieved <50 c/mL at week 24. Multivariate analyses demonstrated a strong association between baseline DTG susceptibility and response. Response was most reduced in subjects with Q148 + ≥2 resistance-associated mutations. DTG 50 mg BID had a low (3%) discontinuation rate due to adverse events, similar to INI-naive subjects receiving DTG 50 mg once daily.
Conclusions. DTG 50 mg BID–based therapy was effective in this highly treatment-experienced population with INI-resistant virus.
Clinical Trials Registration. www.clinicaltrials.gov (NCT01328041) and www.gsk-clinicalstudyregister.com (112574).
dolutegravir; DTG; elvitegravir resistance; integrase inhibitor; raltegravir resistance
The objective of this study was to assess whether National Institute for Health Research (NIHR) Health Technology Assessment (HTA)-funded randomised controlled trials (RCTs) published in the HTA journal were described in sufficient detail to replicate in practice.
RCTs published in the HTA journal.
98 RCTs published in the HTA journal up to March 2011. Completeness of the intervention description was assessed independently by two researchers using a checklist covering participants, intensity, schedule, materials, and settings. Disagreements in scoring were discussed within the team, and differences were explored and resolved.
Primary and secondary outcome measures
Proportion of trials rated as having a complete description of the intervention (primary outcome measure). The proportion of drug trials versus psychological and non-drug trials rated as having a complete description of the intervention (secondary outcome measures).
Components of the intervention description were missing in 68/98 (69.4%) reports. Baseline characteristics and descriptions of settings had the highest levels of completeness, with over 90% of reports complete. Reports were less complete on patient information, with 58.2% of reports having an adequate description. When looking at individual intervention types, drug intervention descriptions were more complete than non-drug interventions, with 33.3% and 30.6% levels of completeness, respectively, although this difference was not statistically significant. Only 27.3% of RCTs with psychological interventions were deemed complete, although again the difference was not statistically significant.
Ensuring the replicability of study interventions is an essential part of adding value in research. All those publishing clinical trial data need to ensure transparency and completeness in the reporting of interventions to ensure that study interventions can be replicated.
AUDIT; STATISTICS & RESEARCH METHODS
Published values for the sensitivity (IC50) of carnitine palmitoyl transferase I (CPT-I) to malonyl-CoA (M-CoA) inhibition in isolated mitochondria are inconsistent with predicted in vivo rates of fatty acid oxidation. We therefore re-examined M-CoA inhibition kinetics under varying palmitoyl-CoA (P-CoA) concentrations in both isolated mitochondria and permeabilized muscle fibres (PMF). Compared with isolated mitochondria, PMF showed an 18-fold higher IC50 (0.61 vs 0.034 μM) in the presence of 25 μM P-CoA and a 13-fold higher IC50 (6.3 vs 0.49 μM) in the presence of 150 μM P-CoA. M-CoA inhibition kinetics determined in PMF predict that CPT-I activity is inhibited by 33% in resting muscle, compared with >95% predicted from isolated mitochondria. Additionally, the ability of M-CoA to inhibit CPT-I appears to depend on P-CoA concentration, as the relative inhibitory capacity of M-CoA decreased with increasing P-CoA concentrations. Altogether, the use of PMF appears to provide an M-CoA IC50 that better reflects the predicted in vivo rates of fatty acid oxidation. These findings also demonstrate that the [P-CoA]/[M-CoA] ratio is critical for regulating CPT-I activity and may partially resolve the in vivo disconnect between M-CoA content and CPT-I flux in the context of exercise and type II diabetes.
skeletal muscle; carnitine palmitoyl transferase-I; malonyl-CoA; palmitoyl-CoA; isolated mitochondria; permeabilized fibres
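The link between a measured IC50 and predicted resting inhibition can be illustrated with a simple single-site inhibition model, fractional inhibition = [I] / ([I] + IC50). The sketch below assumes that simplified model and a hypothetical resting M-CoA concentration of 0.3 μM; it shows how an IC50 shift changes predicted inhibition, and is not the authors' kinetic analysis.

```python
def fractional_inhibition(inhibitor_uM, ic50_uM):
    """Single-site inhibition model: fraction of CPT-I activity inhibited."""
    return inhibitor_uM / (inhibitor_uM + ic50_uM)

m_coa_rest = 0.3  # hypothetical resting [M-CoA] in uM (assumption)

# IC50 measured in permeabilized fibres at 25 uM P-CoA: 0.61 uM
pmf = fractional_inhibition(m_coa_rest, 0.61)
# IC50 measured in isolated mitochondria at 25 uM P-CoA: 0.034 uM
mito = fractional_inhibition(m_coa_rest, 0.034)

print(f"PMF: {pmf:.0%}, mitochondria: {mito:.0%}")  # → PMF: 33%, mitochondria: 90%
```

Under this simplified model the low mitochondrial IC50 already predicts roughly 90% inhibition at rest; the >95% figure in the abstract comes from the authors' full kinetic analysis.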
Due to the need for medical optimization and congested operating room schedules, surgical repair is often performed at night. Studies have shown that work performed at night is associated with increased complications. The primary aim of our study was to compare the rates of complications and 30-day mortality between 2 surgical times of day: a daytime group (DTG, 07:00-15:59) and a nighttime group (NTG, 16:00-06:59).
Retrospective chart review from 2005 through 2010.
Level 1 Trauma Center.
1443 patients with hip fracture, aged ≥50 years, with isolated injury and surgical treatment of the fracture.
Main Outcomes and Measures:
Thirty-day mortality and complications: myocardial infarction, cardiac event, stroke, central nervous system event, pneumonia, urinary tract infection, postoperative wound infection, and bleeding requiring transfusion of 3 or more red blood cell units.
A total of 859 patients met the inclusion criteria; 668 patients in the DTG and 191 patients in the NTG. The 30-day mortality was 7.8%. The complication rate was 28%. No difference was found in 30-day mortality or complication rate based on the time of day the surgery was performed (P = 1.0 and P = .92, respectively). This remained unchanged when controlling for health status and surgical complexity. Age (odds ratio = 1.03/year), Charlson Comorbidity Index (CCI; odds ratio = 1.21), and American Society of Anesthesiologists (ASA; odds ratio = 1.85) score were predictive of adverse outcomes.
Surgical time of day did not affect 30-day mortality or total number of complications. Age, ASA score, and CCI were associated with adverse outcomes.
geriatric; hip fracture; surgery time; time to operation; time of day; outcomes