Enzymatic heme catabolism by heme oxygenases is conserved from bacteria to humans and proceeds through a common mechanism leading to the formation of iron, carbon monoxide, and biliverdin. The first members of a novel class of heme oxygenases were recently identified in Staphylococcus aureus (IsdG and IsdI) and were termed the IsdG-family of heme oxygenases. Enzymes of the IsdG-family form tertiary structures distinct from those of the canonical heme oxygenase family, suggesting that IsdG-family members degrade heme via a unique reaction mechanism. Herein we report that the IsdG-family of heme oxygenases degrades heme to the oxo-bilirubin chromophore staphylobilin. We also present the crystal structure of heme-bound IsdI, in which heme ruffling and constrained binding of oxygen are consistent with cleavage of the porphyrin ring at the β- or γ-meso carbons. Combined, these data establish that the IsdG-family of heme oxygenases degrades heme to a novel chromophore distinct from biliverdin.
Staphylococcus aureus; heme oxygenase
Reverse transcriptase (RT) plays an essential role in HIV-1 replication, and inhibition of this enzyme is a key component of HIV treatment. However, the use of RT inhibitors can lead to the emergence of drug-resistant variants. Until recently, most clinically relevant resistance mutations were found in the polymerase domain of RT; lately, an increasing number of resistance mutations have been identified in the connection and RNase H domains. To further explore the role of these domains, we analyzed the complete RT sequence of HIV-1 subtype B from patients failing therapy. Position A/T400 in the connection subdomain is polymorphic, but the proportion of T400 increases from 41% in treatment-naïve patients to 72% in patients failing therapy. Previous studies suggested a role for threonine in conferring resistance to nucleoside RT inhibitors. Here we report that T400 also mediates resistance to non-nucleoside RT inhibitors (NNRTIs). Susceptibility to nevirapine (NVP) and efavirenz (EFV) was reduced 5-fold and 2-fold, respectively, in the wild-type subtype B NL4.3 background. We show that the A400T substitution reduces RNase H activity. The changes in enzyme activity are remarkable given the distance of residue 400 from both the polymerase and RNase H active sites. Molecular dynamics simulations were performed, which provide a novel atomistic mechanism for the reduction in RNase H activity induced by T400: the A400T substitution was found to change the conformation of the RNase H primer grip region, and formation of an additional hydrogen bond between residues T400 and E396 may play a role in this structural change. The slower degradation of the viral RNA genome may provide more time for dissociation of the bound NNRTI from the stalled RT-template/primer complex, after which reverse transcription can resume.
Human T-lymphotropic virus type 1/2 (HTLV-1/2) infection is endemic in Brazil, but representative donor prevalence and incidence data are lacking. All blood donations (2007–2009) from three blood centers in Brazil were studied. Samples reactive on one HTLV screening test (EIA) were retested with a different EIA; dual EIA reactivity correlated strongly with a confirmatory Western blot. Prevalence, incidence, and residual transfusion risk were calculated. Among 281,760 first-time donors, 363 were positive for HTLV on both EIAs (135 per 10^5; 95% CI 122–150). Prevalence differed considerably by region, from 83 to 222 per 10^5. The overall incidence rate was 3.6 per 10^5 person-years, and the residual transfusion risk was 5.0 per 10^6 per blood unit transfused. The logistic regression model showed significant associations with age [adjusted odds ratio (aOR)=5.23 for age 50+ vs. <20], female sex (aOR=1.97), black (aOR=2.70 vs. white) and mixed (aOR=1.78 vs. white) skin color, and an inverse association with education (aOR=0.49, college vs. less than high school). HTLV testing with a dual-EIA strategy is feasible and can be useful in low-resource settings. The incidence and residual risk of HTLV-1 transmission by transfusion were relatively high and could be reduced by improving donor recruitment and selection in high-prevalence areas. Blood center data may contribute to surveillance for HTLV infection.
Strains of many infectious agents differ in fundamental epidemiological parameters, including transmissibility, virulence, and pathology. We investigated whether genotypes of Mycobacterium bovis (the causative agent of bovine tuberculosis, bTB) differ significantly in transmissibility and virulence, combining data from a nine-year survey of the genetic structure of the M. bovis population in Northern Ireland with detailed records of the cattle population during the same period. We used the size of herd breakdowns as a proxy measure of transmissibility and the proportion of skin test positive animals (reactors) that were visibly lesioned as a measure of virulence. Average breakdown size increased with herd size and varied depending on the manner of detection (routine herd testing or tracing of infectious contacts), but we found no significant variation among M. bovis genotypes in breakdown size once these factors had been accounted for. However, breakdowns due to some genotypes had a greater proportion of lesioned reactors than others, indicating that there may be variation in virulence among genotypes. These findings indicate that the current bTB control programme may be detecting infected herds sufficiently quickly that differences in virulence are not manifested in terms of outbreak sizes. We also investigated whether pathology of infected cattle varied according to M. bovis genotype, analysing the distribution of lesions recorded at post mortem inspection. We concentrated on the proportion of cases lesioned in the lower respiratory tract, which can indicate the relative importance of the respiratory and alimentary routes of infection. The distribution of lesions varied among genotypes and with cattle age, and there were also subtle differences among breeds. Age and breed differences may be related to differences in susceptibility and husbandry, but reasons for variation in lesion distribution among genotypes require further investigation.
Aims. To profile site of stroke/cerebrovascular accident, type and extent of field loss, treatment options, and outcome. Methods. Prospective multicentre cohort trial, with a standardised referral and investigation protocol of visual parameters. Results. 915 patients were recruited, with a mean age of 69 years (SD 14). 479 patients (52%) had visual field loss, and 51 patients (10%) had no visual symptoms. Almost half of symptomatic patients (n = 226) complained only of visual field loss; almost half (n = 226) also had reading difficulty, blurred vision, diplopia, and perceptual difficulties. 31% (n = 151) had visual field loss as their only visual impairment; 69% (n = 328) also had low vision, eye movement deficits, or visual perceptual difficulties. Occipital and parietal lobe strokes most commonly caused visual field loss. Treatment options included visual search training, visual awareness, typoscopes, substitutive prisms, low vision aids, refraction, and occlusive patches. At follow-up, 15 patients (7.5%) had full recovery, 78 (39%) had improvement, and 104 (52%) had no recovery; two patients (1%) had further decline of visual field. Patients with visual field loss had lower quality-of-life scores than stroke patients without visual impairment. Conclusions. Stroke survivors with visual field loss require assessment to accurately define the type and extent of loss, diagnose coexistent visual impairments, and offer targeted treatment.
Laboratory evidence has shown that cannabinoids might have a neuroprotective action. We investigated whether oral dronabinol (Δ9-tetrahydrocannabinol) might slow the course of progressive multiple sclerosis.
In this multicentre, parallel, randomised, double-blind, placebo-controlled study, we recruited patients aged 18–65 years with primary or secondary progressive multiple sclerosis from 27 UK neurology or rehabilitation departments. Patients were randomly assigned (2:1) to receive dronabinol or placebo for 36 months; randomisation was by stochastic minimisation, using a computer-generated randomisation sequence, balanced according to expanded disability status scale (EDSS) score, centre, and disease type. Maximum dose was 28 mg per day, titrated against bodyweight and adverse effects. Primary outcomes were EDSS score progression (masked assessor, time to progression of ≥1 point from a baseline score of 4·0–5·0 or ≥0·5 points from a baseline score of ≥5·5, confirmed after 6 months) and change from baseline in the physical impact subscale of the 29-item multiple sclerosis impact scale (MSIS-29-PHYS). All patients who received at least one dose of study drug were included in the intention-to-treat analyses. This trial is registered as an International Standard Randomised Controlled Trial (ISRCTN 62942668).
Of the 498 patients randomly assigned to a treatment group, 329 received at least one dose of dronabinol and 164 received at least one dose of placebo (five did not receive the allocated intervention). 145 patients in the dronabinol group had EDSS score progression (0·24 first progression events per patient-year; crude rate) compared with 73 in the placebo group (0·23 first progression events per patient-year; crude rate); HR for prespecified primary analysis was 0·92 (95% CI 0·68–1·23; p=0·57). Mean yearly change in MSIS-29-PHYS score was 0·62 points (SD 3·29) in the dronabinol group versus 1·03 points (3·74) in the placebo group. Primary analysis with a multilevel model gave an estimated between-group difference (dronabinol–placebo) of −0·9 points (95% CI −2·0 to 0·2). We noted no serious safety concerns (114 [35%] patients in the dronabinol group had at least one serious adverse event, compared with 46 [28%] in the placebo group).
Our results show that dronabinol has no overall effect on the progression of multiple sclerosis in the progressive phase. The findings have implications for the design of future studies of progressive multiple sclerosis, because lower than expected progression rates might have affected our ability to detect clinical change.
UK Medical Research Council, National Institute for Health Research Efficacy and Mechanism Evaluation programme, Multiple Sclerosis Society, and Multiple Sclerosis Trust.
Metabolic profiling of the macrophage response to 4-hydroxynonenal (HNE) demonstrates that HNE does not simply inactivate superoxide-generating enzymes but may also impair downstream signaling pathways. Multianalyte microphysiometry (MAMP) was employed to simultaneously measure perturbations in extracellular acidification, lactate production, and oxygen consumption for the examination of aerobic and anaerobic pathways. By combining activation of the oxidative burst with phorbol myristate acetate (PMA) and immunosuppression with HNE, the complex toxicity of HNE was determined to be concentration- and time-dependent. Further analysis assessed the temporal effect of HNE on reactive oxygen species (ROS) production and on protein kinase C (PKC). Decreasing PKC activity with increasing HNE levels suggests that PKC is a target for HNE adduction prior to the oxidative burst. Additionally, localization of PKC to the cell membrane was prevented by the introduction of HNE, demonstrating a consequence of HNE adduction for NADPH oxidase activation. The impairment of ROS production by HNE suggests that HNE plays a greater role in foam cell formation and tissue damage than previously recognized. Although work has been performed to understand how HNE regulates specific signaling pathways, details regarding its involvement in cellular metabolism as a whole are largely unknown. This study examines the impact of HNE on the macrophage oxidative burst and identifies PKC as a key protein in HNE-mediated suppression and the eventual metabolic response.
Oxidative burst; 4-hydroxynonenal; macrophage activation; multianalyte microphysiometry; phorbol myristate acetate
Geologic processes, including tectonics and global climate change, profoundly impact the evolution of life because they can facilitate episodes of biogeographic differentiation and influence patterns of speciation. We investigate causal links between a dramatic faunal turnover and two dominant geologic processes operating within Laurentia during the Late Ordovician: the Taconian Orogeny and GICE-related global cooling. We utilize a novel approach for elucidating the relationship between biotic and geologic changes using a time-stratigraphic, species-level evolutionary framework for articulated brachiopods from North America. Phylogenetic biogeographic analyses indicate a fundamental shift in speciation mode—from a vicariance- to a dispersal-dominated macroevolutionary regime—across the boundary between the Sandbian and Katian Stages. This boundary also corresponds to the onset of renewed intensification of tectonic activity and mountain building, the development of an upwelling zone that introduced cool, nutrient-rich waters into the epeiric seas of eastern Laurentia, and the GICE isotopic excursion. The synchronicity of these dramatic geologic, oceanographic, and macroevolutionary changes supports the influence of geologic events on biological evolution. Together, the renewed tectonic activity and oceanographic changes facilitated fundamental changes in habitat structure in eastern North America that reduced opportunities for isolation and vicariance. They also facilitated regional biotic dispersal of taxa that led to the subsequent establishment of extrabasinal (=invasive) species and may have led to a suppression of speciation within Laurentian faunas. Phylogenetic biogeographic analysis further indicates that the Richmondian Invasion was a multidirectional regional invasion event that involved taxa immigrating into the Cincinnati region from basins located near the continental margins and within the continental interior.
Due to the presence of PCR inhibitors, PCR cannot be used directly on most clinical samples, including human urine, without pre-treatment. A magnetic bead-based strategy is one potential method to collect biomarkers from urine samples and separate them from PCR inhibitors. In this report, a 1 mL urine sample was mixed within the bulb of a transfer pipette containing lyophilized nucleic acid-silica adsorption buffer and silica-coated magnetic beads. After mixing, the sample was transferred from the pipette bulb to a small-diameter tube, and captured biomarkers were concentrated using magnetic entrainment of beads through pre-arrayed wash solutions separated by small air gaps. Feasibility was tested using synthetic segments of the 140 bp tuberculosis IS6110 DNA sequence spiked into pooled human urine samples, and DNA recovery was evaluated by qPCR. Despite the presence of spiked DNA, no DNA was detectable in unextracted urine samples, presumably due to the presence of PCR inhibitors. However, following extraction with the magnetic bead-based method, we found that ∼50% of spiked TB DNA was recovered from human urine containing roughly 5×10^3 to 5×10^8 copies of IS6110 DNA. In addition, the DNA was concentrated approximately ten-fold into water: the final concentration of DNA in the eluate was 5×10^6, 14×10^6, and 8×10^6 copies/µL for 1, 3, and 5 mL urine samples, respectively. Lyophilized and freshly prepared reagents within the transfer pipette produced similar results, suggesting that long-term storage without refrigeration is possible. DNA recovery increased with the length of the spiked DNA segment, from 10±0.9% for a 75 bp sequence to 42±4% for a 100 bp segment and 58±9% for a 140 bp segment. The estimated limit of detection was 77 copies of DNA/µL of urine. The strategy presented here provides a simple means of achieving high nucleic acid recovery from easily obtained urine samples while leaving PCR inhibitors behind.
According to the 2007–2008 National Health and Nutrition Examination Survey, the prevalence of obesity in the US population was 33·8 % overall, and 34·3 % and 38·2 % among middle-aged men and women, respectively. We asked whether available blood donor data could be used for obesity surveillance.
Cross-sectional study of BMI and obesity, defined as BMI ≥ 30·0 kg/m2. Adjusted odds ratios (aOR) were calculated with logistic regression.
A network of six US blood centres.
Existing data on self-reported height and weight from blood donors, excluding persons deferred for very low body weight.
Among 1 042 817 donors between January 2007 and December 2008, the prevalence of obesity was 25·1 %; 25·7 % in men and 24·4 % in women. Obesity was associated with middle age (age 50–59 years v. <20 years: aOR =1·92 for men and 1·81 for women), black (aOR =1·57 for men and 2·35 for women) and Hispanic (aOR =1·47 for men and 1·49 for women) race/ethnicity compared with white race/ethnicity, and inversely associated with higher educational attainment (college degree v. high school or lower: aOR =0·56 for men and 0·48 for women) and double red cell donation and platelet donation.
Obesity is common among US blood donors, although of modestly lower prevalence than in the general population, and is associated with recognized demographic factors. Blood donors with higher BMI are specifically recruited for certain blood collection procedures. Blood centres can play a public health role in obesity surveillance and interventions.
BMI; Obesity; Blood donors; Prevalence; United States; Continental population groups; Age; Educational status
Transcranial magnetic stimulation (TMS) studies indicate that the observation of other people's actions influences the excitability of the observer's motor system. Motor evoked potential (MEP) amplitudes typically increase in muscles which would be active during the execution of the observed action. This ‘motor resonance’ effect is thought to result from activity in mirror neuron regions, which enhance the excitability of the primary motor cortex (M1) via cortico-cortical pathways. The importance of TMS intensity has not yet been recognised in this area of research. Low-intensity TMS predominantly activates corticospinal neurons indirectly, whereas high-intensity TMS can directly activate corticospinal axons. This indicates that motor resonance effects should be more prominent when using low-intensity TMS. A related issue is that TMS is typically applied over a single optimal scalp position (OSP) to simultaneously elicit MEPs from several muscles. Whether this confounds results, due to differences in the manner that TMS activates spatially separate cortical representations, has not yet been explored. In the current study, MEP amplitudes, resulting from single-pulse TMS applied over M1, were recorded from the first dorsal interosseous (FDI) and abductor digiti minimi (ADM) muscles during the observation of simple finger abductions. We tested whether the TMS intensity (110% vs. 130% resting motor threshold) or stimulating position (FDI-OSP vs. ADM-OSP) influenced the magnitude of the motor resonance effects. Results showed that the MEP facilitation recorded in the FDI muscle during the observation of index-finger abductions was detected only when using low-intensity TMS. In contrast, changes in the OSP had a negligible effect on the presence of motor resonance effects in either the FDI or ADM muscles.
These findings support the hypothesis that mirror neuron activity enhances M1 excitability via cortico-cortical pathways and highlight a methodological framework by which the neural underpinnings of action observation can be further explored.
Anemia is an important public health concern. Data from population-based surveys such as the National Health and Nutrition Examination Survey (NHANES) are the gold standard, but are obtained infrequently and include only small samples from certain minority groups.
We assessed whether readily available databases of blood donor hemoglobin values could be used as a surrogate for population hemoglobin values from NHANES.
Blood donor venous and fingerstick hemoglobin values were compared to 10,254 NHANES 2005-2008 venous hemoglobin values using demographically stratified analyses and ANOVA. Fingerstick hemoglobins or hematocrits were converted to venous hemoglobin estimates using regression analysis.
Venous hemoglobin values from 1,609 first-time donors correlated extremely well with NHANES data across different age, gender, and demographic groups. Cigarette smoking increased hemoglobin by 0.26 to 0.59 g/dL depending on intensity. Converted fingerstick hemoglobin from 36,793 first-time donors agreed well with NHANES hemoglobin (weighted mean hemoglobin of 15.53 g/dL for donors and 15.73 g/dL for NHANES), with similar variation in mean hemoglobin by age. However, compared to NHANES, the larger donor dataset showed reduced differences in mean hemoglobin between Blacks and other races/ethnicities.
Overall, first-time donor fingerstick hemoglobins approximate U.S. population data and represent a readily available public health resource for ongoing anemia surveillance.
Erythrocyte count; Hematologic tests; Anemia; United States; blood donors; African Continental Ancestry Group
Fingerstick blood samples are used to estimate donor venous hemoglobin (Hb).
STUDY DESIGN AND METHODS
Fingerstick Hb or hematocrit (Hct) was determined routinely for 2425 selected donors at six blood centers, along with venous Hb. Using sex and measures of iron status including absent iron stores (AIS; ferritin < 12 ng/mL), linear regression models were developed to predict venous Hb from fingerstick.
Across all subjects, fingerstick Hb was higher than venous Hb in the higher part of the clinical range, but lower in the lower part of the range. The relationship varied by sex and iron status. Across centers, a female donor had on average a venous Hb result 0.5 to 0.8 g/dL lower than a male donor with the same fingerstick Hb and iron status. Similarly, a donor with AIS had on average a venous Hb result 0.3 to 1.1 g/dL lower than an iron-replete donor with the same fingerstick value and sex. An iron-replete male donor with a fingerstick result at the cutoff (Hb 12.5 g/dL) had an acceptable expected venous Hb (12.8 to 13.8 g/dL). A female donor with AIS with a fingerstick result at the cutoff had an expected venous Hb below 12.5 g/dL (11.7 to 12.4 g/dL). Of females with AIS, 40.2% donated blood when their venous Hb was less than 12.5 g/dL.
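The kind of adjustment described above can be sketched as a simple linear model of venous Hb as a function of fingerstick Hb, sex, and iron status. All coefficients below are hypothetical placeholders (not the study's fitted values), chosen only so that the predictions fall within the ranges the abstract reports:

```python
def predict_venous_hb(fingerstick_hb: float, is_female: bool, has_ais: bool,
                      intercept: float = 1.5, slope: float = 0.94,
                      female_offset: float = -0.65, ais_offset: float = -0.70) -> float:
    """Toy linear model of the form used in the study:
    venous Hb ~ fingerstick Hb + sex + iron status (AIS).
    Coefficients are illustrative, not the published estimates.
    """
    return (intercept + slope * fingerstick_hb
            + (female_offset if is_female else 0.0)
            + (ais_offset if has_ais else 0.0))

# Iron-replete male at the 12.5 g/dL fingerstick cutoff:
# prediction lands inside the 12.8-13.8 g/dL range reported in the text.
print(round(predict_venous_hb(12.5, is_female=False, has_ais=False), 2))

# Female donor with AIS at the same cutoff:
# prediction lands inside the 11.7-12.4 g/dL range reported in the text.
print(round(predict_venous_hb(12.5, is_female=True, has_ais=True), 2))
```

The sex and iron-status offsets here mirror the reported pattern that, at a given fingerstick value, expected venous Hb is 0.5 to 0.8 g/dL lower for female donors and 0.3 to 1.1 g/dL lower for donors with AIS.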
Fingerstick is considered a useful estimator of venous Hb. However, in some donor groups, particularly female donors with AIS, fingerstick overestimates venous Hb at the donation cutoff. This significant limitation should be considered in setting donor fingerstick Hb or Hct requirements.
Blood donors are at risk of iron deficiency. We evaluated the effects of blood donation intensity on iron and hemoglobin in a prospective study.
Four cohorts of frequent and first-time or reactivated (no donation in 2 years) blood donors, female and male, totaling 2425 donors, were characterized and followed as they donated blood frequently. At enrollment and the final visit, ferritin, soluble transferrin receptor (sTfR), and hemoglobin were determined. Models to predict iron deficiency and hemoglobin deferral were developed. Iron depletion was defined at two levels: iron-deficient erythropoiesis (IDE; log(sTfR/ferritin) ≥ 2.07) and absent iron stores (AIS; ferritin < 12 ng/mL).
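The two iron-depletion thresholds above translate directly into a simple classifier. A minimal sketch (the cutoff values are from the text; the function name, the assumption that sTfR and ferritin are supplied in the same units, and the example donor values are mine):

```python
import math

IDE_CUTOFF = 2.07       # log10(sTfR/ferritin) at or above this => IDE
AIS_FERRITIN = 12.0     # ferritin below this (ng/mL) => absent iron stores

def classify_iron_status(stfr: float, ferritin_ng_ml: float) -> dict:
    """Classify a donor against the study's two iron-depletion definitions.

    Assumes sTfR and ferritin are in the same units, so the ratio is
    dimensionless; the abstract does not specify the assay units.
    """
    log_ratio = math.log10(stfr / ferritin_ng_ml)
    return {
        "IDE": log_ratio >= IDE_CUTOFF,        # iron-deficient erythropoiesis
        "AIS": ferritin_ng_ml < AIS_FERRITIN,  # absent iron stores
    }

# Hypothetical donors: one depleted (low ferritin, high sTfR), one replete.
print(classify_iron_status(stfr=1500.0, ferritin_ng_ml=8.0))
print(classify_iron_status(stfr=1000.0, ferritin_ng_ml=50.0))
```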
Among returning female first-time/reactivated donors, 20% and 51% had AIS and IDE at their final visit, respectively; corresponding proportions for males were 8% and 20%. Among female frequent donors who returned, 27% and 62% had AIS and IDE, respectively, while corresponding proportions for males were 18% and 47%. Predictors of IDE and/or AIS included a higher frequency of blood donation in the previous 2 years, a shorter interdonation interval, and being female and young; conversely, taking iron supplements reduced the risk of iron depletion. Predictors of hemoglobin deferral included female gender, Black race, and a shorter interdonation interval.
There is a high prevalence of iron depletion in frequent blood donors. Increasing the interdonation interval would reduce the prevalence of iron depletion and hemoglobin deferral. Alternatively, replacement with iron supplements may allow frequent donation without the adverse outcome of iron depletion.
Donors; Hematology – Red Cells; Blood Center Operations
In Brazil, nationally representative donor data on HIV prevalence, incidence, and residual transfusion risk are limited. The objective of this study was to analyze HIV data obtained over 24 months by the REDS-II program in Brazil.
Donations reactive on third- and fourth-generation immunoassays (IAs) were further confirmed by a less-sensitive (LS) IA algorithm and Western blot (WB). Incidence was calculated for first-time (FT) donors using the LS-IA results and for repeat donors with a model developed to include all donors with a previous negative donation. Residual risk was projected by multiplying composite FT/repeat donor incidence rates by HIV marker-negative infectious window periods.
HIV prevalence among FT donors was 92.2 per 10^5 donations. FT, repeat donor, and composite incidence were 38.5 (95%CI: 25.6–51.4), 22.5 (95%CI: 17.6–28.0), and 27.5 (95%CI: 22.0–33.0) per 100,000 person-years, respectively. Male and community donors had higher prevalence and incidence rates than female and replacement donors. The estimated residual risk of HIV transfusion-transmission was 11.3 per 10^6 donations (95%CI: 8.4–14.2), which could be reduced to 4.2 per 10^6 donations (95%CI: 3.2–5.2) by use of individual-donation nucleic acid testing (NAT).
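The incidence/window-period arithmetic described in the methods can be sketched as a one-line calculation. The window-period lengths below are illustrative values I chose because they approximately reproduce the reported point estimates; they are not figures taken from the study:

```python
def residual_risk_per_million(incidence_per_100k_py: float, window_days: float) -> float:
    """Residual transfusion risk = incidence rate x infectious window period.

    incidence_per_100k_py: infections per 100,000 person-years
    window_days: length of the marker-negative infectious window, in days
    Returns estimated risk per 1,000,000 donations.
    """
    incidence_per_py = incidence_per_100k_py / 1e5   # per person-year
    window_years = window_days / 365.25
    return incidence_per_py * window_years * 1e6

# Composite incidence of 27.5 per 100,000 person-years (from the abstract).
# An assumed serology window of ~15 days yields roughly the reported 11.3 per 10^6;
# an assumed NAT window of ~5.6 days yields roughly the reported 4.2 per 10^6.
print(round(residual_risk_per_million(27.5, 15.0), 1))
print(round(residual_risk_per_million(27.5, 5.6), 1))
```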
The incidence and residual transfusion risk of HIV infection are relatively high in Brazil. Implementation of NAT will not be sufficient to decrease transmission rates to the levels seen in the US or Europe; therefore, other measures focused on decreasing donations by at-risk individuals are also necessary.
Blood donors; HIV; Residual risk
In Brazil, most donations come from repeat donors, but there are few data on donor return behavior.
Materials and Methods
Donors who made at least one whole blood donation in 2007 were followed for two years using a large multicenter research database. Donation frequency, inter-donation intervals and their association with donor demographics, status, and type of donation were examined among three large blood centers in Brazil, two in the Southeast and one in the Northeast.
In 2007, of 306,770 allogeneic donations, 38.9% came from 95,127 first-time donors and 61.1% from 149,664 repeat donors. Through December 31, 2009, 28.1% of first-time and 56.5% of repeat donors had donated again. Overall, the median inter-donation interval was about six months: among men it was 182 and 171 days for first-time and repeat donors, respectively, and among women 212 and 200 days. Predictors of return among first-time donors were male sex (OR, 1.17; 95% CI, 1.13–1.20), community donation (OR, 2.26; 95% CI, 2.20–2.33), and age ≤24 years (OR range 0.62–0.89 for donors ≥25 years). Among repeat donors, predictors were male sex (OR, 1.35; 95% CI, 1.32–1.39), age ≥35 years (OR range 1.08–1.18 vs. ≤24 years), and community donation (OR, 2.39; 95% CI, 2.33–2.44). Differences in return by geographic region were evident, with higher return rates in the Northeast of Brazil.
These data highlight the need to develop improved communication strategies for first-time and replacement donors to convert them into repeat community donors.
Blood donors; blood donor recruitment and retention; Brazil; donor retention; type of donation; first-time donors; repeat donors
Contemporary descriptions of blood donor demographics are of value in formulating recruitment and retention strategies that help assure an adequate blood supply. The demographics of successful (SV), unsuccessful (UV; visits yielding a non-useable unit), and deferred (DV) donor visits over a 4-year period were investigated using Retrovirus Epidemiology Donor Study (REDS)-II databases.
Data came from six US blood centers participating in REDS-II. This analysis focused on demographic factors recorded for each SV, UV, and DV. Fourteen deferral categories were created that included Low Hct/Hgb; Feeling Unwell; Malaria Travel; Malaria Other; Couldn't Wait; BP/Pulse; Medical Diagnosis; Medication; Test Results; Higher Risk Behavior; vCJD; CJD; Needle Exposure/Tattoo and Other. Rates per 10,000 donor presentations were determined for each category globally and for six sub-categorizations (first time or repeat donor status, gender, race/ethnicity, age, education, and fixed or mobile donation location). Deferral rates were also calculated on simultaneous stratifications of donor status, gender, and race/ethnicity.
Out of 5,607,922 donor presentations, there were 4,553,145 SVs (81.2%), 302,828 UVs (5.4%), and 751,381 DVs (13.4%). Overall rates of deferral ranged from 0.6 per 10,000 presentations for CJD/Human Growth Hormone/Dura Mater exposure to 777 per 10,000 presentations for Low Hct/Hgb. Deferral rates differed markedly by first-time or repeat donor status, gender, race/ethnicity, and other demographics. The highest overall deferral rate was 3953 per 10,000 presentations (nearly 40%) in first-time female Asian donors, and the lowest was 560 per 10,000 (5.6%) in repeat male White donors.
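The per-10,000 rates above are straightforward proportions of donor presentations. A minimal sketch of the calculation, using the visit counts given in the abstract (the function name is mine):

```python
def rate_per_10k(count: int, presentations: int) -> float:
    """Events per 10,000 donor presentations."""
    return 10_000 * count / presentations

# Visit counts from the abstract.
PRESENTATIONS = 5_607_922
for label, count in [("successful (SV)", 4_553_145),
                     ("unsuccessful (UV)", 302_828),
                     ("deferred (DV)", 751_381)]:
    print(f"{label}: {rate_per_10k(count, PRESENTATIONS):.0f} per 10,000")
```

Dividing each rate by 100 recovers the percentages quoted in the text (e.g. the deferred-visit rate of about 1340 per 10,000 corresponds to 13.4%).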
Successful donation visits according to demographic characteristics need to be placed within the context of donation attempts that are unsuccessful and donors who are deferred. The deferral rates indicate that the burden of donor deferral is high. Efforts to expand the diversity of the donor base through recruitment of minority donors may bring additional challenges because certain deferral reasons were proportionally much higher in these groups.
The importance of adverse reactions for donor safety has recently received significant attention, but their role in subsequent donation behavior has not been thoroughly investigated.
Study Design and Methods
Six REDS-II blood centers provided data for this analysis. Summary minor and major adverse reaction categories were created. The influence of adverse reactions on donation was examined in two ways: Kaplan-Meier (KM) curves were generated to determine the cumulative pattern of first return, and adjusted odds ratios (AORs) for demographic and other factors positively and negatively associated with return were estimated using multivariable logistic regression.
Donors who had major reactions had longer times to return than donors with minor or no reactions. The AOR of returning for donors with major reactions was 0.32 (95% CI 0.28 – 0.37) and with minor reactions 0.59 (95% CI 0.56 – 0.62) when compared to donors who did not have reactions. Conversely, the most important factors positively associated with return were the number of donations in the previous year and increasing age. Subsequent return, whether a major, minor or no reaction occurred, varied by blood center. Factors that are associated with the risk of having adverse reactions were not substantial influences on the return following adverse reactions.
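For interpretation, an adjusted odds ratio rescales the odds of return rather than the return probability directly. A small sketch of that conversion, using the AORs reported above with a hypothetical 50% baseline return probability for donors with no reaction:

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Apply an odds ratio to a baseline probability; return the new probability."""
    odds = p_baseline / (1.0 - p_baseline)   # convert probability to odds
    new_odds = odds * odds_ratio             # odds ratios act multiplicatively on odds
    return new_odds / (1.0 + new_odds)       # convert back to a probability

# Assuming (hypothetically) a 50% return probability with no reaction:
print(round(apply_odds_ratio(0.50, 0.32), 3))  # major reaction (AOR 0.32)
print(round(apply_odds_ratio(0.50, 0.59), 3))  # minor reaction (AOR 0.59)
```

Under that assumed baseline, a major reaction would drop the return probability from 50% to roughly 24%, and a minor reaction to roughly 37%, which conveys the practical size of the reported effects.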
Having an adverse reaction leads to significantly lower odds of subsequent donation irrespective of previous donation history. Factors that have been associated with a greater risk of adverse reactions were not important positive or negative predictors of return following a reaction.
Brazilian blood centers ask candidate blood donors about their number of sexual partners in the last 12 months. Candidates who report a number over the limit are deferred. We studied the implications of this practice for blood safety.
STUDY DESIGN AND METHODS
We analyzed demographic characteristics, number of heterosexual partners, and disease marker rates among 689,868 donations from three Brazilian Centers between July 2007 and December 2009. Donors were grouped based on maximum number of partners allowed in the last 12 months for each center. Chi-square and logistic regression analysis were conducted to examine associations between demographic characteristics, number of sex partners, and individual and overall positive markers rates for HIV, HTLV-1/2, HBV, HCV, and syphilis.
First-time donation, younger age, and more education were associated with a higher number of recent sexual partners, as was male gender in São Paulo and Recife (p < 0.001). Serologic markers for HIV, syphilis, and overall positivity were associated with multiple partners in São Paulo and Recife (p < 0.001), but not in Belo Horizonte (p = 0.05, 0.94, and 0.75, respectively). In logistic regression analysis, the number of recent sexual partners was associated with positive serologic markers (AOR = 1.2–1.5), especially HIV (AOR = 1.0–4.4).
The number of recent heterosexual partners was associated with HIV positivity and with overall rates of serologic markers of sexually transmitted infections. The association was not consistent across centers, making it difficult to define the best cutoff value. These findings suggest that the number of recent heterosexual partners is a potentially important deferral criterion for improving blood safety in Brazil.
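Chi-square analyses of this kind compare marker positivity across partner-count groups. A hand-rolled Pearson chi-square for a 2×2 table, with invented counts purely for illustration (the study used real donation records), might look like:

```python
# Pearson chi-square test and odds ratio for a 2x2 table
# (partners-over-limit vs. marker positivity). Counts are invented.

def chi_square_2x2(a, b, c, d):
    """Table: [[a, b], [c, d]] -> (chi-square statistic, odds ratio)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in [(a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)]:
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    odds_ratio = (a * d) / (b * c)
    return chi2, odds_ratio

# e.g., 30 positives among 10,000 multi-partner donors vs.
#       40 positives among 50,000 other donors (hypothetical)
chi2, or_ = chi_square_2x2(30, 9970, 40, 49960)
print(round(chi2, 1), round(or_, 2))
```

With 1 degree of freedom, a statistic this large corresponds to p < 0.001, the threshold reported for São Paulo and Recife.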
We used a multicenter retrospective cohort study design to evaluate whether human leukocyte antigen (HLA) antibody donor screening would reduce the risk of transfusion-related acute lung injury (TRALI) or possible TRALI.
STUDY DESIGN AND METHODS
In the Leukocyte Antibody Prevalence Study-II (LAPS-II), we evaluated pulmonary outcomes in recipients of 2596 plasma-rich blood components (transfusable plasma and plateletpheresis) sent to participating hospitals; half of the components were collected from anti-HLA–positive donors (study arm) and half from anti-HLA–negative donors (control arm) matched by sex, parity, and blood center. A staged medical record review process was used. Final recipient diagnosis was based on case review by a blinded expert panel of pulmonary or critical care physicians.
TRALI incidence was 0.59% (seven cases) in study arm recipients versus 0.16% (two cases) in control arm recipients, for an odds ratio (OR) of 3.6 (95% confidence interval [CI], 0.7–17.4; p = 0.10). For possible TRALI cases (nine study arm, eight control arm), the OR was 1.2 (95% CI, 0.4–3.0; p = 0.81), and for TRALI and possible TRALI aggregated together, it was 1.7 (95% CI, 0.7–3.7; p = 0.24). Transfusion-associated circulatory overload incidence was virtually identical in the two arms (1.17% and 1.22%, respectively; OR, 1.0; p = 1.0).
TRALI incidence in recipients of anti-HLA–positive components was relatively low for a look-back study (1 in 170) and was higher than in the control arm, but the difference did not reach significance. Based on this trend, the data are consistent with the likelihood that TRALI risk is decreased by selecting high-volume plasma components for transfusion from donors at low risk of having HLA antibodies.
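The reported odds ratio and confidence interval can be approximately reproduced from the case counts. In the sketch below, the per-arm denominators are assumptions back-calculated from the reported incidences (0.59% and 0.16%), not values taken from the study:

```python
# Odds ratio with a Woolf (log-OR) 95% confidence interval.
# Denominators 1186 and 1250 are assumptions back-calculated from the
# reported incidences, not figures from the study itself.
import math

def odds_ratio_ci(cases1, n1, cases2, n2, z=1.96):
    a, b = cases1, n1 - cases1   # study arm: cases, non-cases
    c, d = cases2, n2 - cases2   # control arm: cases, non-cases
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(7, 1186, 2, 1250)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

The result lands close to the published OR of 3.6 (95% CI, 0.7–17.4); the wide interval reflects the small number of TRALI cases.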
(See the editorial commentary by Katz, on pages 867–9 and see the article by Stramer et al, on pages 886–94.)
Background. Genetic variations of human immunodeficiency virus (HIV), hepatitis C virus (HCV), and hepatitis B virus (HBV) can affect diagnostic assays and therapeutic interventions. Recent changes in prevalence of subtypes/genotypes and drug/immune-escape variants were characterized by comparing recently infected vs more remotely infected blood donors.
Methods. Infected donors were identified among approximately 34 million US blood donations, 2006–2009; incident infections were defined as having no or low antiviral antibody titers. Viral genomes were partially sequenced.
Results. Of 321 HIV strains (50% incident), 2.5% were non-B HIV subtypes. Protease and reverse transcriptase (RT) inhibitor resistance mutations were found in 2% and 11% of infected donors, respectively. Subtype frequencies among 278 HCV strains (31% incident) followed the order 1a > 1b > 3a > 2b > 2a > 4a > 6d, 6e; subtype 3a was more frequent in incident cases, whereas 1b was more frequent in prevalent cases (P = .04). Twenty subgenotypes among 193 HBV strains (26% incident) yielded higher frequencies of A2 in incident cases and higher frequencies of A1, B2, and B4 in prevalent cases (P = .007). No HBV drug resistance mutations were detected. Six percent of incident vs 26% of prevalent HBV strains contained antibody neutralization escape mutations (P = .01).
Conclusions. Viral genetic variant distribution in blood donors was similar to that seen in high-risk US populations. Blood-borne viruses detected through large-scale routine screening of blood donors can complement molecular surveillance studies of highly exposed populations.
The miniaturization of electrochemical sensors allows for minimally invasive, cost-effective examination of cellular responses with high efficacy. In this work, an ink-jet printed superoxide dismutase electrode was designed, characterized, and utilized as a novel microfluidic device to examine the metabolic response of a 2D layer of macrophage cells. Since superoxide production is one of the first indicators of oxidative burst, macrophage cells were exposed within the microfluidic device to phorbol myristate acetate (PMA), a known promoter of oxidative burst, and the production of superoxide was measured. A 46 ± 19% increase in current was measured over a 30-min period, demonstrating successful detection of sustained macrophage oxidative burst; this corresponds to an increase in the superoxide production rate of 9 ± 3 attomoles/cell/sec. Linear sweep voltammetry was utilized to show the selectivity of this sensor for superoxide over hydrogen peroxide. This novel controllable microfluidic system can be used to study the impact of multiple effectors from a large number of bacteria or other invaders along a 2D layer of macrophages, providing an in vitro platform for improved electrochemical studies of metabolic responses.
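The conversion from a measured current increase to a per-cell superoxide production rate follows Faraday's law (one electron transferred per superoxide anion oxidized at the electrode). A back-of-the-envelope sketch, in which the current increase and cell count are illustrative assumptions rather than values reported in the work, is:

```python
# Faraday's-law conversion: electrode current -> superoxide flux.
# The 8.7 nA increase and 10,000-cell count below are illustrative
# assumptions, not figures from the study.

F = 96485.0          # Faraday constant, C/mol
N_ELECTRONS = 1      # O2- -> O2 + e- at the electrode

def superoxide_rate(delta_i_amps, n_cells):
    """Return the per-cell superoxide production rate in attomol/cell/s."""
    mol_per_sec = delta_i_amps / (N_ELECTRONS * F)
    return mol_per_sec / n_cells * 1e18   # mol -> attomol

rate = superoxide_rate(8.7e-9, 10_000)   # assumed 8.7 nA over ~10,000 cells
print(round(rate, 1))
```

Under these assumed inputs the sketch yields a rate on the order of the 9 attomoles/cell/sec reported above.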
Microfluidic; Electrochemistry; Superoxide Dismutase; Ink-jet printing; Oxidative Burst