We evaluated the Colorectal Cancer (CRC) Oncology Watch intervention, a clinical reminder implemented in 2008 in Veterans Integrated Service Network 7 (comprising eight hospitals) to improve CRC screening rates.
Patients and Methods
Veterans Affairs (VA) administrative data were used to construct four cross-sectional groups of veterans at average risk, age 50 to 64 years; one group was created for each of the following years: 2006, 2007, 2009, and 2010. We applied hospital fixed effects for estimation, using a difference-in-differences model in which the eight hospitals served as the intervention sites, and the other 121 hospitals served as controls, with 2006 to 2007 as the preintervention period and 2009 to 2010 as the postintervention period.
The sample included 4,352,082 veteran-years over the 4 years. The adherence rates were 37.6%, 31.6%, 34.4%, and 33.2% in the intervention sites in 2006, 2007, 2009, and 2010, respectively, and the corresponding rates in the controls were 31.0%, 30.3%, 32.3%, and 30.9%. Regression analysis showed that among those eligible for screening, the intervention was associated with a 2.2–percentage point decrease in the likelihood of adherence (P < .001). Additional analyses showed that the intervention was associated with a 5.6–percentage point decrease in the likelihood of screening colonoscopy among the adherent, but with an increase in total colonoscopies (all indications) of 3.6 per 100 veterans age 50 to 64 years.
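As a rough check, the unadjusted difference-in-differences implied by these rates can be computed directly; note that the abstract's 2.2-point estimate comes from a regression with hospital fixed effects, so the raw figure differs somewhat:

```python
# Unadjusted difference-in-differences from the reported adherence rates.
# Pre = 2006-2007, Post = 2009-2010 (simple averages of the two years).
intervention_pre = (37.6 + 31.6) / 2   # 34.6
intervention_post = (34.4 + 33.2) / 2  # 33.8
control_pre = (31.0 + 30.3) / 2        # 30.65
control_post = (32.3 + 30.9) / 2       # 31.6

did = (intervention_post - intervention_pre) - (control_post - control_pre)
print(round(did, 2))  # -1.75 percentage points (unadjusted)
```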
The intervention had little impact on CRC screening rates for the studied population. This absence of favorable impact may have been caused by an unintentional shift of limited VA colonoscopy capacity from average-risk screening to higher-risk screening and to CRC surveillance, or by physician fatigue resulting from the large number of clinical reminders implemented in the VA.
Identification of serious adverse drug reactions (sADRs) associated with commonly used drugs can elude detection for years. Reye’s syndrome (RS), nephrogenic systemic fibrosis (NSF), and pure red cell aplasia (PRCA) among chronic kidney disease (CKD) patients were recognized in 1951, 2000, and 1998, respectively. Reports associating these syndromes with aspirin, gadodiamide, and epoetin were published 29, 6, and 4 years later, respectively. We obtained primary information from clinicians who identified causes of these sADRs and reviewed factors contributing to delayed identification of these toxicities. Overall, 3,500 aspirin-associated RS cases in the United States, 1,605 gadolinium-associated NSF cases, and 181 epoetin-associated PRCA cases were reported. Delays in FDA regulation of over-the-counter medications and administration of aspirin to children contributed to development of RS. For NSF, in 1996, the Danish Medicines Agency approved high-dose gadodiamide administration to CKD patients undergoing MR scans. Overall, 88% of Danish NSF cases were from two hospitals and 97% of United States NSF cases were from 60 hospitals. These hospitals frequently administered high doses of gadodiamide to CKD patients. Another factor was the decision to administer linear chelated contrast agents rather than lower-risk macrocyclic chelated agents. For PRCA, increased use of subcutaneous epoetin formulations in CKD patients, in part due to convenience and cost-savings considerations, and a European regulatory requirement to remove albumin as a stabilizer led to toxicity. Overall, 81, 13, and 17 years elapsed between drug introduction into practice and identification of a causal relationship for aspirin, erythropoietin, and gadodiamide, respectively. A substantial decline in new cases of these sADRs occurred within two years of identification of the offending drug.
Clinicians should be vigilant for sADRs, even for frequently prescribed pharmaceuticals, particularly in settings where formulation or regulatory changes have occurred, or when over-the-counter, off-label, or pediatric use is common.
pure red cell aplasia; nephrogenic systemic fibrosis; Reye’s syndrome
thrombotic thrombocytopenic purpura; ticlopidine; ADAMTS13; ADAMTS13 inhibitor; Japan
Rapid public health response to a large-scale anthrax attack would reduce overall morbidity and mortality. However, there is uncertainty about the optimal cost-effective response strategy based on timing of intervention, public health resources, and critical care facilities. We conducted a decision analytic study to compare response strategies to a theoretical large-scale anthrax attack on the Chicago metropolitan area beginning either Day 2 or Day 5 after the attack. These strategies correspond to the policy options set forth by the Anthrax Modeling Working Group for population-wide responses to a large-scale anthrax attack: (1) postattack antibiotic prophylaxis, (2) postattack antibiotic prophylaxis and vaccination, (3) preattack vaccination with postattack antibiotic prophylaxis, and (4) preattack vaccination with postattack antibiotic prophylaxis and vaccination. Outcomes were measured in costs, lives saved, quality-adjusted life-years (QALYs), and incremental cost-effectiveness ratios (ICERs). We estimated that postattack antibiotic prophylaxis of all 1,390,000 anthrax-exposed people beginning on Day 2 after attack would result in 205,835 infected victims, 35,049 fulminant victims, and 28,612 deaths. Only 6,437 (18.5%) of the fulminant victims could be saved with the existing critical care facilities in the Chicago metropolitan area. Mortality would increase to 69,136 if the response strategy began on Day 5. Including postattack vaccination with antibiotic prophylaxis of all exposed people reduces mortality and is cost-effective for both Day 2 (ICER=$182/QALY) and Day 5 (ICER=$1,088/QALY) response strategies. Increasing ICU bed availability significantly reduces mortality for all response strategies. We conclude that postattack antibiotic prophylaxis and vaccination of all exposed people is the optimal cost-effective response strategy for a large-scale anthrax attack. 
Our findings support the US government's plan to provide antibiotic prophylaxis and vaccination for all exposed people within 48 hours of the recognition of a large-scale anthrax attack. Future policies should consider expanding critical care capacity to allow for the rescue of more victims.
Rapid public health response to a large-scale anthrax attack would reduce overall morbidity and mortality, but what is the optimal cost-effective response strategy for timing of intervention, public health resources, and critical care facilities? Using a hypothetical large-scale anthrax attack on the Chicago metropolitan area, this study compared response strategies that would begin either 2 days or 5 days after the attack and would consist of administering prophylaxis and vaccine in various combinations. The findings support the government's plan to provide antibiotic prophylaxis and vaccination for all exposed people within 48 hours of the recognition of a large-scale anthrax attack.
Although amiodarone is the most commonly prescribed antiarrhythmic drug, its use is limited by serious toxicities, including optic neuropathy. Current reports of amiodarone-associated optic neuropathy identified from the Food and Drug Administration's Adverse Event Reporting System (FDA-AERS) and published case reports were reviewed. A total of 296 reports were identified: 214 from AERS, 59 from published case reports, and 23 from adverse event reports for patients enrolled in clinical trials. Mean duration of amiodarone therapy before vision loss was 9 months (range 1-84 months). Insidious onset of amiodarone-associated optic neuropathy (44%) was the most common presentation, and nearly one-third were asymptomatic. Optic disc edema was present in 85% of cases. Following drug cessation, 58% had improved visual acuity, 21% were unchanged, and 21% had further decreased visual acuity. Legal blindness (< 20/200) was noted in at least one eye in 20% of cases. Close ophthalmologic surveillance of patients throughout amiodarone therapy is warranted.
amiodarone; vision loss; optic neuropathy
In 2006, nephrologists in Denmark unexpectedly identified chronic kidney disease (CKD) patients with a new syndrome, nephrogenic systemic fibrosis (NSF). Subsequently, 1,603 NSF patients were reported to the Food and Drug Administration. Sixty hospitals in the USA accounted for 93% of these cases, and two hospitals in Denmark accounted for 4% of these reports. We review Denmark’s identification and subsequent rapid eradication of NSF.
NSF reports from clinicians, the Danish Medicines Agency (DMA) and gadolinium-based contrast agents (GBCAs) manufacturers were reviewed (2002–11).
In 1994, the DMA approved a non-ionic linear GBCA, gadodiamide (0.1 mmol/kg), for magnetic resonance imaging (MRI), with a renal insufficiency contraindication. In 1996, 0.3 mmol/kg dosing received DMA approval. In 1998, the DMA removed the renal contraindication. In 1997 and 2002, radiologists at Skejby Hospital and Herlev Hospital, respectively, began performing gadodiamide-enhanced magnetic resonance angiography scans (0.3 mmol/kg) of CKD patients. In 2005, Herlev clinicians requested assistance in evaluating etiological causes of NSF among 10 CKD patients who had developed the syndrome. This investigation, focusing on infectious agents, was inconclusive. In 2006, Herlev clinicians reported that of 108 CKD patients who had received gadodiamide-enhanced MRI, 20 had developed probable NSF. Herlev radiologists voluntarily discontinued administering gadodiamide to all patients, and no new NSF cases developed at Herlev Hospital subsequently. After meeting with Herlev radiologists, Skejby radiologists also discontinued administering gadodiamide to all patients. In 2007, the European Medicines Agency and the DMA contraindicated gadodiamide administration to CKD patients. In 2008, in response to these advisories, radiologists at the other 36 Danish hospitals discontinued administering gadodiamide to all patients, following the practices adopted at Skejby and Herlev Hospitals. In 2009, clinicians at Skejby Hospital reported that a look-back survey identified 33 CKD patients with NSF developing after GBCA-enhanced MRIs performed between 1999 and 2007. In 2010, an independent review, commissioned by the Minister of Health, concluded that the DMA had erred in rescinding gadodiamide’s renal insufficiency contraindication in 1998 and that this error was a key factor in the development of NSF in Denmark. In 2011, three NSF cases associated with macrocyclic GBCAs and three NSF patients with Stage 3 or 4 CKD from Skejby Hospital were reported.
A confluence of factors led to the development and eradication of NSF in Denmark.
chronic kidney disease; gadodiamide; gadolinium; magnetic resonance angiography; nephrogenic systemic fibrosis
To report on the use of pharmaceutical patient assistance programs (PAPs) in the outpatient pharmacy at the largest tertiary cancer center in the United States.
We conducted a retrospective (July 1, 2006–December 31, 2007) cross-sectional analysis of outpatient pharmacy, medical, and cancer registry records at the cancer center. The cancer center identified 104 medications available through PAPs. Study-eligible patients received at least one of these medications, either as a PAP user (cases) or a PAP non-user (controls). Binary logit regression models predicted PAP use, and descriptive statistics compared PAP user and non-user medication fills.
Of 25,552 cancer patients who received an outpatient medication during the study period, 1,929 met study criteria (n=950 PAP users, 979 PAP non-users). In comparison to controls, PAP users were more likely to be uninsured (odds ratio (OR)=4.60, 95% confidence interval (CI): 2.118, 9.970), indigent (OR=16.95, 95% CI: 6.845, 41.960), and < 65 years old (OR=2.31, 95% CI: 1.517, 3.509). Of the most frequently dispensed medications to PAP users from PAPs (n=5,271), 88% (n=4,936) were for supportive care (e.g., nausea/vomiting). PAPs provided 35% (n=842) of the most common anticancer agents administered to PAP users (n=1,296), accounting for a monthly mean of $55,000 in pharmaceutical expenditures.
In the cancer center’s outpatient pharmacy, PAPs provided financial support for about a third of the most commonly used therapies, primarily for supportive care indications, for a small percentage of eligible cancer patients.
cancer; supportive care; medication assistance; anticancer agents; outpatient pharmacy
Thrombotic thrombocytopenic purpura (TTP) is a type of thrombotic microangiopathy (TMA). Studies report that the majority of TTP patients present with a deficiency of ADAMTS13 activity. In a database of TMA patients in Japan identified between 1998 and 2008, 186 patients with first onset of acquired idiopathic (ai) ADAMTS13-deficient TTP (ADAMTS13 activity <5%) were diagnosed. The median age of onset of TTP in this group of patients was 54 years, 54.8% were female, 75.8% had renal involvement, 79.0% had neurologic symptoms, and 97.8% had detectable inhibitors to ADAMTS13 activity. Younger patients were less likely to present with renal or neurologic dysfunction (p<0.01), while older patients were more likely to die during the TTP hospitalization (p<0.05). Findings from this cohort in Japan differ from those reported previously from the United States, Europe, and Korea with respect to age at onset (two decades younger in the other cohort) and gender composition (60% to 100% female in the other cohort). We conclude that in one of the largest cohorts of ai-TTP with severe deficiency of ADAMTS13 activity reported to date, demographic characteristics differ in Japanese patients relative to those reported from a large Caucasian registry from Western societies. Additional studies exploring these findings are needed.
Serious adverse drug event (sADE) reporting to Institutional Review Boards (IRB) is essential to ensure pharmaceutical safety. However, the quality of these reports has not been studied. Safety reports are especially important for cancer drugs that receive accelerated Food and Drug Administration approval, like imatinib, as preapproval experience with these drugs is limited. We evaluated the quality, accuracy, and completeness of sADE reports submitted to an IRB.
sADE reports submitted to an IRB from 14 clinical trials with imatinib were reviewed. Structured case report forms, containing detailed clinical data fields and a validated causality assessment instrument, were developed. Two forms were generated for each ADE, the first populated with data abstracted from the IRB reports, and the second populated with data from the corresponding clinical record. Completeness and causality assessments were evaluated for each of the two sources, and then compared. Accuracy (concordance between sources) was also assessed.
Of 115 sADEs reported to the IRB for 177 cancer patients, overall completeness of adverse event descriptions was 2.4-fold greater for structured case report forms populated with information from the clinical record versus the corresponding forms from IRB reports (95.0% versus 40.3%, P < 0.05). Information supporting causality assessments was recorded 3.5-fold more often in primary data sources versus IRB adverse event descriptions (93% versus 26%, P < 0.05). Some key clinical information was discrepant between the two sources.
The use of structured syndrome-specific case report forms could enhance the quality of reporting to IRBs, thereby improving the safety of pharmaceuticals administered to cancer patients.
There has been a sea change in the use of erythropoiesis-stimulating agents (ESAs) for anemic persons with chronic kidney disease (CKD) and for cancer patients undergoing chemotherapy. An important area that has not been addressed previously is the CKD patient who also has a malignancy. Clinical guidelines exist that outline recommended treatments for each disease, but the intersection of the two disease processes presents difficult decisions for patients and physicians. Herein, we review the background underlying recent revisions in clinical alerts and guidelines for ESAs, and provide guidance for treating anemia among CKD patients who are receiving no therapy, chemotherapy with curative intent, or chemotherapy with palliative intent. The guiding principle is that comprehensive assessment of risks and benefits in the relevant clinical setting is imperative.
An emerging issue in the proxy literature is whether specifying different proxy viewpoints contributes to different health-related quality of life (HRQL) assessments, and if so, how might each perspective be informative in medical decision making. The aims of this study were to determine if informal caregiver assessments of patients with prostate cancer differed when prompted from both the patient perspective (proxy-patient) and their own viewpoint (proxy-proxy), and to identify factors associated with differences in proxy perspectives (ie, the intraproxy gap).
Research Design and Methods
Using a cross-sectional design, prostate cancer patients and their informal caregivers were recruited from urology clinics in the Jesse Brown Veterans Affairs Healthcare System in Chicago. Dyads assessed HRQL using the EQ-5D visual analog scale (VAS) and EORTC QLQ-C30.
Of 87 dyads, most caregivers were female (83%) and were spouses/partners (58%). Mean difference scores between proxy-patient and proxy-proxy perspectives were statistically significant for QLQ-C30 physical and emotional functioning, and VAS (all P < 0.05), with the proxy-patient perspective closer to patient self-report. Emotional functioning had the largest difference: mean 6.0 (SD 12.8; effect size = 0.47). Factors weakly correlated with the intraproxy gap included relationship (spouse) and proxy gender for role functioning, and health literacy (limited/functional) for physical functioning (all P < 0.05, 0.20 < r < 0.35).
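The reported effect size follows directly from the mean difference and its standard deviation (a Cohen's-d-style standardized mean difference; that this is the exact denominator the study used is an assumption here):

```python
# Standardized mean difference for emotional functioning,
# using the reported mean difference and its standard deviation.
mean_diff = 6.0  # mean proxy-patient vs proxy-proxy difference
sd = 12.8        # standard deviation of the difference scores

effect_size = mean_diff / sd
print(round(effect_size, 2))  # 0.47
```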
Meaningful differences between proxy-patient and proxy-proxy perspectives on mental health were consistent with a conceptual framework for understanding proxy perspectives. Prompting different proxy viewpoints on patient health could help clinicians identify patients who may benefit from clinical intervention.
quality of life; proxy; prostate cancer; Veterans
We sought to describe clinical and laboratory findings for a large cohort of patients with thienopyridine-associated thrombotic thrombocytopenic purpura (TTP).
The thienopyridine derivatives, ticlopidine and clopidogrel, are the 2 most common drugs associated with TTP in databases maintained by the U.S. Food and Drug Administration (FDA).
Clinical reports of TTP associated with clopidogrel and ticlopidine were identified from medical records, published case reports, and FDA case reports (n = 128). Duration of thienopyridine exposure, clinical and laboratory findings, and survival were recorded. ADAMTS13 activity (n = 39) and inhibitor (n = 30) were measured for a subset of individuals.
Compared with clopidogrel-associated TTP cases (n = 35), ticlopidine-associated TTP cases (n = 93) were more likely to have received more than 2 weeks of drug (90% vs. 26%), to be severely thrombocytopenic (84% vs. 60%), and to have normal renal function (72% vs. 45%) (p < 0.01 for each). Compared with TTP patients with ADAMTS13 activity >15% (n = 13), TTP patients with severely deficient ADAMTS13 activity (n = 26) were more likely to have received ticlopidine (92.3% vs. 46.2%, p < 0.003). Among patients who developed TTP >2 weeks after thienopyridine, therapeutic plasma exchange (TPE) increased likelihood of survival (84% vs. 38%, p < 0.05). Among patients who developed TTP within 2 weeks of starting thienopyridines, survival was 77% with TPE and 78% without.
Thrombotic thrombocytopenic purpura is a rare complication of thienopyridine treatment. This drug toxicity appears to occur by 2 different mechanistic pathways, characterized primarily by time of onset before versus after 2 weeks of thienopyridine administration. If TTP occurs after 2 weeks of ticlopidine or clopidogrel therapy, therapeutic plasma exchange must be promptly instituted to enhance likelihood of survival.
US veterans have been shown to be a vulnerable population with high cancer rates, and cancer care quality in Veterans Affairs (VA) hospitals is the focus of a congressionally mandated review. We examined rates of surgery and chemotherapy use among veterans with colon cancer at VA and non-VA facilities in California to gain insight into factors associated with quality of cancer care.
A retrospective cohort of incident colon cancer patients from the California Cancer Registry, who were ≥ 66 years old and eligible to use VA and Medicare between 1999 and 2001, was observed for 6 months after diagnosis.
Among 601 veterans with colon cancer, 72% were initially diagnosed and treated in non-VA facilities. Among veterans with stage I to III cancer, those diagnosed and initially treated in VA facilities experienced similar colectomy rates as those at non-VA facilities. Stage III patients diagnosed and initially treated in VA versus non-VA facilities had similar odds of receiving adjuvant chemotherapy. In both settings, older patients had lower odds of receiving chemotherapy than their younger counterparts even when race and comorbidity were considered (age 76 to 85 years: odds ratio [OR] = 0.18; 95% CI, 0.07 to 0.46; age ≥ 86 years: OR = 0.17; 95% CI, 0.04 to 0.73).
In California, older veterans with colon cancer used both VA and non-VA facilities for cancer treatment, and odds of receiving cancer-directed surgery and chemotherapy were similar in both systems. Among stage III patients, older age lowered odds of receiving adjuvant chemotherapy in both systems. Further studies should continue to explore potential health system effects on quality of colon cancer care across the United States.
Colorectal cancer (CRC) screening remains underutilized in the United States. Prior studies reporting the cost effectiveness of randomized interventions to improve CRC screening have not been replicated in the setting of small physician practices. We recently conducted a randomized trial evaluating an academic detailing intervention in 264 small practices in geographically diverse New York City communities. The objective of this secondary analysis is to assess the cost effectiveness of this intervention.
A total of 264 physician offices were randomly assigned to usual care or to a series of visits from trained physician educators. CRC screening rates were measured at baseline and 12 months. The intervention costs were measured and the incremental cost-effectiveness ratio (ICER) was derived. Sensitivity analyses were based on varying cost and effectiveness estimates.
Academic detailing was associated with a 7% increase in CRC screening with colonoscopy. The total intervention cost was $147,865, and the ICER was $21,124 per percentage point increase in CRC screening rate. Sensitivity analyses that varied the costs of the intervention and the average medical practice size were associated with ICERs ranging from $13,631 to $36,109 per percentage point increase in CRC screening rates.
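The reported ICER is simply the total intervention cost divided by the screening-rate gain; a minimal sketch, assuming the 7-point increase is the sole effectiveness measure:

```python
# Incremental cost-effectiveness ratio (ICER) from the reported figures.
total_cost = 147_865  # total intervention cost, USD
gain_points = 7       # percentage-point increase in CRC screening rate

icer = total_cost / gain_points
print(round(icer))  # 21124 USD per percentage-point increase
```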
A comprehensive, multicomponent academic detailing intervention conducted in small practices in metropolitan New York was clinically effective in improving CRC screening rates, but was not cost effective.
In April 2009, the FDA retracted a warning asserting that ceftriaxone and intravenous calcium products should not be coadministered to any patient to prevent precipitation events leading to end-organ damage. Following that announcement, we sought to evaluate whether the retraction was justified. A search of the FDA Adverse Event Reporting System was conducted to identify any ceftriaxone-calcium interactions that resulted in serious adverse drug events. Ceftazidime-calcium was used as a comparator agent. One hundred four events with ceftriaxone-calcium and 99 events with ceftazidime-calcium were identified. Adverse drug events were recorded according to the listed description of drug involvement (primary or secondary suspect) and were interpreted as probable, possible, unlikely, or unrelated. For ceftriaxone-calcium-related adverse events, 7.7% and 20.2% of the events were classified as probable and possible for embolism, respectively. Ceftazidime-calcium resulted in fewer probable embolic events (4%) but more possible embolic events (30.3%). Among cases that considered ceftriaxone or ceftazidime and calcium as the primary or secondary drug, one case was classified as a probable embolic event. That patient received ceftriaxone-calcium and died, although an attribution of causality was not possible. Our analysis suggests a lack of support for the occurrence of ceftriaxone-calcium precipitation events in adults. The results of the current analysis reinforce the revised FDA recommendations suggesting that patients >28 days old may receive ceftriaxone and calcium sequentially, and provide a transparent and reproducible methodology for such evaluations.
Breast cancer mortality rates in South Carolina (SC) are 40% higher among African-American (AA) than European-American (EA) women. Proposed reasons include race-associated variations in care and/or tumor characteristics, which may be subject to income effects. We evaluated race-associated differences in tumor biologic phenotype and stage among low-income participants in a government-funded screening program.
Best Chance Network (BCN) data were linked with the SC Central Cancer Registry. Characteristics of breast cancers diagnosed in BCN participants aged 47–64 years during 1996–2006 were abstracted. Race-specific case proportions and incidence rates based on estrogen receptor (ER) status and histologic grade were estimated.
Among 33,880 low-income women accessing BCN services, repeat breast cancer screening utilization was poor, especially among EAs. Proportionally, stage at diagnosis did not differ by race (607 cancers, 53% among AAs), with about 40% advanced stage. Compared to EAs, invasive tumors in AAs were 67% more likely (proportions) to be of poor-prognosis phenotype (both ER-negative and high-grade); this was more a result of the 46% lower AA incidence (rates) of better-prognosis (ER+, lower-grade) cancer than of the 32% greater incidence of poor-prognosis disease (p-values <0.01). When compared to the general SC population, racial disparities in poor prognostic features within the BCN population were attenuated; this was due to more frequent adverse tumor features in EAs rather than improvements for AAs.
Among low-income women in SC, closing the breast cancer racial and income mortality gaps will require improved early diagnosis, addressing causes of racial differences in tumor biology, and improved care for cancers of poor-prognosis biology.
Breast cancer; Health disparities; Racial disparities; Low-income population; Cancer screening; Rates and proportions
In recent decades, extensive resources have been invested to develop cellular, molecular and genomic technologies with clinical applications that span the continuum of cancer care.
In December 2006, the National Cancer Institute sponsored the first workshop to uniquely examine the state of health services research on cancer-related cellular, molecular and genomic technologies and identify challenges and priorities for expanding the evidence base on their effectiveness in routine care.
This article summarizes the workshop outcomes, which included development of a comprehensive research agenda that incorporates health and safety endpoints, utilization patterns, patient and provider preferences, quality of care and access, disparities, economics and decision modeling, trends in cancer outcomes, and health-related quality of life among target populations.
Ultimately, the successful adoption of useful technologies will depend on understanding and influencing the patient, provider, health care system and societal factors that contribute to their uptake and effectiveness in ‘real-world’ settings.
Genomics; Health services research; Emerging technologies; Translational research
Drug- and device-associated hypersensitivity reactions are serious toxicities that can result in respiratory failure or acute cardiac ischemic events, or even severe hypersensitivity syndromes such as Stevens–Johnson syndrome. These toxicities are usually poorly described in the “black box” warnings section of the product labels.
Adverse event reports contained in databases maintained by the Project on Medical Research on Adverse Drug Events and Reports (Med-RADAR), product labels, safety advisories disseminated by pharmaceutical manufacturers, the Food and Drug Administration (FDA), and the Centers for Disease Control and Prevention (CDC) were reviewed.
Adverse event reports identified three health care workers who developed nevirapine-associated Stevens–Johnson syndrome following occupational exposure to HIV-infected blood or blood products; four persons with localized hypersensitivity and fatal cardiac events associated with rapamycin- or paclitaxel-coated coronary artery stent placements; and six persons with breast cancer who developed severe or fatal anaphylaxis after receiving adjuvant chemotherapy with Cremophor EL–containing paclitaxel. Safety advisories from the FDA, CDC, and the relevant pharmaceutical manufacturers were ambiguous in their “black box” warning descriptions of these serious and potentially fatal toxicities.
Improvements are needed in pharmacovigilance and subsequent dissemination of safety advisories for drug/device-associated hypersensitivity reactions.
adverse events; hypersensitivity; toxicity; drug
The authors explain why physicians should refrain from ordering MRIs for patients with renal dysfunction unless the test is essential to provide diagnostic information. A possibly class-wide toxicity from the contrast agent gadolinium has been reported.
The evaluation of research output, such as estimation of the proportion of treatment successes, is of ethical, scientific, and public importance but has rarely been evaluated systematically. We assessed how often experimental cancer treatments that undergo testing in randomized clinical trials (RCTs) result in discovery of successful new interventions.
We extracted data from all completed (published and unpublished) phase 3 RCTs conducted by the National Cancer Institute cooperative groups since their inception in 1955. Therapeutic successes were determined by (1) assessing the proportion of statistically significant trials favoring new or standard treatments, (2) determining the proportion of the trials in which new treatments were considered superior to standard treatments according to the original researchers, and (3) quantitatively synthesizing data for main clinical outcomes (overall and event-free survival).
Data from 624 trials (781 randomized comparisons) involving 216,451 patients were analyzed. In all, 30% of trials had statistically significant results, of which new interventions were superior to established treatments in 80% of trials. The original researchers judged that the risk-benefit profile favored new treatments in 41% of comparisons (316 of 766). Hazard ratios for overall and event-free survival, available for 614 comparisons, were 0.95 (99% confidence interval [CI], 0.93-0.98) and 0.90 (99% CI, 0.87-0.93), respectively, slightly favoring new treatments. Breakthrough interventions were discovered in 15% of trials.
Approximately 25% to 50% of new cancer treatments that reach the stage of assessment in RCTs will prove successful. The pattern of successes has become more stable over time. The results are consistent with the hypothesis that the ethical principle of equipoise defines limits of discoverability in clinical research and ultimately drives therapeutic advances in clinical medicine.
Background and objective
The antiretroviral nevirapine can cause severe hepatotoxicity when used ‘off-label’ for preventing mother-to-child HIV transmission (PMTCT), newborn post-exposure prophylaxis and for pre- and post-exposure prophylaxis among non-HIV-infected individuals. We describe the incidence of hepatotoxicity with short- versus long-course nevirapine-containing regimens in these groups.
We reviewed hepatotoxicity cases among non-HIV-infected individuals and HIV-infected pregnant women and their offspring receiving short- (≤4 days) versus long-course (≥5 days) nevirapine prophylaxis. Sources included adverse event reports from pharmaceutical manufacturers and the US FDA, reports from peer-reviewed journals/scientific meetings and the Research on Adverse Drug events And Reports (RADAR) project. Hepatotoxicity was scored using the AIDS Clinical Trial Group criteria.
Toxicity data for 8216 patients treated with nevirapine-containing regimens were reviewed. Among 402 non-HIV-infected individuals receiving short- (n = 251) or long-course (n = 151) nevirapine, rates of grade 1–2 hepatotoxicity were 1.99% versus 5.30%, respectively, and rates of grade 3–4 hepatotoxicity were 0.00% versus 13.25%, respectively (p < 0.001 for both comparisons). Among 4740 HIV-infected pregnant women receiving short- (n = 3031) versus long-course (n = 1709) nevirapine, rates of grade 1–2 hepatotoxicity were 0.62% versus 7.04%, respectively, and rates of grade 3–4 hepatotoxicity were 0.23% versus 4.39%, respectively (p < 0.001 for both comparisons). The rates of grade 3–4 hepatotoxicity among 3074 neonates of nevirapine-exposed HIV-infected pregnant women were 0.8% for those receiving short-course (n = 2801) versus 1.1% for those receiving long-course (n = 273) therapy (p < 0.72).
Therapy duration appears to significantly predict nevirapine hepatotoxicity. Short-course nevirapine for HIV prophylaxis is associated with fewer hepatotoxic reactions in non-HIV-infected individuals and in HIV-infected pregnant women and their offspring, whereas administration of prophylactic nevirapine for ≥2 weeks appears to be associated with high rates of hepatotoxicity among non-HIV-infected individuals and HIV-infected pregnant women. When full highly active antiretroviral therapy (HAART) regimens are not available, single-dose nevirapine plus a short course of nucleoside reverse transcriptase inhibitors (to decrease the development of HIV viral resistance) remains an essential therapeutic option for PMTCT, and these data support the safety of single-dose nevirapine in this setting.