To compare the incidence rates of serious cardiovascular events in adult initiators of amphetamines or atomoxetine to rates in non-users.
This was a retrospective cohort study of new users of amphetamines (n = 38,586) or atomoxetine (n = 20,995). Each medication user was matched to up to four non-users on age, gender, data source, and state (n = 238,183). The primary outcomes of interest were: 1) sudden death or ventricular arrhythmia, 2) stroke, 3) myocardial infarction, and 4) a composite endpoint of stroke or myocardial infarction. Cox proportional hazards regression was used to calculate propensity-adjusted hazard ratios for amphetamines versus matched non-users and for atomoxetine versus matched non-users, with intracluster dependence within matched sets accounted for using a robust sandwich estimator.
The propensity-score adjusted hazard ratio for amphetamines use versus non-use was 1.18 (95% CI: 0.55–2.54) for sudden death/ventricular arrhythmia, 0.80 (95% CI: 0.44–1.47) for stroke, 0.75 (95% CI: 0.42–1.35) for myocardial infarction, and 0.78 (95% CI: 0.51–1.19) for stroke/myocardial infarction. The propensity-score adjusted hazard ratio for atomoxetine use versus non-use was 0.41 (95% CI: 0.10–1.75) for sudden death/ventricular arrhythmia, 1.30 (95% CI: 0.52–3.29) for stroke, 0.56 (95% CI: 0.16–2.00) for myocardial infarction, and 0.92 (95% CI: 0.44–1.92) for stroke/myocardial infarction.
Initiation of amphetamines or atomoxetine was not associated with an elevated risk of serious cardiovascular events. However, some of the confidence intervals do not exclude modest elevated risks, e.g. for sudden death/ventricular arrhythmia.
In the setting of acute decompensated heart failure, worsening renal function (WRF) and improved renal function (IRF) have been associated with similar hemodynamic derangements and poor prognosis. Our aim was to further characterize IRF and its associated mortality risk.
Methods and Results
Consecutive patients with a discharge diagnosis of congestive heart failure at the Hospital of the University of Pennsylvania were reviewed. IRF was defined as a ≥20% improvement and WRF as a ≥20% deterioration in glomerular filtration rate. Overall, 903 patients met eligibility criteria, 31.4% of whom experienced IRF. Baseline venous congestion/right-sided cardiac dysfunction was more common (p≤0.04) and volume of diuresis was greater (p=0.003) in patients with IRF. IRF was associated with a greater incidence of pre-admission (OR=4.2, 95% CI 2.6–6.7, p<0.0001) and post-discharge (OR=1.8, 95% CI 1.2–2.7, p=0.006) WRF. IRF was associated with increased mortality (adjusted HR=1.3, 95% CI 1.1–1.7, p=0.011), a finding largely restricted to patients with post-discharge recurrence of renal dysfunction (p interaction=0.038).
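The ≥20% threshold definitions above can be sketched as a small classification function; the function name and the "stable" label are illustrative, not terminology from the paper:

```python
def classify_renal_change(baseline_gfr, followup_gfr, threshold=0.20):
    """Classify renal-function change by relative change in GFR.

    Per the abstract's definitions: improved renal function (IRF) is a
    >=20% rise in GFR, worsening renal function (WRF) a >=20% fall.
    The 'stable' label for everything else is an assumption for
    illustration.
    """
    change = (followup_gfr - baseline_gfr) / baseline_gfr
    if change >= threshold:
        return "IRF"
    if change <= -threshold:
        return "WRF"
    return "stable"

# e.g. a rise from 40 to 52 mL/min (+30%) classifies as IRF,
# a fall from 60 to 45 mL/min (-25%) as WRF.
```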
IRF is associated with significantly worsened survival and may represent the resolution of venous congestion-induced pre-admission WRF. Unlike WRF, the renal dysfunction in IRF patients occurs independently of the confounding effects of acute decongestion and may provide incremental information for the study of cardio-renal interactions.
Cardio-renal syndrome; Worsening renal function; Venous congestion
Renal neurohormonal activation leading to a reduction in glomerular filtration rate (GFR) has been suggested as a mechanism for renal insufficiency (RI) in the setting of heart failure. We hypothesized that RI occurring in the presence of renal neurohormonal activation may be prognostically more important than RI in the absence of renal neurohormonal activation.
Methods and results
Subjects in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial (n = 429), Beta-Blocker Evaluation of Survival Trial (BEST) (n = 2691), and Studies Of Left Ventricular Dysfunction (SOLVD) trial (n = 6782) limited datasets were studied. The blood urea nitrogen to creatinine ratio (BUN/Creatinine) was employed as a surrogate for renal neurohormonal activation and the primary outcome was the interaction between BUN/Creatinine and RI associated mortality. Baseline RI (GFR < 60 mL/min/1.73 m²) was associated with mortality in all study populations (P < 0.001). In patients with higher BUN/Creatinine, the risk of mortality was consistently greater in patients with RI [adjusted hazard ratio (HR) ESCAPE = 2.8, 95% confidence interval (CI) 1.3–14.3, P = 0.019; BEST = 1.6, 95% CI 1.2–2.2, P = 0.002; SOLVD = 1.6, 95% CI 1.3–2.0, P = 0.001]. However, in patients with lower BUN/Creatinine, the risk of mortality was not elevated in patients with RI (adjusted HR ESCAPE = 0.94, 95% CI 0.35–2.4, P = 0.90, P interaction = 0.005; BEST = 0.97, 95% CI 0.64–1.4, P = 0.90, P interaction = 0.02; SOLVD = 1.0, 95% CI 0.8–1.3, P = 0.71, P interaction = 0.005).
The association between RI and poor survival observed in heart failure populations appears to be contingent not simply on the presence of a reduced GFR, but possibly on the mechanism by which GFR is reduced.
Cardio-renal syndrome; Heart Failure; Chronic kidney disease; Neurohormonal activation; Mortality
Worsening renal function (WRF) in the setting of heart failure has been associated with increased mortality. However, it is unclear whether this decreased survival is a direct result of the reduction in glomerular filtration rate (GFR) or whether the mechanism underlying the deterioration in GFR drives prognosis. Given that WRF in the setting of angiotensin-converting enzyme inhibitor (ACE-I) initiation is likely mechanistically distinct from spontaneously occurring WRF, we sought to compare mortality associated with early WRF in subjects randomized to ACE-I or placebo.
Methods and Results
Subjects in the Studies Of Left Ventricular Dysfunction limited data set were studied (6,377 patients). The interaction between early WRF (decrease in estimated GFR ≥20% at 14 days), randomization to enalapril, and mortality was the primary endpoint. In the overall population, early WRF was associated with increased mortality (adjusted HR=1.2, 95% CI 1.0–1.4, p=0.037). When the analysis was restricted to the placebo group, this association strengthened (adjusted HR=1.4, 95% CI 1.1–1.8, p=0.004). However, in the enalapril group, early WRF had no adverse prognostic significance (adjusted HR=1.0, 95% CI 0.8–1.3, p=1.0, p interaction=0.09). In patients who continued study drug despite early WRF, a survival advantage remained with enalapril therapy (adjusted HR=0.66, 95% CI 0.5–0.9, p=0.018).
These data support the notion that the mechanism underlying WRF is important in determining its prognostic significance. Specifically, early WRF in the setting of ACE-I initiation appears to represent a benign event which is not associated with a loss of benefit from continued ACE-I therapy.
cardio-renal syndrome; worsening renal function; kidney; ACE inhibitor
To examine the association between exposure to antidepressants and emergency department or inpatient admission for sudden cardiac death and ventricular arrhythmia (SD/VA), and to examine the impact of dose and cytochrome P-450 inhibition.
A cohort study was conducted within 1999–2003 Medicaid claims data from beneficiaries of five large states, supplemented with Medicare claims for dually-eligible individuals. Exposures were prescription claims for antidepressants of interest or a reference antidepressant. Outcomes were incident first-listed emergency department or principal inpatient diagnoses indicative of SD/VA originating in the outpatient setting, an outcome previously found to have a positive predictive value of 85%.
In 1.3 million person-years of antidepressant exposure, we identified 4,222 SD/VA outcomes for a rate of 3.3/1,000 person-years (95% CI, 3.2 – 3.4). Compared to paroxetine (a referent with a putatively favorable cardiovascular risk profile), adjusted hazard ratios (HRs) were 0.80 (0.67 – 0.95) for bupropion, 1.24 (0.93 – 1.65) for doxepin, 0.79 (0.55 – 1.15) for lithium, and 1.26 (1.11 – 1.42) for mirtazapine. HRs for amitriptyline, citalopram, fluoxetine, nefazodone, nortriptyline, sertraline, trazodone, and venlafaxine were near unity. For antidepressants having non-null risks (bupropion and mirtazapine), we observed no relationship with antidepressant dose and some relationships with concomitant cytochrome P-450 inhibition.
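The headline rate can be reproduced from the counts above. A minimal sketch, using a normal-approximation Poisson confidence interval (the published CI may come from a different method, and "1.3 million" person-years is a rounded figure, so the interval will not match exactly):

```python
import math

def incidence_rate_ci(events, person_years, z=1.96):
    """Incidence rate per 1,000 person-years with a normal-approximation
    Poisson confidence interval: rate +/- z*sqrt(events)/person_years.
    A sketch only; exact methods give slightly different bounds."""
    rate = events / person_years
    se = math.sqrt(events) / person_years
    scale = 1000.0
    return rate * scale, (rate - z * se) * scale, (rate + z * se) * scale

# With the abstract's (rounded) figures: 4,222 events in ~1.3 million
# person-years gives roughly 3.2 per 1,000 person-years.
rate, lo, hi = incidence_rate_ci(4222, 1.3e6)
```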
Of antidepressants studied, only mirtazapine had a statistically significantly greater SD/VA risk versus paroxetine. However, baseline differences between these users suggest that this finding may be attributable to residual confounding. Eleven other antidepressants had SD/VA risks no greater than that of paroxetine, thereby providing reassurance regarding the comparative cardiovascular safety of antidepressants.
Antidepressive Agents; Death, Sudden, Cardiac; Arrhythmias, Cardiac; Medicaid; Medicare; Centers for Medicare and Medicaid Services; Cohort Studies; Pharmacoepidemiology
Increasing epidemiological evidence suggests independent associations between psoriasis and cardiovascular and metabolic disease. Our objective was to test the hypothesis that directly-assessed psoriasis severity relates to the prevalence of metabolic syndrome and its components.
Population-based, cross-sectional study using computerized medical records from The Health Improvement Network. The study population included individuals aged 45–65 years with psoriasis and practice-matched controls. Psoriasis diagnosis and extent were determined using provider-based questionnaires. Metabolic syndrome was defined using National Cholesterol Education Program (NCEP) Adult Treatment Panel (ATP) III criteria.
44,715 individuals were included: 4,065 with psoriasis and 40,650 controls. 2,044 participants had mild psoriasis (≤2% body surface area (BSA)), 1,377 had moderate (3-10% BSA), and 475 had severe psoriasis (>10% BSA). Psoriasis was associated with metabolic syndrome, adjusted odds ratio (OR) 1.41 (95% CI 1.31-1.51), varying in a “dose-response” manner, from mild (adj. OR 1.22, 95% CI 1.11-1.35) to severe psoriasis (adj. OR 1.98, 95% CI 1.62-2.43).
Psoriasis is associated with metabolic syndrome, and the association increases with increasing disease severity. Furthermore, associations with obesity, hypertriglyceridemia, and hyperglycemia increase with increasing disease severity independent of other metabolic syndrome components. These findings suggest that screening for metabolic disease should be considered for patients with psoriasis, especially when psoriasis is extensive.
One of the primary determinants of blood flow in regional vascular beds is perfusion pressure. Our aim was to investigate if reduction in blood pressure during the treatment of decompensated heart failure would be associated with worsening renal function (WRF). Our secondary aim was to evaluate the prognostic significance of this potentially treatment-induced form of WRF.
Methods and results
Subjects included in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial limited data set were studied (386 patients). Reduction in systolic blood pressure (SBP) was greater in patients experiencing WRF (−10.3 ± 18.5 vs. −2.8 ± 16.0 mmHg, P < 0.001), with larger reductions associated with greater odds for WRF (odds ratio = 1.3 per 10 mmHg reduction, P < 0.001). Systolic blood pressure reduction (relative change > median) was associated with greater doses of in-hospital oral vasodilators (P ≤ 0.017), thiazide diuretic use (P = 0.035), and greater weight reduction (P = 0.023). In patients with SBP-reduction, WRF was not associated with worsened survival [adjusted hazard ratio (HR) = 0.76, P = 0.58]. However, in patients without SBP-reduction, WRF was strongly associated with increased mortality (adjusted HR = 5.3, P < 0.001, P interaction = 0.001).
During the treatment of decompensated heart failure, significant blood pressure reduction is strongly associated with WRF. However, WRF that occurs in the setting of SBP-reduction is not associated with an adverse prognosis, whereas WRF in the absence of this provocation is strongly associated with increased mortality. These data suggest that WRF may represent the final common pathway of several mechanistically distinct processes, each with potentially different prognostic implications.
Cardio-renal syndrome; Worsening renal function; Kidney; Decompensated heart failure; Blood pressure
The objective of this study was to compare the rate of severe cardiovascular events and death in children who use attention-deficit/hyperactivity disorder (ADHD) medications versus nonusers.
PATIENTS AND METHODS:
We performed a large cohort study using data from 2 administrative databases. All children aged 3 to 17 years with a prescription for an amphetamine, atomoxetine, or methylphenidate were included and matched with up to 4 nonusers on the basis of data source, gender, state, and age. Cardiovascular events were validated using medical records. Proportional hazards regression was used to calculate hazard ratios.
We identified 241,417 incident users (primary cohort). No statistically significant difference between incident users and nonusers was observed in the rate of validated sudden death or ventricular arrhythmia (hazard ratio: 1.60 [95% confidence interval (CI): 0.19–13.60]) or all-cause death (hazard ratio: 0.76 [95% CI: 0.52–1.12]). None of the strokes identified during exposed time to ADHD medications were validated. No myocardial infarctions were identified in ADHD medication users. No statistically significant difference between prevalent users and nonusers (secondary cohort) was observed (hazard ratios for validated sudden death or ventricular arrhythmia: 1.43 [95% CI: 0.31–6.61]; stroke: 0.89 [95% CI: 0.11–7.11]; stroke/myocardial infarction: 0.72 [95% CI: 0.09–5.57]; and all-cause death: 0.77 [95% CI: 0.56–1.07]).
The rate of cardiovascular events in exposed children was very low and in general no higher than that in unexposed control subjects. Because of the low number of events, we have limited ability to rule out relative increases in rate.
children; adolescents; amphetamines; atomoxetine; methylphenidate; cardiovascular; death
Warfarin therapy has been used clinically for over 60 years, yet continues to be problematic because of its narrow therapeutic index and large inter-individual variability in patient response. As a result, warfarin is a leading cause of serious medication-related adverse events, and its efficacy is also suboptimal.
To review factors that are responsible for variable response to warfarin, including clinical, environmental, and genetic factors, and to explore some possible approaches to improving warfarin therapy.
Recent efforts have focused on developing dosing algorithms that include genetic information to improve warfarin dosing. These dosing algorithms hold promise, but have not been fully validated or tested in rigorous clinical trials. Perhaps equally importantly, adherence to warfarin is a major problem that should be addressed with innovative and cost-effective interventions.
Additional research is needed to further test whether interventions can be used to improve warfarin dosing and outcomes.
adherence; pharmacogenetics; prediction; warfarin
Worsening renal function (WRF) commonly complicates the treatment of acute decompensated heart failure. Despite considerable investigation in this area, it remains unclear to what degree WRF is a reflection of treatment versus patient-related factors. We hypothesized that if WRF is significantly influenced by factors intrinsic to the patient, then WRF during an index hospitalization should predict WRF during a subsequent hospitalization.
Consecutive admissions to the Hospital of the University of Pennsylvania with a discharge diagnosis of congestive heart failure were reviewed. Patients with >1 hospitalization were retained for analysis.
In total 181 hospitalization pairs met the inclusion criteria. Baseline patient characteristics demonstrated significant correlation between hospitalizations (p≤0.002 for all) but minimal association with WRF. In contrast, variables related to the aggressiveness of diuresis were weakly correlated between hospitalizations but significantly associated with WRF (p≤0.024 for all). Consistent with the primary hypothesis, WRF during the index hospitalization was strongly associated with WRF during subsequent hospitalization (OR=2.7, p=0.003). This association was minimally altered after controlling for traditional baseline characteristics (OR=2.5, p=0.006) and in-hospital treatment related parameters (OR=2.8, p=0.005).
A prior history of WRF is strongly associated with subsequent episodes of WRF, independent of in-hospital treatment received. These results suggest that baseline factors intrinsic to the patient’s cardiorenal pathophysiology have substantial influence on the subsequent development of WRF.
By guiding initial warfarin dose, pharmacogenetic (PGx) algorithms may improve the safety of warfarin initiation. However, once INR response is known, the contribution of PGx to dose refinements is uncertain. This study sought to develop and validate clinical and PGx dosing algorithms for warfarin dose refinement on days 6–11 after therapy initiation.
Materials and Methods
An international sample of 2,022 patients at 13 medical centers on 3 continents provided clinical, INR, and genetic data at treatment days 6–11 to predict therapeutic warfarin dose. Independent derivation and retrospective validation samples were composed by randomly dividing the population (80%/20%). Prior warfarin doses were weighted by their expected effect on S-warfarin concentrations using an exponential-decay pharmacokinetic model. The INR divided by that “effective” dose constituted a treatment response index.
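The "effective dose" construction above can be sketched in a few lines. This is a hedged illustration, not the trial's actual model: the ~33-hour S-warfarin half-life and the weight normalization are assumptions, and the study's exact pharmacokinetic weighting may differ.

```python
import math

def effective_dose(doses_mg, hours_since_dose, half_life_h=33.0):
    """Weight prior warfarin doses by their expected remaining contribution
    to S-warfarin concentration under simple first-order exponential decay.

    half_life_h (~33 h for S-warfarin) and the normalisation to a weighted
    average are illustrative assumptions, not parameters from the paper.
    """
    k = math.log(2) / half_life_h
    weights = [math.exp(-k * t) for t in hours_since_dose]
    return sum(d * w for d, w in zip(doses_mg, weights)) / sum(weights)

def treatment_response_index(inr, doses_mg, hours_since_dose):
    """The INR divided by the 'effective' dose, as described in the abstract."""
    return inr / effective_dose(doses_mg, hours_since_dose)
```

With equal prior doses the effective dose reduces to that dose, and a more recent dose carries more weight than an older one, which is the behavior the exponential-decay weighting is meant to capture.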
Treatment response index, age, amiodarone, body surface area, warfarin indication, and target INR were associated with dose in the derivation sample. A clinical algorithm based on these factors was remarkably accurate: in the retrospective validation cohort its R2 was 61.2% and its median absolute error (MAE) was 5.0 mg/week. Accuracy and safety were confirmed in a prospective cohort (N=43). CYP2C9 variants and VKORC1-1639 G→A were significant dose predictors in both the derivation and validation samples. In the retrospective validation cohort, the PGx algorithm had an R2 of 69.1% (P<0.05 vs. the clinical algorithm) and an MAE of 4.7 mg/week.
A pharmacogenetic warfarin dose-refinement algorithm based on clinical, INR, and genetic factors can explain at least 69.1% of therapeutic warfarin dose variability after about one week of therapy.
warfarin; VKORC1; CYP2C9; pharmacogenetic
Rofecoxib has been proposed to increase the risk of myocardial infarction (MI) through suppression of cyclooxygenase (COX)-2 mediated prostacyclin. Estrogen may have protective effects through augmenting COX-2 expression and subsequently increasing prostacyclin. Estrogen may attenuate the association between rofecoxib and MI. We used 1999–2002 Medicaid claims data to measure the MI-hazard ratio (HR) attributed to rofecoxib exposure in estrogen exposed and unexposed 45–65 year old women. We identified 184,169 female rofecoxib users who contributed 309,504 person-years and experienced 1217 first MIs. Estrogen exposure appeared protective (MI-HR 0.72, 95% CI 0.62–0.84) in this cohort. Rofecoxib was associated with an elevated MI-HR in both estrogen exposed (2.01, 95% CI 1.60–2.54) and estrogen unexposed women (1.69, 95% CI 1.43–1.99). The rofecoxib-estrogen interaction ratio was not significantly different from 1 (1.19, 95% CI 0.91–1.57). Although estrogen use was associated with a lower risk of MI, it did not appear to attenuate the association between rofecoxib and MI.
myocardial infarction; hormone replacement therapy; pharmacoepidemiology; cyclooxygenase inhibitor
Worsening renal function (RF) and improved RF during the treatment of decompensated heart failure have traditionally been thought of as hemodynamically distinct events. We hypothesized that if pulmonary artery catheter-derived measures are relevant in the evaluation of cardiorenal interactions, comparison of patients with improved vs. worsening RF should highlight any important hemodynamic differences. All subjects in the ESCAPE trial limited data set with admission and discharge creatinine values available were included (401 patients). There were no differences in baseline, final, or change in pulmonary artery catheter-derived hemodynamic variables, inotrope and intravenous vasodilator use, or survival between patients with improved and worsening RF (p=NS for all). Both groups were equally likely to be in the bottom quartile of cardiac index (CI) (p=0.32), have a 25% improvement in CI (p=0.97), or have any worsening in CI (p=0.90). When patients with any significant change in renal function (positive or negative) were compared to patients with stable renal function, strong associations with variables such as reduced CI (OR=2.2, p=0.02), increased intravenous inotrope and vasodilator use (OR=2.9, p<0.001), and worsened all-cause mortality (HR=1.8, p=0.01) became apparent. Contrary to traditionally held views, patients with improved RF and worsening RF have similar hemodynamic parameters and outcomes. Combining these groups identifies a hemodynamically compromised population with significantly worse survival than patients with stable renal function. In conclusion, changes in renal function, regardless of direction, likely identify a population with an advanced disease state and poor prognosis.
Cardio-renal syndrome; worsening renal function; improved renal function; acute heart failure; kidney
Overly aggressive diuresis leading to intravascular volume depletion has been proposed as a cause for worsening renal function (WRF) during the treatment of decompensated heart failure. If diuresis occurs at a rate greater than extravascular fluid can refill the intravascular space, intravascular substances such as hemoglobin and plasma proteins increase in concentration. We hypothesized that hemoconcentration would be associated with WRF and possibly provide insight into the relationship between aggressive decongestion and outcomes.
Methods and Results
Subjects in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness trial limited data set with a baseline/discharge pair of hematocrit, albumin, or total protein values were included (336 patients). Baseline to discharge increases in these parameters were evaluated and patients with ≥2 in the top tertile were considered to have evidence of hemoconcentration. The group experiencing hemoconcentration received higher doses of loop diuretics, lost more weight/fluid, and had greater reductions in filling pressures (p<0.05 for all). Hemoconcentration was strongly associated with WRF (OR=5.3, p<0.001) whereas change in right atrial pressure (p=0.36) and change in pulmonary capillary wedge pressure (p=0.53) were not. Patients with hemoconcentration had significantly lower 180 day mortality (HR=0.31, p=0.013). This relationship persisted after adjustment for baseline characteristics (HR=0.16, p=0.001).
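The "≥2 of 3 parameters in the top tertile" rule above lends itself to a short sketch. The data layout (a mapping from parameter name to one change value per patient) and the tertile method are illustrative assumptions:

```python
from statistics import quantiles

def hemoconcentration_flags(changes_by_param, min_params=2):
    """Flag patients with evidence of hemoconcentration.

    changes_by_param maps a parameter name (e.g. hematocrit, albumin,
    total protein) to a list of baseline-to-discharge changes, one entry
    per patient. Per the abstract, a patient with >=2 parameters in the
    top tertile of change is flagged. Data layout is an assumption.
    """
    n_patients = len(next(iter(changes_by_param.values())))
    counts = [0] * n_patients
    for values in changes_by_param.values():
        upper_cut = quantiles(values, n=3)[1]  # lower bound of top tertile
        for i, v in enumerate(values):
            if v > upper_cut:
                counts[i] += 1
    return [c >= min_params for c in counts]
```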
Hemoconcentration is significantly associated with measures of aggressive fluid removal and deterioration in renal function. Despite this relationship, hemoconcentration is associated with substantially improved survival. These observations raise the question whether aggressive decongestion, even in the setting of WRF, can positively impact survival.
Diuretics; Heart failure; Kidney
Idiopathic pulmonary fibrosis (IPF) has not been shown to respond to corticosteroid therapy; however, many patients receive these drugs at the time of diagnosis. The factors that are associated with the decision to prescribe corticosteroids have not been examined.
We conducted a retrospective cohort study of 1126 patients with a new diagnosis of IPF using The Health Improvement Network database from the United Kingdom. We used generalized estimating equation (GEE) regression models to test the association of patient characteristics, co-morbid diseases, and disease characteristics with the use of corticosteroids within 30 days of IPF diagnosis.
Bivariable analyses demonstrated an association between female sex, the presence of dyspnea, the need for oxygen, past steroid use, and the use of corticosteroids immediately prior to diagnosis with the use of corticosteroids at the time of diagnosis. After adjustment with multivariable GEE regression, only the use of oxygen at the time of diagnosis (OR 1.69, CI 1.14-2.49), the past use of corticosteroids (OR 1.50, CI 1.04-2.15), and use of corticosteroids immediately prior to diagnosis (OR 5.72, CI 3.80-8.60) remained significantly associated with increased use of corticosteroids. No association was found between prior diabetes, osteoporosis, glaucoma, hypertension, congestive heart failure, obesity or peptic ulcer disease and use of corticosteroids at diagnosis.
The decision to prescribe corticosteroids is associated with oxygen use and past corticosteroid use but is not influenced by factors such as age, gender, or common co-morbid conditions that may pre-dispose patients to adverse events of therapy.
Idiopathic pulmonary fibrosis; prescribing; drug therapy; corticosteroids; adverse events
Patients receiving hemodialysis have high rates of cardiovascular morbidity and mortality that may be related to the hemodynamic effects of rapid ultrafiltration. Here we tested whether higher dialytic ultrafiltration rates are associated with greater all-cause and cardiovascular mortality, and hospitalization for cardiovascular disease. We used data from the Hemodialysis Study, an almost-7-year randomized clinical trial of 1846 patients receiving thrice-weekly chronic dialysis. The ultrafiltration rates were divided into three categories: up to 10 ml/h/kg, 10–13 ml/h/kg, and over 13 ml/h/kg. Compared to ultrafiltration rates in the lowest group, rates in the highest were significantly associated with increased all-cause and cardiovascular-related mortality with adjusted hazard ratios of 1.59 and 1.71, respectively. Overall, ultrafiltration rates between 10–13 ml/h/kg were not associated with all-cause or cardiovascular mortality; however, they were significantly associated among participants with congestive heart failure. Cubic spline interpolation suggested that the risk of all-cause and cardiovascular mortality began to increase at ultrafiltration rates over 10 ml/h/kg regardless of the status of congestive heart failure. Hence, higher ultrafiltration rates in hemodialysis patients are associated with a greater risk of all-cause and cardiovascular death.
cardiovascular death; hemodialysis; mortality; ultrafiltration
The Clarification of Optimal Anticoagulation through Genetics (COAG) trial is a large, multicenter, double-blinded, randomized trial to determine whether use of a genotype-guided dosing algorithm (using clinical and genetic information) to initiate warfarin treatment will improve anticoagulation status when compared to a dosing algorithm using only clinical information.
This article describes prospective alpha allocation and balanced alpha allocation for the design of the COAG trial.
The trial involves two possibly heterogeneous populations, which can be distinguished by the difference in warfarin dose as predicted by the two algorithms. A statistical approach is detailed which allows an overall comparison as well as a comparison of the primary endpoint in the subgroup for which sufficiently different doses are predicted by the two algorithms. Two methods of allocating alpha for these analyses are given: a prospective alpha allocation, and an allocation of alpha such that the two analyses have equal power, which we call a "balanced alpha allocation."
We show how to include an analysis of the primary endpoint in a subgroup as a co-primary analysis. Power can be improved by incorporating the correlation between the overall and subgroup analyses in a prospective alpha allocation approach. Balanced alpha allocation for the full cohort and subgroup tests to achieve the same desired power for both of the primary analyses is discussed in detail.
In the COAG trial, it is impractical to stratify the randomization on subgroup membership because genetic information may not be available at the time of randomization. If imbalances in the treatment arms in the subgroup are found, they will need to be addressed.
The design of the COAG trial assures that the subgroup in which the largest treatment difference is expected is elevated to a co-primary analysis. Incorporating the correlation between the full cohort and the subgroup analyses provides an improvement in power for the subgroup comparison, and further improvement may be achieved via a balanced alpha allocation approach when the parameters involved in the sample size calculation are reasonably well estimated.
The aim of this study was to compare the accuracy of genetic tables and formal pharmacogenetic algorithms for warfarin dosing.
Pharmacogenetic algorithms based on regression equations can predict warfarin dose, but they require detailed mathematical calculations. A simpler alternative, recently added to the warfarin label by the U.S. Food and Drug Administration, is to use genotype-stratified tables to estimate warfarin dose. This table may increase the use of pharmacogenetic warfarin dosing in clinical practice; however, its accuracy has not been quantified.
A retrospective cohort study of 1,378 patients from 3 anticoagulation centers was conducted. Inclusion criteria were stable therapeutic warfarin dose and complete genetic and clinical data. Five dose prediction methods were compared: 2 methods using only clinical information (empiric 5 mg/day dosing and a formal clinical algorithm), 2 genetic tables (the new warfarin label table and a table based on mean dose stratified by genotype), and 1 formal pharmacogenetic algorithm, using both clinical and genetic information. For each method, the proportion of patients whose predicted doses were within 20% of their actual therapeutic doses was determined. Dosing methods were compared using McNemar’s chi-square test.
Warfarin dose prediction was significantly more accurate (all p < 0.001) with the pharmacogenetic algorithm (52%) than with all other methods: empiric dosing (37%; odds ratio [OR]: 2.2), clinical algorithm (39%; OR: 2.2), warfarin label (43%; OR: 1.8), and genotype mean dose table (44%; OR: 1.9).
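The comparison above rests on two pieces: a within-20% accuracy metric and a paired McNemar test. A minimal sketch follows; the abstract reports McNemar's chi-square test, whereas the exact binomial form implemented here is a close stand-in, and all example data are hypothetical:

```python
import math

def within_20_percent(predicted, actual):
    """Fraction of patients whose predicted dose falls within 20% of the
    actual therapeutic dose (the accuracy metric used in the abstract),
    plus the per-patient hit indicators."""
    hits = [abs(p - a) <= 0.20 * a for p, a in zip(predicted, actual)]
    return sum(hits) / len(hits), hits

def mcnemar_exact(hits_a, hits_b):
    """Exact (binomial) McNemar test on paired accuracy indicators.

    Counts discordant pairs (method A right where B is wrong, and vice
    versa) and computes a two-sided binomial p-value under p = 0.5.
    """
    b = sum(1 for x, y in zip(hits_a, hits_b) if x and not y)
    c = sum(1 for x, y in zip(hits_a, hits_b) if y and not x)
    n = b + c
    if n == 0:
        return 1.0
    tail = sum(math.comb(n, k) for k in range(min(b, c) + 1)) * 0.5 ** n
    return min(1.0, 2 * tail)
```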
Although genetic tables predicted warfarin dose better than empiric dosing, formal pharmacogenetic algorithms were the most accurate.
coumarins; dose prediction; dosing algorithms; FDA label; genetic tables; pharmacogenetics; warfarin
Sudden cardiac death (SD) and ventricular arrhythmias (VAs) caused by medications have arisen as an important public health concern in recent years. The validity of diagnostic codes in identifying SD/VA events originating in the ambulatory setting is not well known. This study examined the positive predictive value (PPV) of hospitalization and emergency department encounter diagnoses in identifying SD/VA events originating in the outpatient setting.
We selected random samples of hospitalizations and emergency department claims with principal or first-listed discharge diagnosis codes indicative of SD/VA in individuals contributing at least 6 months of baseline time within 1999–2002 Medicaid and Medicare data from five large states. We then obtained and reviewed medical records corresponding to these events to serve as the reference standard.
We identified 5,239 inpatient and 29,135 emergency department events, randomly selected 100 of each, and obtained 119 medical records, 116 of which were for the requested courses of care. The PPVs for an outpatient-originating SD/VA precipitating hospitalization or emergency department treatment were 85.3% (95% confidence interval [CI]=77.6–91.2) overall, 79.7% (95%CI=68.3–88.4) for hospitalization claims, and 93.6% (95%CI=82.5–98.7) for emergency department claims.
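A PPV with its confidence interval is just a binomial proportion. The sketch below uses the Wilson score interval; the published interval was likely computed with an exact (Clopper-Pearson) method, so the bounds differ slightly, and the 99/116 count is an assumed illustration consistent with the overall 85.3%:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a proportion such as a positive
    predictive value. An approximation; exact methods (as likely used
    for the published figures) give slightly wider bounds here."""
    p = successes / n
    denom = 1 + z * z / n
    center = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (center - margin) / denom, (center + margin) / denom

# Hypothetical counts: 99 confirmed events among 116 reviewed records
# gives a PPV near the abstract's overall 85.3%.
lo, hi = wilson_ci(99, 116)
```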
First-listed SD/VA diagnostic codes identified in inpatient or emergency department encounters had very good agreement with clinical diagnoses and functioned well to identify outpatient-originating events. Researchers using such codes can be confident of the PPV when conducting studies of SD/VA originating in the outpatient setting.
validation studies; death, sudden, cardiac; arrhythmias, cardiac; pharmacoepidemiology
There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates the application of personalized drug therapies that utilize a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of personalized medicine intervention requires two special considerations: the distribution of relevant allelic variants in the study population, and whether the pharmacogenetic intervention is equally effective across subpopulations defined by those allelic variants.
The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double-blind, randomized clinical trial that will compare two approaches to initiation of warfarin therapy: genotype-guided dosing, the initiation of warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, the initiation of warfarin therapy based on algorithms using only clinical information.
We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus we calculate a sample size of 1238 to achieve a power level of 80% for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%.
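One plausible reading of how the 5.49% figure arises is that the detectable absolute difference is diluted by the fraction of enrollees who can actually benefit: prevalence times relative effect times a baseline outcome level. In the sketch below, the 60% prevalence and 15% relative effectiveness come from the trial's stated assumptions, but the baseline outcome level (0.61) and the standard deviation (0.35) are hypothetical placeholders, and the trial's actual derivation may differ.

```python
Z_ALPHA_2 = 1.96   # two-sided alpha = 0.05
Z_BETA = 0.8416    # power = 0.80

def diluted_difference(prevalence, relative_effect, baseline_mean):
    """Absolute detectable difference when only a fraction of enrollees
    (those with informative genotypes) can benefit from the intervention."""
    return prevalence * relative_effect * baseline_mean

def n_per_arm(delta, sd):
    """Standard two-sample comparison of means with equal allocation."""
    return 2 * (Z_ALPHA_2 + Z_BETA) ** 2 * sd ** 2 / delta ** 2

# 0.61 is a hypothetical baseline outcome level, not a trial parameter
delta = diluted_difference(0.60, 0.15, 0.61)  # -> 0.0549, i.e. 5.49%
n = n_per_arm(delta, 0.35)                    # SD of 0.35 is also hypothetical
```

The key sensitivity the abstract warns about is visible in the formula: the required sample size grows with the inverse square of the diluted difference, so a lower-than-assumed prevalence of informative genotypes shrinks delta and inflates n quickly.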
In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.
Neuregulin-1 (NRG-1) is a paracrine factor released by microvascular endothelial cells that has cardioprotective effects in animal models of heart failure. However, circulating NRG-1 has not been studied in human heart disease. We used a novel immunoassay to test whether circulating neuregulin-1β (NRG-1β) is associated with disease severity and clinical outcome in chronic heart failure.
Methods and Results
Serum NRG-1β was quantified in 899 outpatients in the Penn Heart Failure Study, a referral cohort representing a broad spectrum of systolic heart failure. Circulating NRG-1β was significantly elevated in patients with worse disease severity (NYHA Class IV median 6.2 versus 4.4 ng/mL for Class I, p=0.002). In adjusted models, NRG-1β was independently associated with an increased risk of death or cardiac transplantation over a median follow-up of 2.4 years (adjusted HR 1.58 [95% CI 1.04–2.39, p=0.03] comparing 4th versus 1st NRG-1β quartile). Associations with outcome differed by heart failure etiology and symptom severity, with the strongest associations observed in patients with ischemic cardiomyopathy (interaction p=0.008) and NYHA Class III/IV symptoms (interaction p=0.01). These findings were all independent of BNP, and assessment of NRG-1β and BNP jointly provided better risk stratification than each biomarker individually in patients with ischemic or NYHA Class III/IV heart failure.
Circulating NRG-1β is independently associated with heart failure severity and risk of death or cardiac transplantation. These findings support a role for NRG-1/ErbB signaling in human heart failure and identify serum NRG-1β as a novel biomarker that may have clinical applications.
Neuregulin; Heart Failure; Cardiomyopathy
To determine whether a potential pharmacokinetic interaction between warfarin and orally administered anti-infectives increases the risk of hospitalization for gastrointestinal (GI) bleeding in warfarin users.
We conducted a nested case-control and case-crossover study in US Medicaid data. Logistic regression was used to determine the association between GI bleeding and prior use of ciprofloxacin, levofloxacin, gatifloxacin, cotrimoxazole, or fluconazole, all versus no exposure and versus cephalexin, which would not be expected to interact with warfarin.
All anti-infectives examined exhibited an elevated odds ratio (OR) vs. no exposure. Using cephalexin as the reference category, ORs for cotrimoxazole (OR: 1.68 [95% CI: 1.21–2.33] in the prior 6–10 days) and fluconazole (OR: 2.09 [95% CI: 1.34–3.26] in the prior 11–15 days) were significantly elevated.
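Odds ratios of this form reduce to a 2x2 exposure-by-outcome table; a minimal sketch with Woolf's log-based confidence interval follows. All counts here are invented for illustration and bear no relation to the study's data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) 95% CI for a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)

# Invented counts for illustration only
or_, lo, hi = odds_ratio_ci(30, 70, 50, 250)
```

The interval is symmetric on the log scale, which is why published ORs typically have asymmetric bounds on the natural scale, as in the figures above.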
Warfarin users who had received any of the examined anti-infective agents showed a substantially increased risk of GI bleeding relative to unexposed users. However, evidence of a drug-drug interaction with warfarin, that is, risk exceeding that of the non-interacting comparator cephalexin, was present only for cotrimoxazole and fluconazole.
The objective of this study was to determine whether clinical, environmental, and genetic factors can be used to develop warfarin dosing algorithms for Caucasians and African Americans that perform better than empirical dosing at 5 mg/day.
From April 2002 through December 2005, 259 warfarin initiators were prospectively followed until they reached maintenance dose.
The Caucasian algorithm included 11 variables (R2=0.43) and predicted 51% of maintenance doses within 1 mg of the observed dose, compared with 29% for empirical 5 mg/day dosing (observed dose within 5 ± 1 mg). The African American algorithm included 10 variables (R2=0.28) and predicted 37% of doses within 1 mg of the observed dose, a small improvement over 5 mg/day (34%). These results were similar to those we obtained from testing other published algorithms.
The dosing algorithms in Caucasians explained <45% of the variability and the algorithms in African Americans performed only marginally better than giving 5 mg empirically.
Patient adherence to warfarin may influence anticoagulation control, yet adherence among warfarin users has not been rigorously studied.
Our goal was to quantify warfarin adherence over time and to compare electronic medication event monitoring systems (MEMS) cap measurements with both self-report and clinician assessment of patient adherence.
We performed a prospective cohort study of warfarin users at 3 Pennsylvania-based anticoagulation clinics and assessed pill-taking behaviors using MEMS caps, patient reports, and clinician assessments.
Among 145 participants, the mean percent of days of nonadherence by MEMS was 21.8% (standard deviation 21.1%). Participants were about 6 times more likely to take too few pills than to take extra pills (18.8% vs. 3.3%). Adherence changed over time, worsening over the first 6 months of monitoring and then improving beyond 6 months. Although clinicians were statistically better than chance at correctly labeling a participant’s adherence (odds ratio = 2.05, p = 0.015), their estimates often did not correlate with MEMS-cap data; clinicians judged participants to be “adherent” at 82.8% of visits that were categorized as moderately nonadherent using MEMS-cap data (≥20% nonadherent days). Similarly, at visits when participants were moderately nonadherent by MEMS, they self-reported perfect adherence 77.9% of the time.
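The MEMS-cap nonadherence measure can be sketched as the fraction of monitored days with no cap opening, with the study's ≥20% cutoff applied on top. The dates and opening pattern below are invented for illustration; a real implementation would also handle multiple openings per day and monitoring gaps.

```python
from datetime import date

def percent_nonadherent_days(openings, start, end):
    """Percent of days in [start, end] with no MEMS-cap opening.
    openings: iterable of dates on which the cap was opened."""
    days = (end - start).days + 1
    opened = {d for d in openings if start <= d <= end}  # dedupe per day
    return 100.0 * (days - len(opened)) / days

# Illustrative week of monitoring with two missed days (Jan 3 and Jan 6)
start, end = date(2024, 1, 1), date(2024, 1, 7)
openings = [date(2024, 1, d) for d in (1, 2, 4, 5, 7)]
pct = percent_nonadherent_days(openings, start, end)
moderately_nonadherent = pct >= 20.0  # study's >=20% threshold
```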
These results suggest that patients may benefit from adherence counseling even when they claim to be taking their warfarin or the clinician feels they are doing so, particularly several months into their course of therapy.
patient adherence; warfarin; medication event monitoring system
Cardiac troponin is more accurate than creatine kinase (CK) testing for detecting myocardial injury in patients with acute coronary syndromes (ACS), but its effects on clinical care compared with CK testing alone are open to question.
To test the effects of troponin I on medical decisions for patients undergoing cardiac enzyme testing.
Randomized, controlled trial.
Urban academic Veterans Affairs medical center.
Three hundred ninety-two patients presenting to the emergency department (ED) and outpatient settings with symptoms and/or electrocardiograms suggestive but not diagnostic of ACS.
Random assignment to linked CK-troponin I (CKTnI) testing or CK testing alone.
ED discharge and cardiac catheterization incidence (primary); ED medication use, inpatient noninvasive testing, revascularization procedures, discharge medications, and 8-week ED visits, hospitalizations, and procedures (secondary).
Groups were similar in all variables except history of heart failure (CK 26.8% vs CKTnI 17.0%). ACS comprised 12.2% of the cohort. ED discharge incidence was greater in the CKTnI arm (18% vs 9.6%; relative risk [RR], 1.83; 95% CI, 1.08 to 3.31; P=.02; number needed to test=12.6; 95% CI, 4.5 to 130). Troponin testing had no significant effect on catheterization incidence (18.2% vs 14.5%; RR, 1.19; 95% CI, 0.72 to 1.92; P>.20) or other outcomes except follow-up echocardiography (13.4% vs 7.4%; RR, 2.24; 95% CI, 1.11 to 4.69; P=.02).
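Relative risk and number needed to test are simple functions of the two event proportions. The counts below are invented to roughly match the reported 18% vs. 9.6% discharge rates; recomputing from rounded percentages gives an NNT near, but not exactly, the reported 12.6, which the study will have derived from exact counts.

```python
def risk_metrics(events_tx, n_tx, events_ctl, n_ctl):
    """Relative risk, absolute risk difference, and number needed to
    test (reciprocal of the absolute difference) for two proportions."""
    p1 = events_tx / n_tx
    p0 = events_ctl / n_ctl
    diff = p1 - p0
    return p1 / p0, diff, 1 / diff

# Invented counts for illustration, approximating 18% vs. 9.6%
rr, ard, nnt = risk_metrics(36, 200, 19, 198)
```

Note that the NNT's confidence interval is obtained by inverting the bounds of the risk difference, which is why it can be strikingly wide (here 4.5 to 130) even when the point estimate looks precise.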
In a veterans population undergoing cardiac enzyme testing, CKTnI testing led to more ED discharges than CK testing alone but had no effect on inpatient care and was associated with more echocardiograms in a follow-up period.
troponin; cardiac enzymes; cardiovascular diseases; acute coronary syndromes; chest pain