Perioperative myocardial infarction is a serious complication after non-cardiac surgery. We hypothesized that preoperative cardiac troponin T, detected with a novel high-sensitivity assay (hs-cTnT), would identify patients at risk of acute myocardial infarction (AMI) and long-term mortality after major non-cardiac surgery.
This was a prospective cohort study within the Vitamins in Nitrous Oxide (VINO) trial (n=608). Patients had been diagnosed with, or had multiple risk factors for, coronary artery disease and underwent major non-cardiac surgery. Cardiac troponin I (contemporary assay), cardiac troponin T (high-sensitivity assay), and 12-lead electrocardiograms were obtained before surgery, immediately after surgery, and on postoperative days 1, 2, and 3.
At baseline before surgery, 599 patients (98.5%) had a detectable hs-cTnT concentration, and 247 (41%) were above 14 ng/L (the 99th percentile). After surgery, 497 patients (82%) had a rise in hs-cTnT (median Δhs-cTnT +2.7 ng/L [IQR 0.7, 6.8]). During the first three postoperative days, 9 patients (2.5%) with a preoperative hs-cTnT <14 ng/L suffered an AMI, compared with 21 patients (8.6%) with a preoperative hs-cTnT >14 ng/L (odds ratio, 3.67; 95% CI 1.65–8.15). During long-term follow-up, 80 deaths occurred. The 3-year mortality rate was 11% in patients with a preoperative hs-cTnT <14 ng/L, compared with 25% in patients with a preoperative hs-cTnT >14 ng/L (adjusted hazard ratio, 2.17; 95% CI 1.19–3.96).
In this cohort of high-risk patients, preoperative hs-cTnT concentrations were significantly associated with postoperative myocardial infarction and long-term mortality after non-cardiac surgery.
The risk of venous thromboembolism (VTE) is higher after total hip or knee replacement surgery than after almost any other surgical procedure; warfarin sodium is commonly prescribed to reduce this perioperative risk. Warfarin has a narrow therapeutic window with high inter-individual dose variability and can cause hemorrhage. The Genetics-InFormatics Trial (GIFT) of Warfarin to Prevent Deep Vein Thrombosis (DVT) is a 2×2 factorial-design, randomized controlled trial designed to compare the safety and effectiveness of warfarin-dosing strategies. GIFT will answer two questions: (1) Does pharmacogenetic (PGx) dosing reduce the rate of adverse events in orthopedic patients? and (2) Is a lower target International Normalized Ratio (INR) non-inferior to a higher target INR in orthopedic patients? The composite primary endpoint of the trial comprises symptomatic and asymptomatic VTE (identified on screening ultrasonography), major hemorrhage, INR ≥ 4, and death.
pharmacogenetics; warfarin; randomized controlled trial; dosing algorithm
Via generation of vitamin K-dependent proteins, gamma-glutamyl carboxylase (GGCX) plays a critical role in the vitamin K cycle. Single nucleotide polymorphisms (SNPs) in GGCX, therefore, may affect dosing of the vitamin K antagonist, warfarin.
In a multi-centered, cross-sectional study of 985 patients prescribed warfarin therapy, we genotyped for two GGCX SNPs (rs11676382 and rs12714145) and quantified their relationship to therapeutic dose.
GGCX rs11676382 was a significant (p=0.03) predictor of residual dosing error and was associated with a 6.1% reduction in warfarin dose (95% CI: 0.6%-11.4%) per G allele. The prevalence was 14.1% in our predominantly (78%) Caucasian cohort, but the overall contribution to dosing accuracy was modest (partial R2 = 0.2%). GGCX rs12714145 was not a significant predictor of therapeutic dose (p = 0.26).
GGCX rs11676382 is a statistically significant predictor of warfarin dose, but the clinical relevance is modest. Given the potentially low marginal cost of adding this SNP to existing genotyping platforms, we have modified our non-profit website (www.WarfarinDosing.org) to accommodate knowledge of this variant.
gamma-glutamyl carboxylase; warfarin; pharmacogenetics; algorithm
Warfarin is commonly prescribed for prophylaxis and treatment of thromboembolism after orthopedic surgery. During warfarin initiation, out-of-range International Normalized Ratio (INR) values and adverse events are common.
In orthopedic patients beginning warfarin therapy, we developed and prospectively validated pharmacogenetic and clinical dose refinement algorithms to revise the estimated therapeutic dose after 4 days of therapy.
The pharmacogenetic algorithm used the cytochrome P450 (CYP) 2C9 genotype, smoking status, perioperative blood loss, liver disease, INR values, and dose history to predict the therapeutic dose. The R2 was 82% in a derivation cohort (N = 86) and 70% when used prospectively (N = 146). The R2 of the clinical algorithm, which used INR values and dose history to predict the therapeutic dose, was 57% in a derivation cohort (N = 178) and 48% in a prospective validation cohort (N = 146). In one month of prospective follow-up, the percentage of time spent in the therapeutic range was 7% higher (95% CI: 2.7%–11.7%) in the pharmacogenetic cohort. The risk of a laboratory or clinical adverse event was also significantly reduced in the pharmacogenetic cohort (hazard ratio 0.54; 95% CI: 0.29–0.97).
Warfarin dose adjustments that incorporate genotype and clinical variables available after four warfarin doses are accurate. In this non-randomized, prospective study, pharmacogenetic dose refinements were associated with more time spent in the therapeutic range and fewer laboratory or clinical adverse events. To facilitate gene-guided warfarin dosing we created a non-profit website, www.WarfarinDosing.org.
Warfarin; Pharmacogenetics; Dosing Algorithm; Anticoagulants; Orthopedic Surgery
Nitrous oxide causes an acute increase in plasma homocysteine that is more pronounced in patients with the MTHFR C677T or A1298C gene variant. In this randomized controlled trial, we sought to determine whether patients carrying the MTHFR C677T or A1298C variant had a higher risk of perioperative cardiac events after nitrous oxide anesthesia and whether this risk could be mitigated by B-vitamins.
We randomized adult patients with cardiac risk factors undergoing noncardiac surgery to receive nitrous oxide plus intravenous B-vitamins before and after surgery or to nitrous oxide and placebo. Serial cardiac biomarkers and 12-lead electrocardiograms were obtained. The primary study endpoint was the incidence of myocardial injury, as defined by cardiac troponin I elevation within the first 72 hours after surgery.
A total of 500 patients completed the trial. Patients homozygous for either the MTHFR C677T or the A1298C gene variant (n=98; 19.6%) had no increased rate of postoperative cardiac troponin I elevation compared with wild-type and heterozygous patients (11.2% vs. 14.0%; relative risk 0.96; 95% CI 0.85 to 1.07; p=0.48). B-vitamins blunted the rise in homocysteine but had no effect on cardiac troponin I elevation compared with placebo (13.2% vs. 13.6%; relative risk 1.02; 95% CI 0.78 to 1.32; p=0.91).
Neither the MTHFR C677T or A1298C gene variants nor an acute homocysteine increase was associated with perioperative cardiac troponin elevation after nitrous oxide anesthesia. B-vitamins blunt the nitrous oxide-induced homocysteine increase but have no effect on cardiac troponin elevation.
The clinical utility of genotype-guided (pharmacogenetically based) dosing of warfarin has been tested only in small clinical trials or observational studies, with equivocal results.
We randomly assigned 1015 patients to receive doses of warfarin during the first 5 days of therapy that were determined according to a dosing algorithm that included both clinical variables and genotype data or to one that included clinical variables only. All patients and clinicians were unaware of the dose of warfarin during the first 4 weeks of therapy. The primary outcome was the percentage of time that the international normalized ratio (INR) was in the therapeutic range from day 4 or 5 through day 28 of therapy.
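The primary outcome, percentage of time in the therapeutic INR range, is conventionally computed by linear interpolation between successive INR measurements (the Rosendaal method). A minimal sketch of that calculation follows; the exact implementation used in the trial is an assumption here:

```python
def ttr_linear_interpolation(days, inrs, low=2.0, high=3.0):
    """Percent time in therapeutic range (TTR) by linear interpolation
    between successive INR measurements (Rosendaal method)."""
    in_range = 0.0
    total = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = float(d1 - d0)
        total += span
        if i0 == i1:
            # Flat segment: entirely in range or entirely out of range
            in_range += span if low <= i0 <= high else 0.0
            continue
        lo_i, hi_i = sorted((i0, i1))
        # Fraction of the linear INR excursion overlapping the window
        overlap = max(0.0, min(hi_i, high) - max(lo_i, low))
        in_range += span * overlap / (hi_i - lo_i)
    return 100.0 * in_range / total
```

For example, a segment rising linearly from INR 1.0 to 3.0 spends half its duration inside a 2.0–3.0 window.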
At 4 weeks, the mean percentage of time in the therapeutic range was 45.2% in the genotype-guided group and 45.4% in the clinically guided group (adjusted mean difference [genotype-guided group minus clinically guided group], −0.2; 95% confidence interval, −3.4 to 3.1; P=0.91). There was also no significant between-group difference among patients with a predicted dose difference between the two algorithms of 1 mg per day or more. There was, however, a significant interaction between dosing strategy and race (P=0.003). Among black patients, the mean percentage of time in the therapeutic range was lower in the genotype-guided group than in the clinically guided group. The rates of the combined outcome of any INR of 4 or more, major bleeding, or thromboembolism did not differ significantly according to dosing strategy.
Genotype-guided dosing of warfarin did not improve anticoagulation control during the first 4 weeks of therapy. (Funded by the National Heart, Lung, and Blood Institute and others; COAG ClinicalTrials.gov number, NCT00839657.)
Treatments for non–ST-segment elevation myocardial infarction (NSTEMI) reduce ischemic events but increase bleeding. Baseline prediction of bleeding risk can complement ischemic risk prediction for optimizing NSTEMI care; however, existing models are not well suited for this purpose.
Methods and Results
We developed (n=71,277) and validated (n=17,857) a model that identifies 8 independent baseline predictors of in-hospital major bleeding among community-treated NSTEMI patients enrolled in the CRUSADE Quality Improvement Initiative. Model performance was tested by c statistics in the derivation and validation cohorts and according to post-admission treatment (i.e., invasive and antithrombotic therapy). The CRUSADE bleeding score (range 1–100 points) was created by assigning weighted integers corresponding to the coefficient of each variable. The rate of major bleeding increased by bleeding risk score quintiles: 3.1% very low risk (≤20); 5.5% low risk (21–30); 8.6% moderate risk (31–40); 11.9% high risk (41–50); and 19.5% very high risk (>50) (Ptrend<0.001). The c statistics for the major bleeding model (derivation=0.72 and validation=0.71) and risk score (derivation=0.71 and validation=0.70) were similar. The c statistics for the model among treatment subgroups were: ≥2 antithrombotics=0.72; <2 antithrombotics=0.73; invasive approach=0.73; conservative approach=0.68.
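The score-to-risk mapping reported above can be transcribed directly; this sketch returns the stratum label and the observed in-hospital major-bleeding rate for each score band (the band boundaries and rates come from the abstract, nothing else is asserted):

```python
def crusade_risk_category(score):
    """Map a CRUSADE bleeding score (range 1-100) to its risk stratum and
    the observed in-hospital major-bleeding rate (%) for that stratum."""
    if score <= 20:
        return "very low", 3.1
    if score <= 30:
        return "low", 5.5
    if score <= 40:
        return "moderate", 8.6
    if score <= 50:
        return "high", 11.9
    return "very high", 19.5
```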
The CRUSADE bleeding score quantifies risk for in-hospital major bleeding across all post-admission treatments, enhancing baseline risk assessment for NSTEMI care.
non-ST-segment elevation myocardial infarction; bleeding; risk assessment
CYP2C9 and VKORC1 genotypes predict therapeutic warfarin dose at initiation of therapy; however, the predictive ability of genetic information after a week or longer is unknown. Experts have hypothesized that genotype becomes irrelevant once International Normalized Ratio (INR) values are available because INR response reflects warfarin sensitivity.
We genotyped the participants in the Prevention of Recurrent Venous Thromboembolism (PREVENT) trial, who had idiopathic venous thromboemboli and began low-intensity warfarin (therapeutic INR 1.5-2.0) using a standard dosing protocol. To develop pharmacogenetic models, we quantified the effect of genotypes, clinical factors, previous doses, and INR on therapeutic warfarin dose in the 223 PREVENT participants who were randomized to warfarin and achieved stable therapeutic INRs.
A pharmacogenetic model using data from day 0 (before therapy initiation) explained 54% of the variability in therapeutic dose (R2). The R2 increased to 68% at day 7, 75% at day 14, and 77% at day 21, because of increasing contributions from prior doses and INR response. Although CYP2C9 and VKORC1 genotypes were significant independent predictors of therapeutic dose at each weekly interval, the magnitude of their predictive ability diminished over time: partial R2 of genotype was 43% at day 0, 12% at day 7, 4% at day 14, and 1% at day 21.
Over the first weeks of warfarin therapy, INR and prior dose become increasingly predictive of therapeutic dose, and genotype becomes less relevant. However, at day 7, genotype remains clinically relevant, accounting for 12% of therapeutic dose variability.
Pressure overload due to aortic stenosis (AS) causes maladaptive ventricular and vascular remodeling that can lead to pulmonary hypertension, heart failure symptoms, and adverse outcomes. Retarding or reversing this maladaptive remodeling and its unfavorable hemodynamic consequences has the potential to improve morbidity and mortality. Preclinical models of pressure overload have shown that phosphodiesterase type 5 (PDE5) inhibition is beneficial; however, the use of PDE5 inhibitors in patients with AS is controversial because of concerns about vasodilation and hypotension.
Methods and Results
We evaluated the safety and hemodynamic response of 20 subjects with severe symptomatic AS (mean aortic valve area 0.7±0.2 cm2, ejection fraction 60±14%) who received a single oral dose of sildenafil (40 mg or 80 mg). Compared with baseline, at 60 minutes sildenafil had reduced systemic (−12%, p<0.001) and pulmonary (−29%, p=0.002) vascular resistance and mean pulmonary artery (−25%, p<0.001) and wedge (−17%, p<0.001) pressures, and had increased systemic (+13%, p<0.001) and pulmonary (+45%, p<0.001) vascular compliance and stroke volume index (+8%, p=0.01). These changes were not dose dependent. Sildenafil caused a modest decrease in mean systemic arterial pressure (−11%, p<0.001) but was well tolerated, with no episodes of symptomatic hypotension.
This study shows for the first time that a single dose of a PDE5 inhibitor is safe and well-tolerated in patients with severe AS and is associated with acute improvements in pulmonary and systemic hemodynamics resulting in biventricular unloading. These findings support the need for longer-term studies to evaluate the role of PDE5 inhibition as adjunctive medical therapy in patients with AS.
aortic valve stenosis; heart failure; phosphodiesterase type 5 inhibitors; pulmonary hypertension; hemodynamics
AHA Scientific Statements; atrial fibrillation; atrium; epidemiology; prevention; risk factors
Approximately 1 million people in the United States and over 30 million worldwide are living with human immunodeficiency virus type 1 (HIV-1). While mortality from untreated infection approaches 100%, survival improves markedly with use of contemporary antiretroviral therapies (ART). In the United States, 25 drugs are approved for treating HIV-1, and increasing numbers are available in resource-limited countries. Safe and effective ART is a cornerstone in the global struggle against the acquired immunodeficiency syndrome. Variable responses to ART are due at least in part to human genetic variants that affect drug metabolism, drug disposition, and off-site drug targets. Defining the effects of human genetic variants on HIV treatment toxicity, efficacy, and pharmacokinetics has far-reaching implications. In 2010, the National Institute of Allergy and Infectious Diseases sponsored a workshop entitled "Pharmacogenomics – A Path Towards Personalized HIV Care." This article summarizes workshop objectives, presentations, discussions, and recommendations derived from this meeting.
HIV therapy; pharmacogenetics; pharmacogenomics; workshop
By guiding initial warfarin dose, pharmacogenetic (PGx) algorithms may improve the safety of warfarin initiation. However, once INR response is known, the contribution of PGx to dose refinements is uncertain. This study sought to develop and validate clinical and PGx dosing algorithms for warfarin dose refinement on days 6–11 after therapy initiation.
Materials and Methods
An international sample of 2,022 patients at 13 medical centers on 3 continents provided clinical, INR, and genetic data at treatment days 6–11 to predict therapeutic warfarin dose. Independent derivation and retrospective validation samples were composed by randomly dividing the population (80%/20%). Prior warfarin doses were weighted by their expected effect on S-warfarin concentrations using an exponential-decay pharmacokinetic model. The INR divided by that “effective” dose constituted a treatment response index.
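The dose weighting described above can be sketched as follows. The 33-hour S-warfarin half-life and the 24-hour dosing interval are illustrative assumptions, not the trial's fitted pharmacokinetic parameters:

```python
import math

def effective_dose(doses_mg, half_life_h=33.0, interval_h=24.0):
    """Weight prior daily warfarin doses by an exponential-decay model of
    S-warfarin elimination, so recent doses count more than older ones.
    Doses are listed oldest first; half_life_h is an illustrative value."""
    k = math.log(2) / half_life_h  # first-order elimination rate constant
    n = len(doses_mg)
    weights = [math.exp(-k * interval_h * (n - 1 - i)) for i in range(n)]
    # Normalize so a constant dosing history returns the daily dose itself
    return sum(w * d for w, d in zip(weights, doses_mg)) / sum(weights)

def response_index(inr, doses_mg):
    """Treatment response index: observed INR divided by the effective dose."""
    return inr / effective_dose(doses_mg)
```

A patient whose most recent doses were larger gets a larger effective dose than one whose early doses were larger, reflecting drug still in circulation.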
Treatment response index, age, amiodarone use, body surface area, warfarin indication, and target INR were associated with dose in the derivation sample. A clinical algorithm based on these factors was accurate: in the retrospective validation cohort, its R2 was 61.2% and its median absolute error (MAE) was 5.0 mg/week. Accuracy and safety were confirmed in a prospective cohort (N=43). CYP2C9 variants and VKORC1-1639 G→A were significant dose predictors in both the derivation and validation samples. In the retrospective validation cohort, the PGx algorithm had an R2 of 69.1% (P<0.05 vs. the clinical algorithm) and an MAE of 4.7 mg/week.
A pharmacogenetic warfarin dose-refinement algorithm based on clinical, INR, and genetic factors explained 69.1% of therapeutic warfarin dose variability after about one week of therapy.
warfarin; VKORC1; CYP2C9; pharmacogenetic
The objective of this study was to evaluate emergency medicine physician and nurse acceptance of nonnurse, nonphysician screening for geriatric syndromes.
This was a single-center emergency department (ED) survey of physicians and nurses conducted after an 8-month project. Geriatric technicians were paid medical-student research assistants who evaluated consenting ED patients older than 65 years for cognitive dysfunction, fall risk, or functional decline. The primary objective of this anonymous survey was to evaluate ED nurse and physician perceptions of the feasibility of geriatric screening and barriers to its implementation. As a secondary objective, respondents reported ongoing geriatric screening efforts independent of the research screeners.
The survey was completed by 72% of physicians and 33% of nurses. Most nurses and physicians identified geriatric technicians as beneficial to patients without impeding ED throughput. Fewer than 25% of physicians routinely screen for any geriatric syndromes. Nurses evaluated for fall risk significantly more often than physicians, but no other significant differences were noted in ongoing screening efforts.
Dedicated geriatric technicians are perceived by nurses and physicians as beneficial to patients with the potential to improve patient safety and clinical outcomes. Most nurses and physicians are not currently screening for any geriatric syndromes.
The application of pharmacogenetic results requires demonstrable correlations between a test result and an indicated specific course of action. We developed a computational decision-support tool that combines patient-specific genotype and phenotype information to provide strategic dosage guidance. This tool, through estimating quantitative and temporal parameters associated with the metabolism- and concentration-dependent response to warfarin, provides the necessary patient-specific context for interpreting international normalized ratio (INR) measurements.
We analyzed clinical information, plasma S-warfarin concentration, and CYP2C9 (cytochrome P450, family 2, subfamily C, polypeptide 9) and VKORC1 (vitamin K epoxide reductase complex, subunit 1) genotypes for 137 patients with stable INRs. Plasma S-warfarin concentrations were evaluated by VKORC1 genotype (−1639G>A). The steady-state plasma S-warfarin concentration was calculated with CYP2C9 genotype–based clearance rates and compared with actual measurements.
The plasma S-warfarin concentration required to yield the target INR response is significantly (P < 0.05) associated with VKORC1 −1639G>A genotype (GG, 0.68 mg/L; AG, 0.48 mg/L; AA, 0.27 mg/L). Modeling of the plasma S-warfarin concentration according to CYP2C9 genotype predicted 58% of the variation in measured S-warfarin concentration: Measured [S-warfarin] = 0.67(Estimated [S-warfarin]) + 0.16 mg/L.
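The two modeling steps above reduce to a standard steady-state relationship plus the reported calibration line. A minimal sketch; the genotype-specific clearance values are deliberately left as inputs rather than asserted:

```python
def steady_state_s_warfarin(daily_s_dose_mg, clearance_l_per_day):
    """Steady-state concentration Css = dosing rate / clearance. In practice
    the clearance would be assigned from CYP2C9 genotype; no genotype-specific
    clearance values are asserted here."""
    return daily_s_dose_mg / clearance_l_per_day

def calibrated_concentration(estimated_mg_per_l):
    """Calibration reported in the abstract:
    Measured [S-warfarin] = 0.67 * Estimated [S-warfarin] + 0.16 mg/L."""
    return 0.67 * estimated_mg_per_l + 0.16
```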
The target interval of plasma S-warfarin concentration required to yield a therapeutic INR can be predicted from the VKORC1 genotype (pharmacodynamics), and the progressive changes in S-warfarin concentration after repeated daily dosing can be predicted from the CYP2C9 genotype (pharmacokinetics). Combining the application of multivariate equations for estimating the maintenance dose with genotype-guided pharmacokinetics/pharmacodynamics modeling provides a powerful tool for maximizing the value of CYP2C9 and VKORC1 test results for ongoing application to patient care.
The aim of this study was to compare the accuracy of genetic tables and formal pharmacogenetic algorithms for warfarin dosing.
Pharmacogenetic algorithms based on regression equations can predict warfarin dose, but they require detailed mathematical calculations. A simpler alternative, recently added to the warfarin label by the U.S. Food and Drug Administration, is to use a genotype-stratified table to estimate warfarin dose. This table may increase the use of pharmacogenetic warfarin dosing in clinical practice; however, its accuracy has not been quantified.
A retrospective cohort study of 1,378 patients from 3 anticoagulation centers was conducted. Inclusion criteria were stable therapeutic warfarin dose and complete genetic and clinical data. Five dose prediction methods were compared: 2 methods using only clinical information (empiric 5 mg/day dosing and a formal clinical algorithm), 2 genetic tables (the new warfarin label table and a table based on mean dose stratified by genotype), and 1 formal pharmacogenetic algorithm, using both clinical and genetic information. For each method, the proportion of patients whose predicted doses were within 20% of their actual therapeutic doses was determined. Dosing methods were compared using McNemar’s chi-square test.
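The accuracy metric used here, the proportion of patients whose predicted dose falls within 20% of the actual therapeutic dose, is simple to state in code:

```python
def within_20_percent(predicted_mg_wk, actual_mg_wk):
    """True when a predicted dose falls within 20% of the actual dose."""
    return abs(predicted_mg_wk - actual_mg_wk) <= 0.2 * actual_mg_wk

def accuracy(predictions, actuals):
    """Proportion of patients whose predicted dose is within 20% of actual."""
    hits = sum(within_20_percent(p, a) for p, a in zip(predictions, actuals))
    return hits / len(predictions)
```

For a patient on 40 mg/week, predictions between 32 and 48 mg/week count as accurate.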
Warfarin dose prediction was significantly more accurate (all p < 0.001) with the pharmacogenetic algorithm (52%) than with all other methods: empiric dosing (37%; odds ratio [OR]: 2.2), clinical algorithm (39%; OR: 2.2), warfarin label (43%; OR: 1.8), and genotype mean dose table (44%; OR: 1.9).
Although genetic tables predicted warfarin dose better than empiric dosing, formal pharmacogenetic algorithms were the most accurate.
coumarins; dose prediction; dosing algorithms; FDA label; genetic tables; pharmacogenetics; warfarin
There is currently much interest in pharmacogenetics: determining the variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates personalized drug therapies that use a patient's genetic makeup to select a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of such a trial requires special considerations: the distribution of relevant allelic variants in the study population, and whether the pharmacogenetic intervention is equally effective across subpopulations defined by those variants.
The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double blind, randomized clinical trial that will compare two approaches to initiation of warfarin therapy: genotype-guided dosing, the initiation of warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, the initiation of warfarin therapy based on algorithms using only clinical information.
We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus we calculate a sample size of 1238 to achieve a power level of 80% for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%.
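The dilution logic behind this calculation, an effect present only in a genetic subgroup shrinks the population-level detectable difference, can be sketched with a standard normal-approximation sample-size formula. This does not reproduce the COAG calculation exactly; the standard-deviation input is an illustrative assumption:

```python
from math import ceil
from statistics import NormalDist

def diluted_difference(subgroup_effect, prevalence):
    """When the intervention acts only in carriers of certain variants, the
    population-level detectable difference shrinks to prevalence * effect."""
    return prevalence * subgroup_effect

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for comparing two means
    (e.g., percent time in therapeutic INR range)."""
    z = NormalDist().inv_cdf
    z_alpha, z_beta = z(1 - alpha / 2), z(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2)
```

Shrinking the detectable difference (for example, from 8% to 5.49%) raises the required sample size, which is why power is sensitive to the assumed variant prevalence.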
In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.
Using a two-period, group-randomized study, we tested whether a technology-assisted pharmacist intervention improved physician adherence to coronary heart disease (CHD) secondary prevention medication guidelines. After an observation period, physician practices were randomized to intervention or control arms. In the intervention arm, alerts prompted a pharmacist to communicate with the responsible physician about secondary prevention medications. The intervention significantly improved the proportion of patients discharged on appropriate secondary prevention medications.
In order to institute early hospital-wide interventions, we constructed a reliable automated model for identifying newly admitted patients with congestive heart failure using electronically captured administrative and clinical data.
To compare the satisfaction and knowledge of patients who have their warfarin managed by their physician or by a multidisciplinary, telephone-based anticoagulation service (ACS) and to assess referring physicians' satisfaction with the ACS.
DESIGN AND PARTICIPANTS
We surveyed 300 patients taking warfarin (mean age 73 years): 150 at health centers randomized to have access to an ACS, and 150 at control health centers without ACS access. We also surveyed 17 physicians who refer patients to the ACS.
SETTING
Eight outpatient health centers in Missouri and Southern Illinois.
We asked patients about the timeliness of international normalized ratio (INR) monitoring, perceived safety of warfarin, overall satisfaction with their warfarin management, and knowledge of what a high INR meant. We asked physicians at ACS-available health centers how many minutes they saved per INR by referring patients to the ACS, their satisfaction with the ACS, and their willingness to recommend the ACS to a colleague.
As compared with patients at control health centers, patients at ACS-available health centers were more satisfied with the timeliness of getting blood test results (mean 4.31 vs 4.03, P = .02), were more likely to know what a safe INR value was (45% vs 15%, P = .001), and felt safer taking warfarin (mean 5.7 vs 5.2, P = .04). Physicians reported that using the ACS saved, on average, 4 minutes of their own time and 13 minutes of their staff's time per INR. All physicians recommended the ACS to colleagues and were highly satisfied with it.
A telephone-based ACS can be endorsed by primary-care physicians and improve patients' satisfaction with and knowledge about their antithrombotic therapy.
patient satisfaction; physician satisfaction; anticoagulation; disease-state management; warfarin
Cost effective in patients at high risk of stroke, unless INR is well controlled