Obesity increases the risk for venous thromboembolism (VTE), but whether high-dose thromboprophylaxis is safe and effective in morbidly obese inpatients is unknown.
To quantify the efficacy and safety of high-dose thromboprophylaxis with heparin or enoxaparin in inpatients with weight > 100 kilograms (kg) within the BJC HealthCare system.
In a retrospective cohort study, we analyzed 9241 inpatients with weight > 100 kg discharged from three hospitals in the BJC HealthCare system from 2010 through 2012. We compared the incidence of VTE in patients who received high-dose thromboprophylaxis (heparin 7500 units three times daily or enoxaparin 40 milligrams (mg) twice daily) to those who received standard doses (heparin 5000 units two or three times daily or enoxaparin 40 mg once daily). The primary efficacy outcome was hospital-acquired VTE identified by International Classification of Diseases (ICD)-9 diagnosis codes. The primary safety outcome was bleeding events identified by ICD-9 codes.
Among the 3928 morbidly obese inpatients (weight > 100 kg and body mass index (BMI) ≥ 40 kg/m2), high-dose thromboprophylaxis approximately halved the odds of symptomatic VTE (odds ratio (OR) 0.52, 95% CI 0.27-1.00; p-value (p) = 0.050). The rate of VTE was 1.48% (35/2369) in morbidly obese inpatients who received standard doses of thromboprophylaxis, compared with 0.77% (12/1559) in those who received high doses. High-dose thromboprophylaxis did not increase bleeding (OR 0.84, 95% CI 0.66-1.07, p = 0.15). Independent predictors of VTE included surgery, male sex, cancer, and BMI.
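The unadjusted odds ratio implied by these counts can be checked directly. The sketch below (Python, illustrative only; the abstract's OR comes from its own regression model, which the raw counts here happen to match) computes it:

```python
# Illustrative check of the reported odds ratio, using the raw counts from
# this abstract: 35/2369 VTE events with standard-dose prophylaxis vs.
# 12/1559 with high-dose prophylaxis.
def odds_ratio(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio of group A relative to group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# High-dose vs. standard-dose: ~0.52, consistent with the reported OR.
print(round(odds_ratio(12, 1559, 35, 2369), 2))
```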
High-dose thromboprophylaxis nearly halves the rate of VTE in morbidly obese inpatients.
Obesity; obese inpatient; thromboprophylaxis; venous thromboembolism
Venous thromboembolism (VTE) is the most common preventable cause of morbidity and mortality in the hospital. Adequate thromboprophylaxis has reduced the rate of hospital-acquired VTE substantially; however, some inpatients still develop VTE even when they are prescribed thromboprophylaxis. Predictors associated with thromboprophylaxis failure are unclear. In this study, we aimed to identify risk factors for inpatient VTE despite thromboprophylaxis.
Materials and methods
We conducted a case-control study to identify independent predictors for inpatient VTE. Among patients discharged from the BJC HealthCare system between January 2010 and May 2011, we matched 94 cases who developed in-hospital VTE while taking thromboprophylaxis to 272 controls who did not develop VTE. Matching was done by hospital, patient age, month and year of discharge. We used multivariate conditional logistic regression to develop a VTE prediction model.
We identified five independent risk factors for in-hospital VTE despite thromboprophylaxis: hospitalization for cranial surgery, intensive care unit admission, admission leukocyte count >13,000/mm3, presence of an indwelling central venous catheter, and admission from a long-term care facility.
We identified five risk factors associated with the development of VTE despite thromboprophylaxis in the hospital setting. By recognizing these high-risk patients, clinicians can prescribe aggressive VTE prophylaxis judiciously and remain vigilant for signs or symptoms of VTE.
venous thromboembolism; thromboprophylaxis; anticoagulation; risk factors
Increasingly, clinicians and researchers are using administrative data for clinical and outcomes research. However, they continue to question the accuracy of using International Classification of Diseases 9th Revision (ICD-9) codes alone to capture diagnoses, especially venous thromboembolism (VTE), in administrative data.
We tested the hypothesis that incorporation of treatment data and/or common procedural terminology (CPT) codes could improve accuracy of administrative data in detecting VTE.
Using the Veterans Affairs Central Cancer Registry, we compared three competing algorithms by performing three cross-sectional studies. Algorithm 1 identified patients by ICD-9 codes alone. Algorithm 2 required VTE treatment in addition to ICD-9 codes. Algorithm 3 required a VTE diagnostic CPT code in addition to treatment and ICD-9 criteria.
The accuracy of ICD-9 codes alone for detection of VTE was marginal, with a PPV of 72%. The PPV was improved to 91% after addition of treatment data (algorithm 2). As compared to algorithm 2, addition of CPT codes (algorithm 3) did not significantly increase the accuracy of detecting VTE (PPV 92%), but decreased sensitivity from 72% to 67%.
Accuracy of VTE detection significantly improved with addition of treatment data to ICD-9 codes. This approach should facilitate use of administrative data to assess the incidence, epidemiology, and outcomes of VTE.
Venous Thromboembolism; Administrative Data; Health Service Research
We evaluated clinical outcomes associated with ESA use in LVAD-supported patients.
Use of erythropoiesis stimulating agents (ESAs) in patients with left ventricular assist devices (LVADs) may minimize blood transfusions and decrease allosensitization. However, ESAs increase thrombotic events, which is concerning because LVADs are prone to pump thrombosis (PT).
We retrospectively reviewed 221 patients at our center who received a HeartMate II® LVAD between 1/2009 and 6/2013. Patients were divided into those who received ESAs during index admission (n = 121) and those who did not (n = 100). Suspected PT was defined as evidence of thrombus in the LVAD or severe hemolysis (LDH > 1,000 mg/dL or plasma free hemoglobin > 40 mg/dL). Outcomes were compared between cohorts using inverse probability-weighted analyses.
During a mean follow-up of 14.2 ± 11.9 months, suspected PT occurred in 37 patients (ESA 23%, no-ESA 12%; P = 0.03). The ESA cohort received ESAs 13.9 ± 60.9 days after LVAD implantation. At 180 days, event-free rates for suspected PT were 78.6% with ESA vs. 94.5% without (P < 0.001). ESA use was associated with higher rates of suspected PT (HR 2.35, 95% CI 1.38-4.00; P = 0.002). For every 100-unit increase in cumulative ESA dosage, the hazard of suspected PT increased by 10% (HR 1.10, 95% CI 1.04-1.16; P < 0.001). After inverse probability weighting, ESA use was associated with a significantly higher rate of all-cause mortality (HR 1.62, 95% CI 1.12-2.33; P = 0.01).
ESA use in LVAD patients is associated with higher rates of suspected PT.
Erythropoiesis stimulating agent; left ventricular assist device; thrombosis
Perioperative myocardial infarction is a serious complication after non-cardiac surgery. We hypothesized that preoperative cardiac troponin T detected with a novel high-sensitivity (hs-cTnT) assay will identify patients at risk of acute myocardial infarction (AMI) and long-term mortality after major non-cardiac surgery.
This was a prospective cohort study within the Vitamins in Nitrous Oxide (VINO) trial (n=608). Patients had been diagnosed with or had multiple risk factors for coronary artery disease and underwent major non-cardiac surgery. Cardiac troponin I (contemporary assay) and troponin T (high-sensitivity assay), and 12-lead electrocardiograms were obtained before and immediately after surgery and on postoperative day 1, 2 and 3.
At baseline before surgery, 599 patients (98.5%) had a detectable hs-cTnT concentration and 247 (41%) were above 14 ng/L (99th percentile). After surgery, 497 patients (82%) had a rise in hs-cTnT (median Δhs-cTnT +2.7 ng/L [IQR 0.7, 6.8]). During the first three postoperative days, 9 patients (2.5%) with a preoperative hs-cTnT <14 ng/L suffered from AMI, compared to 21 patients (8.6%) with a preoperative hs-cTnT >14 ng/L (odds ratio, 3.67; 95% CI 1.65–8.15). During long-term follow-up, 80 deaths occurred. The 3-year mortality rate was 11% in patients with a preoperative hs-cTnT concentration <14 ng/L compared to 25% in patients with a preoperative hs-cTnT >14 ng/L (adjusted hazard ratio, 2.17; 95% CI 1.19–3.96).
In this cohort of high-risk patients, preoperative hs-cTnT concentrations were significantly associated with postoperative myocardial infarction and long-term mortality after non-cardiac surgery.
Bleeding complications are common and decrease the odds of survival in children supported with extracorporeal membrane oxygenation (ECMO). The role of platelet dysfunction on ECMO-induced coagulopathy and resultant bleeding complications is not well understood. The primary objective of this pilot study was to determine the incidence and magnitude of platelet dysfunction according to thromboelastography (TEG®)–platelet mapping (PM) testing.
Retrospective chart review of children <18 years old who required ECMO at a tertiary-level hospital. We collected TEG®–PM and conventional coagulation test data, as well as demographics, medications, blood products administered, and clinical outcomes. We defined severe platelet dysfunction as <50% aggregation in response to an agonist.
We identified 24 of 46 children on ECMO who had TEG®–PM performed during the study period. The incidence of severe bleeding was 42% and mortality was 54% in our study cohort. In all samples measured, severe qualitative platelet dysfunction was more common for adenosine diphosphate (ADP)-mediated aggregation (92%) than for arachidonic acid (AA)-mediated aggregation (75%) (p = 0.001). ADP-mediated platelet aggregation was also significantly lower than AA-mediated platelet aggregation [15% (interquartile range, IQR 2.8–48) vs. 49% (IQR 22–82.5), p < 0.001]. There was no difference in kaolin-activated heparinase TEG® parameters between the bleeding and non-bleeding groups. Only absolute platelet count and TEG®–PM had increased predictive value on receiver operating characteristic analyses for severe bleeding and mortality compared to activated clotting time.
We found frequent and severe qualitative platelet dysfunction on TEG®–PM testing in children on ECMO. Larger studies are needed to determine if the assessment of qualitative platelet function by TEG®–PM can improve prediction of bleeding complications for children on ECMO.
platelet mapping; thromboelastography; ECMO; anticoagulation; platelet dysfunction; heparin; ECMO-induced coagulopathy
The risk of venous thromboembolism (VTE) is higher after total hip or knee replacement surgery than after almost any other surgical procedure; warfarin sodium is commonly prescribed to reduce this peri-operative risk. Warfarin has a narrow therapeutic window with high inter-individual dose variability and can cause hemorrhage. The Genetics-InFormatics Trial (GIFT) of Warfarin to Prevent Deep Vein Thrombosis (DVT) is a 2×2 factorial-design, randomized controlled trial designed to compare the safety and effectiveness of warfarin-dosing strategies. GIFT will answer two questions: (1) Does pharmacogenetic (PGx) dosing reduce the rate of adverse events in orthopedic patients; and (2) Is a lower target International Normalized Ratio (INR) non-inferior to a higher target INR in orthopedic participants? The composite primary endpoint of the trial is symptomatic and asymptomatic VTE (identified on screening ultrasonography), major hemorrhage, INR ≥ 4, and death.
pharmacogenetics; warfarin; randomized controlled trial; dosing algorithm
To determine whether genetic variants associated with warfarin dose variability were associated with increased risk of major bleeding during warfarin therapy.
Materials & methods
Using Vanderbilt’s DNA biobank we compared the prevalence of CYP2C9, VKORC1 and CYP4F2 variants in 250 cases with major bleeding and 259 controls during warfarin therapy.
CYP2C9*3 was the only allele that differed significantly between cases (14.2%) and controls (7.8%; p = 0.022). Among the 214 (85.6%) cases with a major bleed 30 or more days after warfarin initiation, CYP2C9*3 was the only variant associated with bleeding (adjusted odds ratio: 2.05; 95% CI: 1.04, 4.04).
The CYP2C9*3 allele may double the risk of major bleeding among patients taking warfarin for 30 or more days.
CYP2C9; CYP4F2; pharmacogenetics; risk of major bleeding; VKORC1; warfarin
Via generation of vitamin K-dependent proteins, gamma-glutamyl carboxylase (GGCX) plays a critical role in the vitamin K cycle. Single nucleotide polymorphisms (SNPs) in GGCX, therefore, may affect dosing of the vitamin K antagonist, warfarin.
In a multi-centered, cross-sectional study of 985 patients prescribed warfarin therapy, we genotyped for two GGCX SNPs (rs11676382 and rs12714145) and quantified their relationship to therapeutic dose.
GGCX rs11676382 was a significant (p=0.03) predictor of residual dosing error and was associated with a 6.1% reduction in warfarin dose (95% CI: 0.6%-11.4%) per G allele. The prevalence was 14.1% in our predominantly (78%) Caucasian cohort, but the overall contribution to dosing accuracy was modest (partial R2 = 0.2%). GGCX rs12714145 was not a significant predictor of therapeutic dose (p = 0.26).
GGCX rs11676382 is a statistically significant predictor of warfarin dose, but the clinical relevance is modest. Given the potentially low marginal cost of adding this SNP to existing genotyping platforms, we have modified our non-profit website (www.WarfarinDosing.org) to accommodate knowledge of this variant.
gamma-glutamyl carboxylase; warfarin; pharmacogenetics; algorithm
Current dosing practices for warfarin are empiric and result in the need for frequent dose changes as the international normalized ratio gets too high or too low. As a result, patients are put at increased risk for thromboembolism, bleeding, and premature discontinuation of anticoagulation therapy. Prior research has identified clinical and genetic factors that can alter warfarin dose requirements, but few randomized clinical trials have examined the utility of using clinical and genetic information to improve anticoagulation control or clinical outcomes among a large, diverse group of patients initiating warfarin.
The COAG trial is a multicenter, double-blind, randomized trial comparing 2 approaches to guiding warfarin therapy initiation: initiation of warfarin therapy based on algorithms using clinical information plus an individual's genotype using genes known to influence warfarin response (“genotype-guided dosing”) versus only clinical information (“clinical-guided dosing”) (www.clinicaltrials.gov Identifier: NCT00839657).
The COAG trial design is described. The study hypothesis is that, among 1,022 enrolled patients, genotype-guided dosing relative to clinical-guided dosing during the initial dosing period will increase the percentage of time that patients spend in the therapeutic international normalized ratio range in the first 4 weeks of therapy.
The COAG will determine if genetic information provides added benefit above and beyond clinical information alone. (Am Heart J 2013;166:435-441.e2.)
Warfarin is commonly prescribed for prophylaxis and treatment of thromboembolism after orthopedic surgery. During warfarin initiation, out-of-range International Normalized Ratio (INR) values and adverse events are common.
In orthopedic patients beginning warfarin therapy, we developed and prospectively validated pharmacogenetic and clinical dose refinement algorithms to revise the estimated therapeutic dose after 4 days of therapy.
The pharmacogenetic algorithm used the cytochrome P450 (CYP) 2C9 genotype, smoking status, perioperative blood loss, liver disease, INR values, and dose history to predict the therapeutic dose. The R2 was 82% in a derivation cohort (N = 86), and 70% when used prospectively (N = 146). The R2 of the clinical algorithm that used INR values and dose history to predict the therapeutic dose was 57% in a derivation cohort (N = 178), and 48% in a prospective validation cohort (N = 146). In one month of prospective follow-up, the percent time spent in the therapeutic range was 7% higher (95% CI: 2.7%–11.7%) in the pharmacogenetic cohort. The risk of laboratory or clinical adverse event was also significantly reduced in the pharmacogenetic cohort (Hazard Ratio 0.54; 95% CI: 0.29–0.97).
Warfarin dose adjustments that incorporate genotype and clinical variables available after four warfarin doses are accurate. In this non-randomized, prospective study, pharmacogenetic dose refinements were associated with more time spent in the therapeutic range and fewer laboratory or clinical adverse events. To facilitate gene-guided warfarin dosing we created a non-profit website, www.WarfarinDosing.org.
Warfarin; Pharmacogenetics; Dosing Algorithm; Anticoagulants; Orthopedic Surgery
Nitrous oxide causes an acute increase in plasma homocysteine that is more pronounced in patients with the MTHFR C677T or A1298C gene variant. In this randomized controlled trial we sought to determine if patients carrying the MTHFR C677T or A1298C variant had a higher risk for perioperative cardiac events after nitrous oxide anesthesia and if this risk could be mitigated by B-vitamins.
We randomized adult patients with cardiac risk factors undergoing noncardiac surgery to receive nitrous oxide plus intravenous B-vitamins before and after surgery or to nitrous oxide and placebo. Serial cardiac biomarkers and 12-lead electrocardiograms were obtained. The primary study endpoint was the incidence of myocardial injury, as defined by cardiac troponin I elevation within the first 72 hours after surgery.
A total of 500 patients completed the trial. Patients who were homozygous for either MTHFR C677T or A1298C gene variant (n= 98; 19.6%) had no increased rate of postoperative cardiac troponin I elevation compared to wild-type and heterozygous patients (11.2% vs. 14.0%; relative risk 0.96, 95% CI 0.85 to 1.07, p=0.48). B-vitamins blunted the rise in homocysteine, but had no effect on cardiac troponin I elevation compared to patients receiving placebo (13.2% vs. 13.6%; relative risk 1.02, 95% CI 0.78 to 1.32, p=0.91).
Neither the MTHFR C677T nor the A1298C gene variant, nor the acute increase in homocysteine, was associated with perioperative cardiac troponin elevation after nitrous oxide anesthesia. B-vitamins blunt the nitrous oxide-induced homocysteine increase but have no effect on cardiac troponin elevation.
The clinical utility of genotype-guided (pharmacogenetically based) dosing of warfarin has been tested only in small clinical trials or observational studies, with equivocal results.
We randomly assigned 1015 patients to receive doses of warfarin during the first 5 days of therapy that were determined according to a dosing algorithm that included both clinical variables and genotype data or to one that included clinical variables only. All patients and clinicians were unaware of the dose of warfarin during the first 4 weeks of therapy. The primary outcome was the percentage of time that the international normalized ratio (INR) was in the therapeutic range from day 4 or 5 through day 28 of therapy.
At 4 weeks, the mean percentage of time in the therapeutic range was 45.2% in the genotype-guided group and 45.4% in the clinically guided group (adjusted mean difference [genotype-guided group minus clinically guided group], −0.2; 95% confidence interval, −3.4 to 3.1; P=0.91). There also was no significant between-group difference among patients with a predicted dose difference between the two algorithms of 1 mg per day or more. There was, however, a significant interaction between dosing strategy and race (P=0.003). Among black patients, the mean percentage of time in the therapeutic range was less in the genotype-guided group than in the clinically guided group. The rates of the combined outcome of any INR of 4 or more, major bleeding, or thromboembolism did not differ significantly according to dosing strategy.
Genotype-guided dosing of warfarin did not improve anticoagulation control during the first 4 weeks of therapy. (Funded by the National Heart, Lung, and Blood Institute and others; COAG ClinicalTrials.gov number, NCT00839657.)
Treatments for non–ST-segment elevation myocardial infarction (NSTEMI) reduce ischemic events but increase bleeding. Baseline prediction of bleeding risk can complement ischemic risk prediction for optimizing NSTEMI care; however, existing models are not well suited for this purpose.
Methods and Results
We developed (n=71,277) and validated (n=17,857) a model that identifies 8 independent baseline predictors of in-hospital major bleeding among community-treated NSTEMI patients enrolled in the CRUSADE Quality Improvement Initiative. Model performance was tested by c statistics in the derivation and validation cohorts and according to post-admission treatment (i.e., invasive and antithrombotic therapy). The CRUSADE bleeding score (range 1–100 points) was created by assigning weighted integers corresponding to the coefficient of each variable. The rate of major bleeding increased by bleeding risk score quintiles: 3.1% very low risk (≤20); 5.5% low risk (21–30); 8.6% moderate risk (31–40); 11.9% high risk (41–50); and 19.5% very high risk (>50) (Ptrend<0.001). The c statistics for the major bleeding model (derivation=0.72 and validation=0.71) and risk score (derivation=0.71 and validation=0.70) were similar. The c statistics for the model among treatment subgroups were: ≥2 antithrombotics=0.72; <2 antithrombotics=0.73; invasive approach=0.73; conservative approach=0.68.
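The quintile cut points above map directly onto risk strata. A minimal sketch of that mapping (Python, illustrative only; the official CRUSADE tool first computes the underlying score from the 8 weighted baseline predictors, which is not reproduced here):

```python
def crusade_risk_stratum(score):
    """Map a CRUSADE bleeding score (1-100) to the risk stratum and the
    observed in-hospital major bleeding rate (%) reported in the abstract."""
    if not 1 <= score <= 100:
        raise ValueError("CRUSADE bleeding score ranges from 1 to 100")
    if score <= 20:
        return "very low", 3.1
    if score <= 30:
        return "low", 5.5
    if score <= 40:
        return "moderate", 8.6
    if score <= 50:
        return "high", 11.9
    return "very high", 19.5
```

For example, a patient scoring 35 falls in the moderate stratum, with an observed major bleeding rate of 8.6%.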
The CRUSADE bleeding score quantifies risk for in-hospital major bleeding across all post-admission treatments, enhancing baseline risk assessment for NSTEMI care.
non-ST-segment elevation myocardial infarction; bleeding; risk assessment
CYP2C9 and VKORC1 genotypes predict therapeutic warfarin dose at initiation of therapy; however, the predictive ability of genetic information after a week or longer is unknown. Experts have hypothesized that genotype becomes irrelevant once International Normalized Ratio (INR) values are available because INR response reflects warfarin sensitivity.
We genotyped the participants in the Prevention of Recurrent Venous Thromboembolism (PREVENT) trial, who had idiopathic venous thromboemboli and began low-intensity warfarin (therapeutic INR 1.5-2.0) using a standard dosing protocol. To develop pharmacogenetic models, we quantified the effect of genotypes, clinical factors, previous doses, and INR on therapeutic warfarin dose in the 223 PREVENT participants who were randomized to warfarin and achieved stable therapeutic INRs.
A pharmacogenetic model using data from day 0 (before therapy initiation) explained 54% of the variability in therapeutic dose (R2). The R2 increased to 68% at day 7, 75% at day 14, and 77% at day 21, because of increasing contributions from prior doses and INR response. Although CYP2C9 and VKORC1 genotypes were significant independent predictors of therapeutic dose at each weekly interval, the magnitude of their predictive ability diminished over time: partial R2 of genotype was 43% at day 0, 12% at day 7, 4% at day 14, and 1% at day 21.
Over the first weeks of warfarin therapy, INR and prior dose become increasingly predictive of therapeutic dose, and genotype becomes less relevant. However, at day 7, genotype remains clinically relevant, accounting for 12% of therapeutic dose variability.
Pressure overload due to aortic stenosis (AS) causes maladaptive ventricular and vascular remodeling that can lead to pulmonary hypertension, heart failure symptoms, and adverse outcomes. Retarding or reversing this maladaptive remodeling and its unfavorable hemodynamic consequences has potential to improve morbidity and mortality. Preclinical models of pressure overload have shown that phosphodiesterase type 5 (PDE5) inhibition is beneficial; however, the use of PDE5 inhibitors in patients with AS is controversial because of concerns about vasodilation and hypotension.
Methods and Results
We evaluated the safety and hemodynamic response of 20 subjects with severe symptomatic AS (mean aortic valve area 0.7±0.2 cm2, ejection fraction 60±14%) who received a single oral dose of sildenafil (40 mg or 80 mg). Compared to baseline, after 60 minutes sildenafil reduced systemic (−12%, p<0.001) and pulmonary (−29%, p=0.002) vascular resistance, mean pulmonary artery (−25%, p<0.001) and wedge (−17%, p<0.001) pressure, and increased systemic (+13%, p<0.001) and pulmonary (+45%, p<0.001) vascular compliance and stroke volume index (+8%, p=0.01). These changes were not dose dependent. Sildenafil caused a modest decrease in mean systemic arterial pressure (−11%, p<0.001), but was well-tolerated with no episodes of symptomatic hypotension.
This study shows for the first time that a single dose of a PDE5 inhibitor is safe and well-tolerated in patients with severe AS and is associated with acute improvements in pulmonary and systemic hemodynamics resulting in biventricular unloading. These findings support the need for longer-term studies to evaluate the role of PDE5 inhibition as adjunctive medical therapy in patients with AS.
aortic valve stenosis; heart failure; phosphodiesterase type 5 inhibitors; pulmonary hypertension; hemodynamics
AHA Scientific Statements; atrial fibrillation; atrium; epidemiology; prevention; risk factors
Approximately 1 million people in the United States and over 30 million worldwide are living with human immunodeficiency virus type 1 (HIV-1). While mortality from untreated infection approaches 100%, survival improves markedly with use of contemporary antiretroviral therapies (ART). In the United States, 25 drugs are approved for treating HIV-1, and increasing numbers are available in resource-limited countries. Safe and effective ART is a cornerstone in the global struggle against the acquired immunodeficiency syndrome. Variable responses to ART are due at least in part to human genetic variants that affect drug metabolism, drug disposition, and off-site drug targets. Defining effects of human genetic variants on HIV treatment toxicity, efficacy, and pharmacokinetics has far-reaching implications. In 2010, the National Institute of Allergy and Infectious Diseases sponsored a workshop entitled, Pharmacogenomics – A Path Towards Personalized HIV Care. This article summarizes workshop objectives, presentations, discussions, and recommendations derived from this meeting.
HIV therapy; pharmacogenetics; pharmacogenomics; workshop
By guiding initial warfarin dose, pharmacogenetic (PGx) algorithms may improve the safety of warfarin initiation. However, once INR response is known, the contribution of PGx to dose refinements is uncertain. This study sought to develop and validate clinical and PGx dosing algorithms for warfarin dose refinement on days 6–11 after therapy initiation.
Materials and Methods
An international sample of 2,022 patients at 13 medical centers on 3 continents provided clinical, INR, and genetic data at treatment days 6–11 to predict therapeutic warfarin dose. Derivation and retrospective validation samples were created by randomly dividing the population (80%/20%). Prior warfarin doses were weighted by their expected effect on S-warfarin concentrations using an exponential-decay pharmacokinetic model. The INR divided by that “effective” dose constituted a treatment response index.
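The decay-weighted “effective” dose and the treatment response index described above can be sketched as follows (Python; the 33-hour S-warfarin half-life is an assumed illustrative value, as the abstract does not state the constant used):

```python
import math

def effective_dose(daily_doses_mg, half_life_hours=33.0):
    """Weight prior daily warfarin doses by an exponential-decay model of
    S-warfarin elimination. daily_doses_mg[0] is the most recent dose,
    daily_doses_mg[1] the dose one day earlier, and so on. The half-life
    default is an assumed value for illustration."""
    k = math.log(2) / half_life_hours  # elimination rate constant (1/h)
    return sum(dose * math.exp(-k * 24.0 * days_ago)
               for days_ago, dose in enumerate(daily_doses_mg))

def treatment_response_index(inr, daily_doses_mg):
    """INR divided by the decay-weighted 'effective' dose, as described."""
    return inr / effective_dose(daily_doses_mg)
```

For example, three consecutive 5 mg doses contribute less than 15 mg of “effective” dose, because the earlier doses have partially decayed by the time the INR is drawn.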
Treatment response index, age, amiodarone use, body surface area, warfarin indication, and target INR were associated with dose in the derivation sample. A clinical algorithm based on these factors was accurate: in the retrospective validation cohort, its R2 was 61.2% and its median absolute error (MAE) was 5.0 mg/week. Accuracy and safety were confirmed in a prospective cohort (N = 43). CYP2C9 variants and VKORC1-1639 G→A were significant dose predictors in both the derivation and validation samples. In the retrospective validation cohort, the PGx algorithm had an R2 of 69.1% (P < 0.05 vs. the clinical algorithm) and an MAE of 4.7 mg/week.
A pharmacogenetic warfarin dose-refinement algorithm based on clinical, INR, and genetic factors can explain at least 69.1% of therapeutic warfarin dose variability after about one week of therapy.
warfarin; VKORC1; CYP2C9; pharmacogenetic
The objective of this study was to evaluate emergency medicine physician and nurse acceptance of nonnurse, nonphysician screening for geriatric syndromes.
This was a single-center emergency department (ED) survey of physicians and nurses after an 8-month project. Geriatric technicians were paid medical student research assistants evaluating consenting ED patients older than 65 years for cognitive dysfunction, fall risk, or functional decline. The primary objective of this anonymous survey was to evaluate ED nurse and physician perceptions about the geriatric screener feasibility and barriers to implementation. In addition, as a secondary objective, respondents reported ongoing geriatric screening efforts independent of the research screeners.
The survey was completed by 72% of physicians and 33% of nurses. Most nurses and physicians identified geriatric technicians as beneficial to patients without impeding ED throughput. Fewer than 25% of physicians routinely screen for any geriatric syndromes. Nurses evaluated for fall risk significantly more often than physicians, but no other significant differences were noted in ongoing screening efforts.
Dedicated geriatric technicians are perceived by nurses and physicians as beneficial to patients with the potential to improve patient safety and clinical outcomes. Most nurses and physicians are not currently screening for any geriatric syndromes.
The application of pharmacogenetic results requires demonstrable correlations between a test result and an indicated specific course of action. We developed a computational decision-support tool that combines patient-specific genotype and phenotype information to provide strategic dosage guidance. This tool, through estimating quantitative and temporal parameters associated with the metabolism- and concentration-dependent response to warfarin, provides the necessary patient-specific context for interpreting international normalized ratio (INR) measurements.
We analyzed clinical information, plasma S-warfarin concentration, and CYP2C9 (cytochrome P450, family 2, subfamily C, polypeptide 9) and VKORC1 (vitamin K epoxide reductase complex, subunit 1) genotypes for 137 patients with stable INRs. Plasma S-warfarin concentrations were evaluated by VKORC1 genotype (−1639G>A). The steady-state plasma S-warfarin concentration was calculated with CYP2C9 genotype–based clearance rates and compared with actual measurements.
The plasma S-warfarin concentration required to yield the target INR response is significantly (P < 0.05) associated with VKORC1 −1639G>A genotype (GG, 0.68 mg/L; AG, 0.48 mg/L; AA, 0.27 mg/L). Modeling of the plasma S-warfarin concentration according to CYP2C9 genotype predicted 58% of the variation in measured S-warfarin concentration: Measured [S-warfarin] = 0.67(Estimated [S-warfarin]) + 0.16 mg/L.
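Under the one-compartment model implied above, the steady-state S-warfarin concentration is simply the dosing rate divided by the genotype-specific clearance; combined with the calibration regression reported in this abstract, it can be sketched as (Python; the example dose and clearance values are hypothetical, since the abstract does not list the genotype-specific clearance rates):

```python
def steady_state_s_warfarin(s_warfarin_dose_mg_per_day, clearance_L_per_day):
    """Css (mg/L) = dosing rate / clearance: the standard one-compartment
    steady-state relation. Clearance would be selected by CYP2C9 genotype
    (values here are hypothetical placeholders)."""
    return s_warfarin_dose_mg_per_day / clearance_L_per_day

def calibrate(estimated_mg_per_L):
    """Regression reported in the abstract:
    Measured [S-warfarin] = 0.67 * Estimated [S-warfarin] + 0.16 mg/L."""
    return 0.67 * estimated_mg_per_L + 0.16
```

For instance, a hypothetical 3.5 mg/day S-warfarin dose cleared at 7 L/day yields an estimated Css of 0.5 mg/L, which the calibration maps to a predicted measured concentration of about 0.5 mg/L.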
The target interval of plasma S-warfarin concentration required to yield a therapeutic INR can be predicted from the VKORC1 genotype (pharmacodynamics), and the progressive changes in S-warfarin concentration after repeated daily dosing can be predicted from the CYP2C9 genotype (pharmacokinetics). Combining the application of multivariate equations for estimating the maintenance dose with genotype-guided pharmacokinetics/pharmacodynamics modeling provides a powerful tool for maximizing the value of CYP2C9 and VKORC1 test results for ongoing application to patient care.
The aim of this study was to compare the accuracy of genetic tables and formal pharmacogenetic algorithms for warfarin dosing.
Pharmacogenetic algorithms based on regression equations can predict warfarin dose, but they require detailed mathematical calculations. A simpler alternative, recently added to the warfarin label by the U.S. Food and Drug Administration, is to use genotype-stratified tables to estimate warfarin dose. This table may increase the use of pharmacogenetic warfarin dosing in clinical practice; however, its accuracy has not been quantified.
A retrospective cohort study of 1,378 patients from 3 anticoagulation centers was conducted. Inclusion criteria were stable therapeutic warfarin dose and complete genetic and clinical data. Five dose prediction methods were compared: 2 methods using only clinical information (empiric 5 mg/day dosing and a formal clinical algorithm), 2 genetic tables (the new warfarin label table and a table based on mean dose stratified by genotype), and 1 formal pharmacogenetic algorithm, using both clinical and genetic information. For each method, the proportion of patients whose predicted doses were within 20% of their actual therapeutic doses was determined. Dosing methods were compared using McNemar’s chi-square test.
Warfarin dose prediction was significantly more accurate (all p < 0.001) with the pharmacogenetic algorithm (52%) than with all other methods: empiric dosing (37%; odds ratio [OR]: 2.2), clinical algorithm (39%; OR: 2.2), warfarin label (43%; OR: 1.8), and genotype mean dose table (44%; OR: 1.9).
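McNemar's test, used above to compare paired accuracy outcomes (each patient is classified as within or outside 20% of the true dose by both methods), can be computed directly from the discordant-pair counts. The counts below are purely hypothetical for illustration; the abstract reports only proportions and odds ratios, not the underlying 2x2 tables.

```python
import math

def mcnemar_chi2(b: int, c: int) -> tuple[float, float]:
    """McNemar's chi-square test for paired binary outcomes.
    b = pairs where only method 1 predicted within 20% of the true dose,
    c = pairs where only method 2 did; concordant pairs drop out.
    Returns (chi2, p) using the chi-square distribution with 1 df."""
    chi2 = (b - c) ** 2 / (b + c)
    # Survival function of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2.0))
    return chi2, p

# Hypothetical discordant-pair counts (illustration only):
chi2, p = mcnemar_chi2(b=260, c=120)
print(round(chi2, 2), p < 0.001)
```

Only the discordant pairs carry information about which method is more accurate, which is why the test ignores patients classified identically by both methods.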
Although genetic tables predicted warfarin dose better than empiric dosing, formal pharmacogenetic algorithms were the most accurate.
coumarins; dose prediction; dosing algorithms; FDA label; genetic tables; pharmacogenetics; warfarin
There is currently much interest in pharmacogenetics: determining the variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates personalized drug therapies that use a patient's genetic makeup to select a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of intervention requires special considerations: the distribution of relevant allelic variants in the study population, and whether the pharmacogenetic intervention is equally effective across subpopulations defined by those variants.
The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double-blind, randomized clinical trial that will compare two approaches to the initiation of warfarin therapy: genotype-guided dosing, which initiates warfarin therapy using algorithms based on clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, which initiates warfarin therapy using algorithms based on clinical information alone.
We determine an absolute minimum detectable difference of 5.49%, based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus, we calculate a sample size of 1238 to achieve 80% power for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%.
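The dependence of the minimum detectable difference on sample size can be sketched with the standard two-arm normal approximation for a difference in means. The outcome standard deviation used below is a purely illustrative assumption (it is not reported in the abstract); it is chosen only to show how a detectable difference of roughly 5.49 percentage points can arise at 1238 subjects.

```python
import math
from statistics import NormalDist

def detectable_difference(n_per_arm: int, sigma: float,
                          alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable difference in means for a two-arm trial under
    the usual normal approximation:
        delta = (z_{1-alpha/2} + z_{power}) * sigma * sqrt(2 / n_per_arm)
    sigma is the assumed standard deviation of the outcome."""
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * sigma * math.sqrt(2 / n_per_arm)

# 1238 subjects total -> 619 per arm; sigma = 34.5 points is a hypothetical
# outcome SD chosen for illustration.
print(round(detectable_difference(619, sigma=34.5), 2))  # ~5.49
```

Because the detectable difference scales with 1/sqrt(n), quadrupling the sample size halves it; conversely, if the genotype assumptions dilute the true effect below the planned difference, power drops, as the abstract notes.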
In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.
Using a two-period group-randomized study, we tested whether a technology-assisted pharmacist intervention improved physician adherence to coronary heart disease (CHD) secondary prevention medication guidelines. After an observation period, physician practices were randomized to intervention or control arms. In the intervention arm, alerts prompted a pharmacist to communicate with the responsible physician about secondary prevention medications. The intervention significantly improved the proportion of patients discharged on appropriate secondary prevention medications.
To enable early hospital-wide interventions, we constructed a reliable automated model for identifying newly admitted patients with congestive heart failure using electronically captured administrative and clinical data.