Our study found little randomized trial data available to support the hypothesis that pharmacogenetic dosing at the onset of warfarin therapy reduces major bleeding events. An extensive search yielded only three small randomized trials evaluating pharmacogenetic dosing, and among these there was significant variability in design quality, length of follow-up, intervention and outcome measures. No study had adequate power to evaluate differences in major bleeding rates between groups. In the pooled estimates there was a trend towards less bleeding with pharmacogenetic dosing, but this should be interpreted with caution because of the differences in design between studies. Percentage time within the therapeutic range varied significantly across the studies even with a standardized INR range and more uniform follow-up time; this disparity raises concern that the methods of ascertaining this outcome differed between studies. There was some evidence that time to a stable warfarin dose may be decreased with genotype-guided dosing.
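In statistical terms, the pooled bleeding estimate is a fixed-effect, inverse-variance pooling of per-trial risk ratios. The sketch below illustrates the calculation with entirely hypothetical event counts (not the data from the three reviewed trials), and shows how a trend can favor one arm while the confidence interval still crosses 1:

```python
import math

# Hypothetical per-trial counts (events / total) for illustration only;
# these are NOT the actual data from the three reviewed trials.
trials = [
    {"e_pg": 2, "n_pg": 100, "e_ctl": 4, "n_ctl": 100},
    {"e_pg": 1, "n_pg": 90,  "e_ctl": 2, "n_ctl": 95},
    {"e_pg": 3, "n_pg": 110, "e_ctl": 5, "n_ctl": 105},
]

log_rrs, weights = [], []
for t in trials:
    rr = (t["e_pg"] / t["n_pg"]) / (t["e_ctl"] / t["n_ctl"])
    # Delta-method approximation to the variance of log(RR).
    var = (1 / t["e_pg"] - 1 / t["n_pg"]) + (1 / t["e_ctl"] - 1 / t["n_ctl"])
    log_rrs.append(math.log(rr))
    weights.append(1 / var)  # inverse-variance weight

# Fixed-effect pooled estimate on the log scale.
pooled_log_rr = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"Pooled RR: {math.exp(pooled_log_rr):.2f} "
      f"(95% CI {math.exp(pooled_log_rr - 1.96 * se):.2f}"
      f"-{math.exp(pooled_log_rr + 1.96 * se):.2f})")
```

With these illustrative counts the pooled risk ratio is roughly 0.5, but the 95% confidence interval spans 1, i.e., a non-significant trend of the kind that warrants the cautious interpretation given here.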
The study by Anderson et al.57 is the highest quality trial published to date, and the only study that incorporated both VKORC1 and CYP2C9 genotypes. It is notable that there were more variant alleles in the standard dosing arm than in the pharmacogenetic arm of this study.57 Because patients with variant alleles are known to be more likely to have out-of-range INR and bleeding complications, this difference could have biased the results in favor of the pharmacogenetic arm. Indeed, some outcome estimates in this trial favored pharmacogenetic dosing, but none achieved statistical significance.
Only the Caraco study48 showed statistically significant improvement in nearly all surrogate outcomes with pharmacogenetic dosing. However, the lack of true randomization and allocation concealment, the high loss to follow-up, the lack of an intention-to-treat analysis and the different lengths of follow-up between groups challenge the internal validity of these results. Specifically, the outcomes of total number of bleeding events, percentage of time with INR in the therapeutic range, days of supratherapeutic INR and total number of INR draws are invalidated on the basis of detection bias as a result of the nearly two-fold longer follow-up time for the control group. As an example, the total number of INR draws was 36% higher in the control group, but the average interval between consecutive INR draws was the same in both groups.
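The detection-bias argument is arithmetic: when the average interval between INR draws is the same in both groups, any count-based outcome scales directly with follow-up time. A minimal sketch with hypothetical numbers (not the trial's actual data):

```python
# Hypothetical illustration: equal draw intervals, unequal follow-up.
interval_days = 7.0              # assumed average interval, both groups
followup_pg_days = 60.0          # pharmacogenetic arm (hypothetical)
followup_ctl_days = 60.0 * 1.36  # control arm followed 36% longer

draws_pg = followup_pg_days / interval_days
draws_ctl = followup_ctl_days / interval_days

# With identical intervals, the excess in draw counts exactly tracks the
# excess in observation time, not any difference in care intensity.
excess_draws = (draws_ctl - draws_pg) / draws_pg
print(f"Excess INR draws in control arm: {excess_draws:.0%}")
```

Under these assumptions a 36% longer follow-up produces exactly 36% more draws, which is why raw counts cannot be compared across arms with unequal observation time.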
Both the Hillman47 and Anderson57 studies used a multivariable algorithm to select the initial dose for patients in the intervention arm, taking into account not only the contribution of genetic variation, but also other well-established factors known to affect the overall warfarin dose, such as age, sex and weight. In contrast, patients in the control arms of these two studies47,57 all received the same initial dose. Despite this seemingly unfair advantage at the outset, neither study demonstrated statistically significant improvement in outcomes for the pharmacogenetic arm.
Is pharmacogenetic dosing of warfarin safer and more effective than a one-size-fits-all strategy followed by careful INR monitoring? The results of our study demonstrate that we still do not know. An uncontrolled study59 evaluating a CYP2C9 dosing algorithm40 in patients initiating warfarin further highlights this uncertainty. Although the algorithm estimated the maintenance warfarin dose well (R2 0.001), carriers of variant CYP2C9 alleles continued to have a significantly increased risk of INR >4 (HR 4.6, p 0.01) compared to those with the wild-type allele. There is evidence that the greatest risk of warfarin-induced adverse events is at the induction of warfarin therapy,60 and that INR levels prior to day 4 of therapy do not predict dose-response differences. Thus, the traditional "trial and error" method may result in delays in estimating the appropriate dose.41
However, Li and colleagues61 recently found that CYP2C9 genotypes did not add to early INR response as a predictor of warfarin sensitivity. Even if pharmacogenetic dosing does not reduce major bleeding, it may still be useful and cost-effective if it results in a shorter time to stable dose and fewer blood draws to attain a stable INR. It is possible, however, that physicians may become more complacent with pharmacogenetic dosing, resulting in reduced surveillance and a paradoxical increase in bleeding during the initiation of warfarin therapy.
Our study has limitations. First, very little high-quality evidence has been published in this area: we identified only three small randomized trials evaluating pharmacogenetic dosing of warfarin. Second, important differences in the designs, outcome definitions and follow-up intervals used by these three trials reduced the degree to which we could pool their individual findings; we did not perform a meta-analysis of secondary outcomes because of significant heterogeneity among the trials. Third, we did not evaluate genotype-specific outcomes, because these are not relevant when providers are unaware of genotypes in advance. Lastly, we did not include the three prospective cohort studies using genotype-guided algorithms,59,62,63 because others have reviewed this literature,64,65 and we decided a priori that only randomized trials could reliably demonstrate whether pharmacogenetic dosing improves patient outcomes. It may seem premature to perform a systematic review on a topic for which so few randomized controlled trials are available; however, given the FDA relabeling, we feel it is important to evaluate the current evidence.
The package insert of warfarin advises that "lower initiation doses should be considered for patients with certain genetic variations in CYP2C9 enzymes." This FDA labeling change was made on the basis of accumulating data66 demonstrating that allelic variants in CYP2C9 are associated with increased plasma warfarin levels, out-of-range INR and increased bleeding risk.15,19,23,35,67,68 However, as our study demonstrates, there is no evidence that a more accurate initiation dose reduces the risk of bleeding. Results from ongoing clinical trials will help to clarify the role of genetic testing in warfarin management. A target enrollment of at least 2,000 patients has been suggested,57 and currently the cumulative experience of >2,500 patients is anticipated.
Each of the randomized trials reviewed in our study used different pharmacogenetic and control group dosing algorithms. The most comprehensive and widely available pharmacogenomic algorithm, http://www.WarfarinDosing.org, has recently been validated by the International Warfarin Pharmacogenetics Consortium and will be used in the largest randomized trial sponsored by the NHLBI.38 Until recently, however, there was no widely accepted pharmacogenetic algorithm to guide the initiation of warfarin therapy, and new models are still being developed and validated.69
Although warfarin dosing algorithms do not eliminate the need for frequent INR monitoring and dose titration, these algorithms can, even in the absence of genotype information, provide a very good estimate of the patient’s warfarin dose by taking into account readily available information such as age, gender, weight and smoking status.38
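As a concrete illustration of what such a clinical-only (genotype-free) algorithm looks like in code, the sketch below uses a simple linear model over age, sex, weight and smoking status. The baseline dose and every coefficient are hypothetical placeholders chosen only to show the structure; they are not the validated values from WarfarinDosing.org, the IWPC model, or any published algorithm, and this must not be used for dosing.

```python
# Schematic clinical-only warfarin dose model. All coefficients are
# HYPOTHETICAL placeholders illustrating the structure of such
# algorithms; they are not validated values. Not for clinical use.
def estimated_weekly_dose_mg(age_years: float, weight_kg: float,
                             is_male: bool, smokes: bool) -> float:
    dose = 35.0                       # hypothetical baseline weekly dose (mg)
    dose -= 0.2 * (age_years - 50)    # requirement tends to fall with age
    dose += 0.1 * (weight_kg - 70)    # and to rise with body weight
    if is_male:
        dose += 1.0                   # small sex effect (hypothetical)
    if smokes:
        dose += 2.0                   # smoking induces warfarin metabolism
    return max(dose, 7.0)             # hypothetical floor of 1 mg/day

# Example: a 65-year-old, 80 kg, male non-smoker.
print(f"{estimated_weekly_dose_mg(65, 80, True, False):.1f} mg/week")
```

The point of the example is structural: once such a linear model is fitted and validated, adding genotype is just additional terms, which is why the trials reviewed here differ mainly in which covariates each arm's algorithm included.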
Whether these algorithms improve outcomes compared to other warfarin initiation strategies is not known.
The products of genetic discovery are becoming increasingly relevant to the practice of clinical medicine, particularly in the realm of pharmacogenetics. Genotype-guided warfarin prescribing is currently the focus of much attention and is positioned to set a precedent for how the integration of genetic technologies into clinical practice will proceed. In the case of warfarin, it seems intuitive that adjusting the warfarin dose to match a patient's genetic makeup will result in fewer complications; however, our review, along with at least one unfavorable cost-effectiveness analysis,44 demonstrates that additional clinical trial data are needed before endorsing a new standard of care for warfarin dosing.