Recent research on warfarin dose prediction has focused on the use of formal pharmacogenetic algorithms based on complex regression equations. However, in January 2010, the Food and Drug Administration updated the warfarin label to include pharmacogenetic dosing information in the form of a more straightforward table of estimated warfarin doses, stratified by genotype. This change may increase the use of pharmacogenetic dose prediction by clinicians prescribing warfarin; thus, it is critical to determine the accuracy of this method. In this analysis, we compared the accuracy of these various dosing methods on the basis of whether the predicted dose fell within a clinically meaningful range, defined as within 20% of the stable therapeutic dose, as done previously (4).
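The 20%-of-stable-dose accuracy criterion described above can be expressed as a simple check. This sketch is illustrative only; the function name and structure are our own, not code from the study:

```python
def dose_within_range(predicted_mg, therapeutic_mg, tolerance=0.20):
    """Return True if the predicted dose falls within +/- tolerance
    (default 20%) of the patient's stable therapeutic dose."""
    lower = therapeutic_mg * (1 - tolerance)
    upper = therapeutic_mg * (1 + tolerance)
    return lower <= predicted_mg <= upper

# Example: for a 5 mg/day stable dose, predictions between
# 4.0 and 6.0 mg/day would count as accurate.
```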
In our study, dosing based on genetic tables was somewhat more accurate than dosing based on nongenetic methods. Using the warfarin label or the genotype mean tables, accurate doses could be predicted 43% or 44% of the time, respectively, compared with 39% using a clinical algorithm and 37% using an empiric 5 mg/day dose. However, the pharmacogenetic algorithm was superior to all other methods, including these tables. Using a pharmacogenetic algorithm, accurate dose prediction was achieved 52% of the time. Furthermore, the pharmacogenetic algorithm had lower rates of both dose overestimation and underestimation than all other dosing methods () as well as better dose prediction in those requiring high and low warfarin doses (Online Fig. 1).
Both genetic tables were found to be an improvement over empiric dosing and a clinical algorithm, affirming the importance of genetics in determining warfarin dose. Because both genetic tables performed similarly, it seems likely that their suboptimal performance relative to the pharmacogenetic algorithm reflects inherent limitations of using tables rather than inaccurate estimates. Formal algorithms would be expected to quantify the effects of multiple genetic factors on dose more precisely than the average effects used in dosing tables. Moreover, age, BSA, target INR, drug interactions, and other clinical factors have all previously been shown to affect warfarin dose (11). Thus, it is unsurprising that combining clinical factors with genotype in a formal pharmacogenetic algorithm seemed most beneficial for improving the accuracy of dose prediction.
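As a rough illustration of why a formal algorithm can outperform a table, a regression-style model weights each clinical and genetic factor separately rather than applying a single genotype-averaged dose. The sketch below is purely schematic: every coefficient is a placeholder, and the function does not reproduce any published or validated dosing algorithm.

```python
import math

def predicted_weekly_dose(age_decades, bsa, target_inr,
                          cyp2c9_variant_alleles, vkorc1_variant_alleles,
                          interacting_drug=False):
    """Schematic regression-style pharmacogenetic model:
    log(dose) = intercept + sum(coefficient * factor).
    All coefficients are hypothetical placeholders for illustration."""
    log_dose = 2.0                                # intercept (placeholder)
    log_dose -= 0.09 * age_decades                # older patients need less
    log_dose += 0.25 * bsa                        # larger BSA needs more
    log_dose += 0.10 * target_inr                 # higher target INR, higher dose
    log_dose -= 0.30 * cyp2c9_variant_alleles     # reduced warfarin metabolism
    log_dose -= 0.40 * vkorc1_variant_alleles     # increased warfarin sensitivity
    if interacting_drug:
        log_dose -= 0.25                          # interacting drug lowers dose
    return math.exp(log_dose)                     # predicted mg/week
```

The point of the structure, not the numbers: each factor contributes its own adjustment, so two patients with the same genotype but different age, body size, or drug interactions receive different predictions, which a genotype-only table cannot provide.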
The warfarin label does in fact state that clinical factors should be accounted for; however, it does not give guidelines on how both clinical and genetic factors should be combined to determine an appropriate dose. As a result, there was no way to formally test the impact of clinical factors on the use of the genetic table in the warfarin label. Analyses that approximated how clinicians may account for clinical factors led to minimal improvement in the accuracy of the warfarin label; however, the formal pharmacogenetic algorithm remained superior.
One possible reason that adjusting for clinical factors did not yield more improvement in dosing accuracy is that the magnitude of the effect on warfarin dose varies from factor to factor, and that level of complexity is difficult to capture in a simple table. Another limitation of the warfarin label is that it gives a wide relative dosing range for those most sensitive to the drug. Thus, any simplified table will most likely be unable to capture the full complexity of personalized warfarin dosing, because it cannot easily and accurately quantify the effects of both genetic and clinical factors.
Nonetheless, even with the pharmacogenetic algorithm, 48% of patients were ultimately not dosed accurately. There are several reasons why such a substantial proportion of patients may not be dosed correctly. First, clinical factors other than those included in our study (and, to our knowledge, not included in well-validated dosing algorithms) can affect warfarin dose (16). Second, genes other than CYP2C9 may affect warfarin dose, such as CYP4F2 and ApoE, although their effects on warfarin dose require confirmation (26). These and other genes could be incorporated into future pharmacogenetic algorithms or tables to improve their accuracy; however, studies are needed to assess their clinical utility.
Finally, as expected on the basis of previous results (9), both the pharmacogenetic algorithm and the genetic tables were less accurate in African Americans. In fact, the clinical algorithm seemed to perform just as well as these methods in African Americans, although our analysis cohort did not have sufficient power to formally compare dosing methods within this group. To show their greatest benefit, pharmacogenetic algorithms will likely have to incorporate genetic information from studies done in African Americans. In short, it is clear that despite the extensive research on warfarin dosing variability, a substantial percentage of that variability remains to be explained (9).
Perhaps most important, although we know that pharmacogenetic algorithms lead to more accurate warfarin dose prediction, it remains unknown whether using a formal pharmacogenetic algorithm will actually improve laboratory or clinical outcomes. Although some observational research suggests that this may be the case (35), the answer to this question awaits the results of large randomized clinical trials that are currently under way (36).
The average cost of warfarin-related genetic testing is currently almost $400 per patient (39). This cost could fall in the near future, making genotyping more feasible for routine clinical practice. However, given the substantial number of patients started on warfarin each year, the added cost of genotyping over the current standard of care will still be significant. Thus, it is critical to use the information gained from genetic testing in the most effective way possible.
Our results suggest that the genetic table provided in the updated warfarin label is an improvement over empiric dosing and thus may be appropriate under certain circumstances. However, tables are less accurate than a formal pharmacogenetic algorithm for warfarin dose prediction. As a result, it seems advisable for clinicians to use a formal pharmacogenetic algorithm instead of a genetic dosing table when feasible. Should dosing algorithms prove effective at improving clinical outcomes in ongoing randomized trials, methods to improve access to these algorithms could include publication in the warfarin label, web-based access, and incorporation into handheld devices.