Warfarin therapy has been used clinically for over 60 years, yet continues to be problematic because of its narrow therapeutic index and large inter-individual variability in patient response. As a result, warfarin is a leading cause of serious medication-related adverse events, and its efficacy is also suboptimal.
This review examines the factors that are responsible for variable response to warfarin, including clinical, environmental, and genetic factors, and explores some possible approaches to improving warfarin therapy.
Recent efforts have focused on developing dosing algorithms that included genetic information to try to improve warfarin dosing. These dosing algorithms hold promise, but have not been fully validated or tested in rigorous clinical trials. Perhaps equally importantly, adherence to warfarin is a major problem that should be addressed with innovative and cost-effective interventions.
Additional research is needed to further test whether interventions can be used to improve warfarin dosing and outcomes.
Preventing thromboembolism is of major public health importance. Medical conditions known to increase the risk of thromboembolism affect a large proportion of the population and include atrial fibrillation, mechanical heart valves, deep venous thrombosis, and dilated cardiomyopathies. For example, atrial fibrillation (AF) alone affects over 2 million people in the US, and, with the ageing of the population, more than 5.6 million Americans are projected to have AF by 2050.
Given the large number of patients with conditions that increase the risk of thromboembolism, it is not surprising that thromboembolism causes substantial morbidity and mortality in the population. It has been estimated that pulmonary embolism results in 250,000 hospitalizations each year and, because many patients die before being hospitalized, as many as 200,000 deaths. Approximately 200,000 strokes each year may be attributable to thromboembolism, resulting in an estimated 12,000 deaths within just several months of the event and over 58,000 disabled survivors [5,6]. Emboli to the kidneys are estimated to cause 5 – 10% of all cases of acute renal failure, and emboli to the limbs result in a 19% mortality rate.
At present, only aspirin and warfarin are available in oral form for long-term prevention of thromboembolism. However, for the majority of patients, only warfarin is recommended for prevention of thromboembolism. Newer drugs, such as direct thrombin inhibitors and oral factor Xa inhibitors, hold promise but are not yet available or approved to prevent thromboembolism. Warfarin sodium is a remarkably efficacious drug for the prevention of thromboembolism. Because of the efficacy of warfarin and the number of patients at risk for thromboembolism, it is not surprising that warfarin is the tenth most commonly prescribed medication in the US, with an estimated 4 million patients in the US alone having indications for warfarin therapy. This will only increase as the prevalence of indications for the drug increases with the ageing of the population.
Despite over 60 years of clinical use, warfarin remains a very challenging drug to use in practice. This is because of its narrow therapeutic index and the large inter-individual variability in patient response to the drug.
With respect to therapeutic index, although warfarin is highly efficacious when titrated to the appropriate level of anticoagulation, it loses its effectiveness when anticoagulation levels are too low. When anticoagulation levels are too high, there is a significant risk of major bleeding complications.
Bleeding is the most common side effect of warfarin, occurring in up to 41% of treated patients, with rates of major bleeding in practice of about 7 – 8% per year [14,15]. Although major bleeding is of substantial concern, even minor bleeding can lead to withdrawal of therapy, thus depriving patients of the most effective, and often only, therapy available to prevent thromboembolism. Minor bleeding also leads to repeat office visits and sometimes emergency room visits. The most consistent and strongest predictors of bleeding complications are the International Normalized Ratio (INR) itself and the variability in anticoagulation control [16,17]. The risk of over-anticoagulation is highest during the dose-titration period of warfarin use. Not surprisingly, then, the risk of bleeding may be highest during the initiation phase of warfarin, before a stable dose of the drug is established. Thus, better dosing of warfarin could reduce the risk both of major bleeding, with its attendant morbidity and mortality, and of discontinuation of a highly efficacious therapy due to even minor bleeding.
The effectiveness of warfarin is substantially reduced by insufficient levels of anticoagulation. The risk of thromboembolism increases dramatically as the INR falls below 2, even at levels of 1.9. In addition, among patients on warfarin who suffer a stroke, those who are under-anticoagulated at the time of the event have significantly higher morbidity and mortality compared with those with proper anticoagulation control.
Along with bleeding and thromboembolism, under- and over-anticoagulation have other medical and economic consequences. Patients who have out-of-range INRs must be carefully reassessed within a short period and often require dosage changes. This generates additional clinic visits, blood tests, and potential for miscalculations of dosage requirements. The costs associated with improper warfarin dosing include greater warfarin-related visits to emergency rooms and hospitalizations and more frequent follow-up. Patients whose INRs fluctuate excessively may have their warfarin discontinued, depriving them of the substantial benefit of the drug. Improper anticoagulation that leads to bleeding events, even minor ones, can also result in diminished quality of life and, as previously noted, permanent discontinuation of warfarin.
Importantly, better control of warfarin levels has been associated with significant reductions in bleeding, thromboembolism, and costs. For example, patients treated in specialized anticoagulation clinics have been shown to have lower rates of significant bleeding and thromboembolic events than warfarin patients receiving usual care [12,24]. Similarly, trials of patient self-management of INR have shown reductions in the risk of thromboembolic events by 55%, major bleeds by 45%, and mortality by 39% [25-27]. However, self-management is estimated to be used by < 1% of patients treated in US anticoagulation clinics.
Along with its narrow therapeutic index, there is large interpatient variability in response to warfarin. Warfarin dose requirements vary widely among patients. Although the average maintenance dose is 4 – 6 mg per day, there is a very wide range of doses (such as 4.5 – 77 mg per week) required to achieve the same INR among different patients.
Despite this fact, dosing remains largely empirical. Most patients are begun empirically on a fixed dose each day (such as 5 mg/day) during the ‘initiation phase’ of warfarin on the basis of population averages, with the dose then titrated based on response, as measured by the INR. Because warfarin dosing is based on altering the dose in response to subtherapeutic and supratherapeutic levels of anticoagulation, patients are put at risk simply because the correct dose is not known in the majority of patients. Importantly, reductions in the time that a patient is under- or over-anticoagulated have been associated with reductions in bleeding, thromboembolism, and costs. These benefits are particularly relevant during the initiation phase of warfarin, when the proper dose is being determined. Because the initiation phase is a period when anticoagulation control is particularly vulnerable to dosing errors and when significant bleeding and thromboembolism can occur, efforts to improve our ability to predict warfarin maintenance dose are clearly needed to enhance the safety and efficacy of the drug and to reduce the associated costs and early discontinuations.
Given our extensive knowledge of the factors discussed above and the long history of warfarin therapy, why is it still so hard to manage patients on the drug? The two main reasons are that the response to warfarin is multifactorial and multigenetic, and that patients have difficulty adhering to therapy.
Many patient and environmental factors influence warfarin response. Patient factors that have been shown to alter warfarin dose requirements include age, sex, heart failure, coronary artery disease, body weight/body mass index (BMI)/body surface area, diabetes mellitus, and indication for warfarin [29-31]. Other factors that can affect warfarin response include thyroid disease, advanced liver disease, and malignancy. Environmental factors include interacting medications, alcohol, and diet [32-34]. Medications can affect the pharmacokinetics of warfarin by reducing its absorption from the intestine, by altering its clearance, or by competitive protein binding. Drugs can also influence the pharmacodynamics of warfarin by mechanisms such as inhibition of the synthesis of vitamin K-dependent coagulation factors or increased clearance of these factors. Warfarin metabolism also may be affected by alcohol intake, and the pharmacodynamic properties of warfarin may be altered by dietary vitamin K intake. In addition to factors that alter warfarin’s effect on the INR, the use of antiplatelet drugs, such as aspirin and clopidogrel, can further increase the risk of warfarin-related bleeding, independent of INR.
Despite our in-depth understanding of the patient and environmental factors that influence warfarin dosing, a large proportion of interpatient variability in warfarin response remains unexplained. This variability is consistent with multigenetic effects on drug response. Thus, recent research has focused on the effects of genetics on warfarin response. Two genes in particular, the cytochrome P450 family 2 subfamily C polypeptide 9 enzyme (CYP2C9) gene and the vitamin K epoxide reductase complex 1 (VKORC1) gene, are the best studied. Warfarin consists of a racemic mixture of (R)- and (S)-warfarin, and these two forms are metabolized to inactive metabolites by different cytochrome P450 enzymes. CYP2C9 is largely responsible for the metabolic clearance of (S)-warfarin, the more potent of the two enantiomers, which accounts for 60 – 70% of warfarin’s overall anticoagulant activity. Two variants within CYP2C9, designated *2 and *3, are clearly associated with lower warfarin dose requirements in Caucasian populations and increased bleeding risk [38-45]. However, their effect is less clear in other populations, particularly African Americans [44,46]. VKORC1 is the warfarin-sensitive and rate-limiting enzyme of the vitamin K cycle that recycles the epoxide and quinone forms of vitamin K to the reduced, non-oxidized form. Several variants within the VKORC1 gene have been associated with altered warfarin dose requirements, and haplotypes have been described that are associated with relatively low hepatic VKORC1 mRNA expression and with lower warfarin dose requirements [43,47-49]. A single-nucleotide polymorphism (SNP), 1173C/T, was as informative as VKORC1 haplotypes for predicting warfarin dose in a Caucasian population and has been shown to similarly predict dose in African Americans.
Numerous other genes have also been postulated to alter warfarin response but are less well studied. These include genes within three pathways: pharmacokinetic, pharmacodynamic, and the coagulation pathway. Genes that can influence the pharmacokinetic response to warfarin include other cytochrome P450 genes [50,51]; albeit likely to play only a minor role, combinations of variants in these genes could still be useful for dose prediction. In addition, polymorphisms in genes encoding the plasma-binding proteins albumin (ALB) and α1-acid glycoproteins (ORM1 and ORM2) could alter warfarin response [52-54]. Along with VKORC1, other genes in the pharmacodynamic pathway could affect warfarin response, including microsomal epoxide hydrolase (EPHX1), which is needed for the expression of VKORC1 enzymatic activity, and γ-glutamyl carboxylase (GGCX). The carboxylation of the coagulation factors and proteins is regulated by calumenin (CALU), which might also alter warfarin response. Another recently studied gene, encoding apolipoprotein E (APOE), which facilitates cellular uptake of chylomicrons (the main vehicle of vitamin K transport to the liver), could be important in altering warfarin response. However, studies of APOE have been contradictory as to whether there is an effect and, if so, in which direction APOE variants alter warfarin dose requirements [59-62]. Coagulation pathway genes could also alter warfarin dose requirements. These include genes encoding factors II, VII, IX, and X, and genes that can accelerate the inactivation of factors Va and VIIIa: endothelial protein C receptor (PROCR) and protein S (PROS1) [42,63-67].
Adherence to warfarin also clearly affects the degree of anticoagulation control and the ability to maintain patients on warfarin. For example, in one study that used Medication Event Monitoring System (MEMS) caps to record pill-taking, 40% of warfarin-treated patients took their warfarin incorrectly ≥ 20% of the time. Even modest levels of poor adherence to warfarin contributed to significantly poor anticoagulation control in this study. For every 10% increase in missed pills, there was a 14% increase in the odds of under-anticoagulation (p < 0.001), and participants with > 20% missed doses had a greater than twofold increase in the risk of under-anticoagulation. Participants who had extra pill bottle openings on > 10% of days exhibited a statistically significant, almost twofold, increased risk of over-anticoagulation.
Numerous methods have been devised or proposed to improve the treatment of warfarin patients. These methods include specialized anticoagulation clinics, home INR self-monitoring, adherence interventions, and dosing algorithms. Different interventions might be expected to be applicable at different times during the course of therapy. For example, dosing algorithms would be expected to be most useful during the initiation phase of therapy, when a steady-state dose is being determined. It is less likely that dosing algorithms will be important once steady state is reached; at that point, other factors related to patient adherence and to healthcare structure and practice patterns are likely to be more useful. Adherence interventions might be important at all phases of therapy, but may be most beneficial and logistically applicable during the maintenance phase, when patients are being maintained on steady-state dosing and being seen by practitioners less frequently.
Both anticoagulation clinics and home INR monitoring have been shown to improve anticoagulation control and outcomes. Patients whose warfarin administration is monitored at anticoagulation clinics have lower rates of adverse events and a higher proportion of time with anticoagulation in range [12,69]. Home INR monitoring can also be beneficial, but requires self-administered fingersticks and careful education and self-management of dosing. Few patients currently use home monitoring.
Adherence interventions have not been well studied. However, given the strong relationship between adherence and anticoagulation control and the difficulties maintaining adherence with warfarin, adherence interventions could be of great benefit. Interventions could target, for example, patient knowledge [72,73] and cognition, or adherence incentives.
The area that has received the most attention as a means to improve warfarin therapy is dosing algorithms that include multiple factors that might predict warfarin dose requirements; this will be discussed separately below.
Because of the multifactorial nature of warfarin response, the concept of dosing algorithms that use clinical variables to improve anticoagulation management, reduce complications, and enhance efficacy has existed for decades [75-77]. Unfortunately, dosing algorithms have, to date, had limited success. Earlier efforts to develop warfarin dosing algorithms have used serial INR measurements in the first days of warfarin use to predict subsequent doses [78,79] and/or limited clinical data (such as sex only). Such approaches have not been well validated and are not widely used [40,78,79,81]. One reason for this is that these algorithms do not incorporate other patient, environmental, or genetic factors that alter warfarin dose requirements.
Before discussing some of the current dosing algorithms, it is useful to clarify the nature and characteristics of dosing algorithms. Developing and testing an algorithm that predicts warfarin dosing requires a different approach from that of association studies. A dosing algorithm must follow the general approach of prediction model development and testing. The goal for warfarin is to best predict the maintenance dose by including those variables that add predictive ability, even if the variables are not statistically significant. For example, a variable with a p-value of 0.09 may not be considered conventionally ‘associated’ with dose in an association study, but may be useful in a dosing algorithm if it is common in the population and adds predictive ability to the other variables in the prediction model. In contrast, association studies try to identify individual variables that are statistically significantly associated with dose, independent of the confounding effects of other variables.
The evaluation of a dosing algorithm also differs from that of an association study. When deriving a dosing algorithm, one is most concerned with the overall prediction of dose (rather than the individual effects of the variables in the model). Most studies to date have relied primarily on the R-squared (R2) statistic to develop and assess warfarin dosing algorithms. The R2 statistic measures the variability in warfarin dosing that is explained by the prediction model, and is clearly a useful component of model development and assessment. However, there are other characteristics of a dosing algorithm that have been and should continue to be considered. These include how close the predicted dose is to the actual required dose, which may not be adequately captured by the R2 statistic. The percentage of predicted doses that are within a certain range of actual doses may be a useful measure of a dosing algorithm’s performance. A predicted dose within 1 mg of the actual dose is a reasonable measure of predictive ability [31,79,84]; a roughly 1 mg/day change in warfarin dose from a baseline of 5 mg is sufficient to change the INR by 0.5, a clinically meaningful difference when trying to maintain a patient within the typical 1-point INR range of 2 – 3. Other measures include the mean difference between predicted and actual dose, and how often the dosing algorithm over-predicts (which is more important from a patient safety viewpoint) and under-predicts (which is more important from a healthcare utilization viewpoint, particularly because many patients will remain on inpatient intravenous or inpatient/outpatient subcutaneous anticoagulation until a therapeutic INR is reached).
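The evaluation measures described above (R2, percentage of predictions within 1 mg/day of the actual dose, mean dose difference, and over-/under-prediction rates) can each be computed with a few lines of code. The sketch below is a minimal illustration only; the predicted and actual doses are invented for demonstration and are not taken from any study.

```python
# Illustrative evaluation of a warfarin dosing algorithm's predictions.
# Doses are hypothetical maintenance doses in mg/day.

def evaluate_algorithm(predicted, actual):
    """Summarize how well predicted doses match actual required doses."""
    n = len(actual)
    mean_actual = sum(actual) / n
    ss_tot = sum((a - mean_actual) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for p, a in zip(predicted, actual))
    r_squared = 1 - ss_res / ss_tot  # fraction of dose variability explained

    # Percentage of predictions within 1 mg/day of the actual dose
    within_1mg = sum(abs(p - a) <= 1.0 for p, a in zip(predicted, actual)) / n
    # Mean difference between predicted and actual dose
    mean_diff = sum(p - a for p, a in zip(predicted, actual)) / n
    # Over-prediction (safety concern) and under-prediction (utilization concern)
    over = sum(p - a > 1.0 for p, a in zip(predicted, actual)) / n
    under = sum(a - p > 1.0 for p, a in zip(predicted, actual)) / n

    return {"R2": r_squared, "pct_within_1mg": within_1mg,
            "mean_difference": mean_diff,
            "pct_overpredicted": over, "pct_underpredicted": under}

# Hypothetical predicted vs. actual maintenance doses for ten patients
predicted = [5.2, 3.1, 6.8, 4.0, 7.5, 2.5, 5.0, 4.4, 9.0, 3.8]
actual    = [5.0, 3.5, 5.5, 4.2, 7.0, 3.0, 6.3, 4.5, 7.2, 4.0]

print(evaluate_algorithm(predicted, actual))
```

Even in this toy data, the complementarity of the measures is visible: a respectable R2 can coexist with a substantial fraction of patients whose predicted dose misses the actual dose by more than 1 mg/day.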
Dosing algorithms require development in a derivation data set and then independent testing in a separate data set. An algorithm may not validate well in an independent population for several reasons. First, the derived algorithm may reflect associations between given predictors and warfarin dose that arise purely by chance. Second, predictors may be important in one population but not in another because of differences in patient characteristics (such as racial distribution), clinical practices (such as differing use of interacting medications), or other factors such as the health system (such as the availability of specialized anticoagulation clinics). Third, a dosing algorithm may not be properly implemented in another setting. For example, data may not be collected properly, or the timing of application of the dosing algorithm may differ (for example, the algorithm is developed to be used prior to the first dose but is applied after patients have already received several doses).
In addition, a dosing algorithm needs to be accepted and clinically applicable [82,86]. Ease of use must be balanced with accuracy. For example, a dosing algorithm could provide categories of dosing (such as ‘lower than usual,’ ‘usual,’ or ‘higher than usual dose’) and thereby be relatively easy to apply. However, such an approach is likely to be suboptimal because it discards important information that a multivariable model can generate and still relies on averages within a category, thus not completely accounting for inter-individual differences in dosing that may be very important. It is more useful to develop an algorithm that provides the actual predicted warfarin dose for an individual patient. This calculation of predicted dose from a regression equation requires only simple math, but does require a calculator to ensure accuracy. Such a calculator could be web-based or put on a handheld device or desktop computer. The user would input the data into the algorithm, and the computer would produce the recommended starting dose for the individual patient. Although a bit more complicated, this approach is likely to be more acceptable clinically, assuming its accuracy is indeed better than categorization of dose.
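As an illustration of how such a calculator might work, the sketch below evaluates a hypothetical linear prediction equation of the kind described above. The variable names and every coefficient are invented for demonstration only; they do not come from any published or validated dosing algorithm.

```python
# Hypothetical warfarin dose calculator: evaluates a simple linear
# regression equation to produce an individualized predicted dose.
# ALL coefficient values are illustrative placeholders, NOT a validated algorithm.

ILLUSTRATIVE_COEFFICIENTS = {
    "intercept": 8.0,          # baseline dose, mg/day (placeholder)
    "age_per_decade": -0.5,    # older patients tend to need lower doses
    "bsa_per_m2": 1.0,         # larger body surface area -> higher dose
    "cyp2c9_variant": -1.5,    # per variant allele (reduced S-warfarin clearance)
    "vkorc1_variant": -1.0,    # per low-expression haplotype allele
    "amiodarone": -1.5,        # example interacting medication
}

def predicted_dose(age_years, bsa_m2, cyp2c9_alleles, vkorc1_alleles, on_amiodarone):
    """Return a predicted maintenance dose in mg/day, floored at 0.5 mg/day."""
    c = ILLUSTRATIVE_COEFFICIENTS
    dose = (c["intercept"]
            + c["age_per_decade"] * (age_years / 10)
            + c["bsa_per_m2"] * bsa_m2
            + c["cyp2c9_variant"] * cyp2c9_alleles
            + c["vkorc1_variant"] * vkorc1_alleles
            + (c["amiodarone"] if on_amiodarone else 0.0))
    return max(dose, 0.5)

# Example: 70-year-old patient, BSA 1.9 m^2, one CYP2C9 variant allele,
# one VKORC1 low-expression allele, not taking amiodarone.
print(predicted_dose(70, 1.9, 1, 1, False))
```

The arithmetic is trivial for a computer, which is precisely the point: embedding the equation in a web page or handheld device removes the opportunity for manual calculation error while preserving the full, continuous dose prediction that a categorical scheme would discard.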
Finally, after development and validation in observational studies, dosing algorithms should be formally tested in randomized trials to determine their utility. Although a dosing algorithm may better predict warfarin dose requirements in observational studies, this does not mean that it will improve outcomes. Practical issues may limit success; for example, applying a dosing algorithm that requires rapid-turnaround genotyping and careful incorporation of clinical information presents pragmatic challenges. Also, just because a dosing algorithm predicts the ultimate maintenance dose of warfarin does not necessarily mean that starting at that maintenance dose early in therapy will improve the degree of anticoagulation control or reduce the risk of complications of therapy. Further, cost-effectiveness may be an important consideration that is best quantified in a randomized trial. Clinical trials could compare a dosing algorithm that combines clinical and genetic information with usual care and/or with dosing algorithms that use only clinical data (to test whether genetic information, which requires more resources and logistical support, really does improve outcomes).
Several recent studies have developed and tested dosing algorithms that incorporate genotype as well as clinical characteristics (Table 1) [31,40,84,88-93]. Most algorithms share the inclusion of age, measures of body size (such as body surface area or weight), and CYP2C9 variants. More recent algorithms incorporate VKORC1. Some models include additional variables such as diabetes, sex, interacting medications, target INR, and valve replacement. Models that incorporate CYP2C9 all have R2 values of 34 – 37%; inclusion of VKORC1 increases the R2 to 54 – 60%. However, model prediction of dose may still be suboptimal. For example, in one study using CYP2C9, the model performed better than empiric 5 mg daily dosing, but half of all patients had a predicted dose more than 1 mg/day different from the actual dose. This prediction index was tested prospectively in 48 orthopedic patients with an R2 of 42%. Despite being statistically better than chance, the prediction rule again often predicted a dose that was clinically meaningfully different from the actual dose, in about 48% of patients. Another retrospective study that targeted patients with low warfarin dose requirements derived a model with age, height, and variants in the CYP2C9 and VKORC1 genes; the model had an R2 of 54%, but when tested in a different sample of 38 patients, it often miscalculated the actual warfarin dose. The vast majority of patients in all these studies were Caucasian.
Along with CYP2C9 and VKORC1, other genes may be important in predicting warfarin dose. One recent study in a Caucasian Swedish population identified additional genetic factors that may be associated with warfarin dose. However, these genetic factors have not, to my knowledge, been formally incorporated into a dosing algorithm.
Warfarin is a commonly used drug that is highly efficacious when used correctly. However, because of its narrow therapeutic index and large inter-individual variability in patient response, warfarin carries a significant risk of hemorrhagic complications and suboptimal efficacy. Numerous clinical, environmental, and recently discovered genetic factors can all alter warfarin response. Dosing algorithms have been developed to try to better predict warfarin dose requirements. The potential benefit of these algorithms is greatest during the early, dose-titration phase of warfarin therapy. In addition, adherence to warfarin is problematic and strongly associated with poor anticoagulation control during all phases of therapy. Addressing the issues of proper warfarin dosing and poor adherence holds great promise for improving the use of one of the most commonly prescribed drugs available.
Warfarin represents perhaps the ‘perfect storm’ for the proof of concept that pharmacogenetics can improve patient care and clinical outcomes: it is one of the most commonly used medications, it has a narrow therapeutic index, dosing is mostly empirical, there is large inter-individual variability, and there are well-defined genetic variants that alter warfarin dose requirements. At the same time, and perhaps driven by these issues, there is a lot at stake in warfarin pharmacogenetics. Most important, of course, are patient outcomes. Dosing algorithms must work better than usual care and ultimately will be judged on whether they reduce complications and improve efficacy. At the same time, warfarin may be seen as a model in which pharmacogenetics is tested. If dosing algorithms fail, that failure is likely to lead to additional skepticism about the utility of pharmacogenetics in practice. It is thus critical, both to patients and to the scientific community, that dosing algorithms are properly developed and tested prior to being used in practice.
Proper dosing algorithm development will require appropriately designed studies and/or databases that include accurate data and use proper statistical methods. Although the latter is clearly important, proper study design and execution may be even more important; the best statistics cannot save a poorly designed study. Studies must minimize the many biases that can occur in observational studies, including selection bias (arising from the manner in which patients are selected or included in a study), information bias (which includes not only bias from inaccurate genotyping but also from the clinical and environmental factors that go into a dosing algorithm, such as medication or alcohol use), and misclassification bias (incorrect dosing information). In addition, a dosing algorithm must be properly assessed to determine whether it is the optimal one, examining both the R2 statistic in model development and, perhaps equally importantly for prediction purposes, how close the predicted doses are to the actual required doses over the range of predicted doses.
Once a comprehensive, robust dosing algorithm is developed, it must undergo additional testing to determine its generalizability. Poor performance on external validation may require recalibration of the prediction model and then retesting in a different external data set.
Ultimately, randomized trials of dosing algorithms versus usual care must be performed prior to widespread clinical use of dosing algorithms. Several small clinical trials are testing the effectiveness of dosing algorithms in practice. These trials will provide important data on the feasibility of applying dosing algorithms in practice.
Results from two small trials have recently been reported [95,96]. One trial of 191 patients, using CYP2C9-based dosing (not including VKORC1 or clinical variables), demonstrated a significant improvement in both primary outcomes: the time to reach the first therapeutic INR (2.73 days earlier with CYP2C9-based dosing versus empiric dosing) and the time to reach maintenance dose (18 days earlier with CYP2C9-based dosing versus empiric dosing). The study also demonstrated a significant improvement in the percentage of time within INR range and found a significantly lower rate of minor bleeding when using CYP2C9-based dosing. However, the study was not blinded, assigned therapy on the basis of medical record number (that is, was not completely randomized), and had a 25% drop-out rate during warfarin induction. The second study, the Couma-Gen study, used clinical variables, CYP2C9, and VKORC1 to determine dosing and compared this algorithm-based approach in 101 patients with empiric dosing in 99 patients. The genotype-based strategy clearly was better at predicting maintenance dose, but failed to alter the primary outcome of the percentage of INRs out of range. The fact that the trial was able to demonstrate clear benefit of predicting dose in the genotyping arm, but could not demonstrate a benefit on the primary outcome of the proportion of out-of-range INRs, raises the question of whether better dose prediction will always lead to better clinical outcomes. However, because the study was not fully blinded (the treating clinicians knew the treatment arm), had limited power, and demonstrated imbalances in the study arms (for example, the prevalence of genetic variants differed by study arm, suggesting that randomization may not have been completely successful), further, larger-scale trials are needed. Such a larger-scale, multicenter study has recently been proposed by the NIH, and could begin as early as the middle of 2008.
A carefully designed and rigorous trial will require considerable effort and substantial resources, but will be critical to determining whether pharmacogenetic-based prescribing is ready for clinical practice. Until such studies are completed, pharmacogenetic-based prescribing is not yet ready for prime time. Even then, the cost-effectiveness of pharmacogenetic-based prescribing must be quantified and evaluated. Some estimates are that genetic-based dosing will be cost-effective, but this should also be evaluated formally within a randomized trial [97,98]. Furthermore, clinical-based algorithms that do not require genetic information should be compared with genetic-based algorithms because, if the former work as well as the latter, they will be much more cost-effective.
One also must not forget that there are other means to improve the care of warfarin-treated patients. Adherence remains, perhaps, the ‘elephant in the room.’ Everyone knows it is important, but no one is doing anything about it. Innovative, sustainable, and cost-effective strategies to improve adherence could make warfarin safer and more effective for existing patients and expand the number of patients who could benefit but who do not currently receive the medication due to concerns about adherence. The potential public health impact of a successful intervention to improve warfarin adherence among the millions of patients who require preventive therapy for thromboembolism is enormous.
Although a recent direct thrombin inhibitor, ximelagatran, failed to gain marketing approval, newer drugs may ultimately come along to try to replace warfarin. It is hard to imagine a more efficacious drug than warfarin, but it is equally hard to imagine a drug that is as difficult to use in practice. In the meantime, we must continue to try to improve our therapeutic application of warfarin. Hopefully, in the future, any new drug that comes along will be compared against a new and improved standard of warfarin therapy.
Declaration of Interest
Stephen E Kimmel has received research grants and consulting fees from several pharmaceutical companies, unrelated to warfarin. He has also received research grants from NIH and the Aetna Foundation for warfarin research.