The recent emergence of WNV in the US has led to a perceived need to safeguard the blood supply from viremic blood donations. Strategies for screening blood for emerging viral infections such as WNV are often put into place without systematic evaluation of their costs, benefits, and cost-effectiveness. In this study, we conducted a cost-effectiveness analysis of alternative strategies for blood screening, considered the efficacy of these strategies in areas of varying epidemic intensity, and explored the effects of variable assay characteristics, transfusion outcomes, and pricing that may affect current and future policy decisions.
Our analyses demonstrated that in areas with high infection rates, on the order of those seen in Mississippi and Nebraska in 2002, seasonal screening of blood designated for immunocompromised recipients prolongs quality-adjusted life expectancy compared with implementing a baseline questionnaire alone. Although other strategies, such as screening pooled samples from all donations, provided some benefit compared with a questionnaire alone, they were more costly and either less effective or only marginally more effective than restricted seasonal screening. In areas with low infection rates and seasonal transmission, none of the NAT strategies offered additional clinical benefit given current test-sensitivity estimates, although they were associated with substantial costs. These results suggest that general screening of blood for WNV may not be as attractive a public health strategy as it first appeared, and that more restricted screening strategies may be preferable to currently mandated policies.
The finding that blood-screening strategies for WNV may fall outside the usually accepted cost-effectiveness thresholds is consistent with previous cost-effectiveness analyses of blood screening for infectious agents [21]. A recent analysis of NAT screening for hepatitis B and C and HIV compared to serological testing alone showed that the ICER exceeded US$1.5 million/QALY gained, well beyond the US$50,000–100,000 threshold commonly used as an indicator of willingness to pay for a health-care intervention [21]. AuBuchon et al. [29] have previously enumerated some of the reasons why cost-effectiveness estimates of blood-screening tests are so unattractive: risks are relatively low, transfusion recipients often have a reduced quality-adjusted life expectancy, and costs are incurred for all donations, few of which are infectious. Despite this, blood-screening tests are often implemented for reasons that are not captured in a cost-effectiveness analysis. There is a perception that blood recipients cannot be held responsible for avoiding risk, and that the system must therefore protect them at any cost. Individuals are willing to pay more to avoid a catastrophic outcome, even when its risk is low compared to that of other outcomes. Furthermore, policy makers are more likely to apply an intervention to a small, well-defined group such as blood recipients than to a less-visible group [32]. Nonetheless, in an era of major cuts in public health expenditure and increasingly limited resources available for health care, it is worthwhile reconsidering the economic implications of this priority; resources spent preventing the rare case of transfusion-associated WNV might be better utilized in a host of other interventions against infectious disease, including those focused on reducing WNV transmission through mosquito vectors. If such an approach were successful, it might obviate the need to screen blood for the virus in many areas.
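To make the arithmetic behind these thresholds concrete, the short sketch below computes an incremental cost-effectiveness ratio (ICER) for a hypothetical low-prevalence screening scenario. All inputs (cost per test, infections averted, QALYs gained per infection averted) are illustrative assumptions, not values estimated in this study or in [21]; the point is only to show how a low infection risk spread across many tested donations drives the cost per QALY upward.

```python
# Illustrative ICER calculation (hypothetical inputs, not the study's estimates).
# ICER = (cost_screen - cost_no_screen) / (QALY_screen - QALY_no_screen)

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio, in dollars per QALY gained."""
    return delta_cost / delta_qaly

# Hypothetical low-prevalence scenario: screening 1,000,000 donations at an
# assumed $10 per NAT assay, averting 2 transfusion-transmitted infections
# that would each have cost an assumed 3 quality-adjusted life-years.
donations = 1_000_000
cost_per_test = 10.0           # US$ per donation screened (assumed)
infections_averted = 2         # assumed
qalys_per_infection = 3.0      # QALYs saved per infection averted (assumed)

delta_cost = donations * cost_per_test                  # $10,000,000
delta_qaly = infections_averted * qalys_per_infection   # 6 QALYs

print(f"ICER = ${icer(delta_cost, delta_qaly):,.0f} per QALY gained")
# -> ICER = $1,666,667 per QALY, far above a $50,000-100,000 threshold
```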
Our analysis has a number of limitations, as the ecology of WNV in the US, and the clinical course and sequelae of transfusion-acquired WNV infection, have not been clearly defined. Transmission intensities have varied over the years within some geographic regions since the emergence of WNV. Choosing the most cost-effective approach to screening within a specific area will depend on the ability to predict transmission intensity for the current season. Recently, a risk equation based on mosquito abundance, infectivity, vector competence, and host feeding behavior was developed to predict short-term future human WNV infections in an area [33]. The utility of this index for predicting human infections is under investigation. Methods to validate, and improve upon, current prediction tools for WNV infection would enhance our ability to select the most cost-effective screening strategies. Improved methods to both measure and efficiently monitor the mosquito population parameters that determine virus transmission to humans would allow us to shift policy in response to important temporal changes in transmission patterns.
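The exact form and calibration of that risk equation are given in [33]; as a rough illustration of how such an index combines surveillance inputs, the sketch below assumes a simple multiplicative form. The function name, parameter values, and scaling are hypothetical, not the published equation.

```python
# Hedged sketch of a multiplicative mosquito-based risk index, assuming (as
# the cited inputs suggest) that short-term human infection risk scales with
# the product of vector abundance, infection prevalence, transmission
# competence, and the fraction of blood meals taken on humans. The actual
# equation and its calibration appear in [33]; everything here is illustrative.

def wnv_risk_index(abundance, infection_rate, competence, human_feeding_frac):
    """Relative index of expected infectious bites on humans.

    abundance          -- mosquitoes collected per trap-night
    infection_rate     -- fraction of mosquitoes carrying WNV
    competence         -- probability an infected bite transmits virus
    human_feeding_frac -- fraction of blood meals taken on humans
    """
    return abundance * infection_rate * competence * human_feeding_frac

# Hypothetical surveillance values for two areas:
high = wnv_risk_index(abundance=50, infection_rate=0.004,
                      competence=0.8, human_feeding_frac=0.3)
low = wnv_risk_index(abundance=10, infection_rate=0.0005,
                     competence=0.8, human_feeding_frac=0.3)
print(f"relative risk index, high- vs low-intensity area: {high / low:.0f}x")
```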
Lacking other data, we estimated the risk of developing NI after an infected transfusion from a single study in which patients with cancer were deliberately inoculated with WNV [13]. These data may overestimate the true risk of disease, since the patients studied may have been more susceptible to severe disease than healthier blood recipients. Such a bias would exaggerate the benefits of screening in our analyses. Conversely, if the potentially higher dose of virus from a transfusion-associated infection results in an even higher risk of developing NI than we estimated, the benefits of screening are underestimated in our analyses [10]. In addition, in the absence of large-cohort data, we assumed that data from small studies on the long-term clinical consequences of WNV-associated NI represent the expected sequelae of infection.
Among the least well-defined parameters used in this analysis were those reflecting the performance of the newly introduced nucleic acid–screening tests. These were FDA-approved before their sensitivity and specificity had been established, and these characteristics have yet to be published. Given the imprecision of these estimates, together with our expectation that the assay will improve with further product development, we repeated our analyses assuming that NAT for WNV was highly sensitive and specific. However, this analysis assumed that an improved test bore no additional cost. While various measures to enhance detection of low levels of viremia have been proposed, these would add further steps to screening rather than replace existing approaches, possibly adding substantially to expense. New methods that achieve small boosts in sensitivity (such as IgM antibody testing as an adjunct to detect positive samples that have escaped NAT detection) are unlikely to be cost-effective under the assumptions made in this analysis. Custer et al. [34] demonstrated that while continuous ID-NAT screening would overburden blood-testing laboratories, ID-NAT screening during select times of the transmission season is currently needed, since minipool assays fail to detect 23% of the viremic samples detected by ID-NAT.
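The sensitivity gap between minipool and individual-donation testing follows directly from dilution: pooling spreads one viremic donation's RNA across the pool, pushing low-titer units below the assay's limit of detection. The sketch below illustrates this with a simple threshold model; the pool size and limit of detection are assumed for illustration and are not the licensed assays' specifications.

```python
# Sketch of why minipool NAT misses low-titer donations, assuming a simple
# threshold model: a donation is detected if its viral load (copies/mL),
# after dilution into the pool, exceeds the assay's limit of detection.
# Both constants below are illustrative assumptions.

POOL_SIZE = 16   # donations per minipool (assumed)
LOD = 50.0       # copies/mL detectable by the NAT assay (assumed)

def detected(viral_load, pool_size=1):
    """True if a single viremic donation is detectable after pooling."""
    return viral_load / pool_size >= LOD

for load in (100.0, 500.0, 1000.0):
    print(f"{load:>7.0f} copies/mL: ID-NAT {detected(load)}, "
          f"minipool {detected(load, POOL_SIZE)}")
# Low-titer units (e.g., 100-500 copies/mL here) pass minipool screening but
# are caught by ID-NAT, consistent with minipool assays missing a substantial
# fraction of the viremic samples detected by ID-NAT [34].
```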
In conclusion, we found that NAT screening of blood donations for WNV improved clinical outcomes only in those areas where the incidence of WNV is high, and that limiting screening to high-intensity transmission seasons and to blood donations designated for immunocompromised patients reduced costs without decreasing quality-adjusted life expectancy in most scenarios. We recommend that states adopt screening policies based on the intensity and duration of their WNV epidemics. Regional data, in conjunction with the results of this analysis and consideration of societal risk attitudes and preferences, may collectively point to a relaxation of the current federally mandated NAT screening of all donations in low-intensity areas. When high rates of natural infection indicate that NAT screening is appropriate, we recommend use of ID-NAT rather than minipool screening. States should consider the restricted screening of blood designated for immunocompromised patients alone. Finally, we suggest that blood-screening policies be carefully scrutinized for cost-effectiveness and that their relative contribution to safeguarding public health be considered in the making of policy decisions.