To optimize screening and early intervention programs that aim to prevent progression to severe disease, several questions must be answered: a) How rapidly do exposed individuals develop BeS? b) How likely are individuals with BeS to develop CBD? c) What is the time course of these changes? d) Does the risk change with time since initial exposure? e) How cost-effective are screening methods for BeS and CBD?
We applied a Markov simulation model to assess possible assumptions about the risk of progression. The available empirical information includes cross-sectional prevalence of BeS and CBD both soon after and many years after initial exposure, and the relative numbers of individuals with BeS and CBD. We based our basic model on the three widely accepted states of beryllium-related health status (BeE, BeS, CBD). Although residual uncertainty remains in the precise values of the two annual transition probabilities (BeE to BeS and BeS to CBD, respectively), the patterns of the distributions under different assumptions are sufficiently distinct to allow meaningful contrasts. Thus, the time-varying transition-probability mixed-population model was the most appropriate across the range of prevalence studies in the published literature.
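The structure of such a model can be sketched in a few lines. The following is an illustrative simulation only; the group fractions and transition probabilities are hypothetical placeholders, not the fitted values from our analysis.

```python
# Illustrative sketch (not the fitted parameters): a three-state Markov
# simulation (BeE -> BeS -> CBD) in which two subpopulations carry
# different annual transition probabilities. All values are hypothetical.

def simulate(n_years, groups):
    """groups: list of (fraction, p_bes, p_cbd) with annual probabilities."""
    # per-group state fractions: [BeE, BeS, CBD]
    states = [[frac, 0.0, 0.0] for frac, _, _ in groups]
    history = []
    for _ in range(n_years):
        for g, (_, p_bes, p_cbd) in zip(states, groups):
            new_bes = g[0] * p_bes  # BeE -> BeS this year
            new_cbd = g[1] * p_cbd  # BeS -> CBD (pool from prior years)
            g[0] -= new_bes
            g[1] += new_bes - new_cbd
            g[2] += new_cbd
        history.append([sum(g[i] for g in states) for i in range(3)])
    return history

# Hypothetical mix: 20% high-susceptibility, 80% low-susceptibility workers.
hist = simulate(30, [(0.2, 0.10, 0.15), (0.8, 0.01, 0.02)])
```

Fitting such a model amounts to choosing group fractions and transition probabilities so that the simulated prevalence trajectories match the cross-sectional observations.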
Inclusion of two populations differing in risk of progression, each with its own decline in risk over time, improves the fit with the empirical data. Such assumptions are biologically and epidemiologically reasonable. CBD is one of the best examples of gene–environment interaction. Several genes, particularly the Glu69 variant in the β-chain of the HLA-DP allele, are strongly associated with individual risk, making it biologically likely that there are at least two groups within the exposed population that differ in susceptibility to progression. Furthermore, job title is closely associated with risk; machinists, for example, have considerably greater risk than less heavily exposed workers (Newman et al. 2005). Temporal decline in annual risk would occur as the higher-risk persons develop BeS and CBD, thereby reducing the average risk among those who remain at risk of progression.
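This depletion effect can be demonstrated directly: even when each subgroup's annual risk is held constant, the population-average risk among those not yet sensitized declines over time. The group sizes and probabilities below are hypothetical.

```python
# Sketch of the depletion effect: with two fixed-risk subgroups, the
# average annual risk among those still at risk declines over time as
# high-risk individuals convert first. All values are hypothetical.

def average_annual_risk(n_years, groups):
    """groups: list of (n_at_risk, annual_prob); returns per-year average risk."""
    groups = [list(g) for g in groups]
    avg = []
    for _ in range(n_years):
        at_risk = sum(n for n, _ in groups)
        converting = sum(n * p for n, p in groups)
        avg.append(converting / at_risk)
        for g in groups:
            g[0] *= 1.0 - g[1]  # remove this year's converters from the pool
    return avg

# 200 high-risk workers at 10%/year, 800 low-risk workers at 1%/year:
risks = average_annual_risk(20, [(200, 0.10), (800, 0.01)])
```

The average risk starts near the weighted mean of the two group risks and falls monotonically toward the low-risk group's rate, reproducing the temporal decline without any change in individual susceptibility.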
Optimizing screening programs
The declining annual risks of developing new BeS or CBD suggest that screening intensity may be reduced as time since initial exposure increases, allowing effective focusing of available resources. For example, the frequency of repeated blood lymphocyte proliferation testing of persons with prior exposures should be greater in the early years than in later years. These calculations may understate the impact because our models did not incorporate measures of health benefit or health risk. Because persons with long latencies tend to be older, the years of life or quality-adjusted life-years saved would be lower in long-latency cases. Similarly, the health risks of diagnostic procedures (e.g., bronchoscopy) and of treatment (e.g., high-dose prednisone) are likely to be greater in long-latency cases.
Most of the empirical studies do not clearly distinguish years since first exposure from years since last exposure. Therefore, these results should not be interpreted as supporting reduced screening intensity for currently exposed workers with long latency. However, a high proportion of individuals being screened ceased being exposed many years ago; for example, many had been employed in the former nuclear weapons industry.
Nor do these results apply to persons who present with relevant clinical evidence suggestive of CBD such as radiographic signs (e.g., interstitial or ground glass opacities), pulmonary function abnormalities (e.g., reduced diffusing capacity), or incidental findings on biopsy (e.g., granuloma or lymphocytic infiltrate). Indeed, the reduction over time of screening cost-effectiveness when applied nonselectively argues for focusing resources upon those with higher likelihood of remediable disease.
The results of our more complex models also support the benefit of screening individuals or populations that have not previously been tested. Even with long latency, both cost-effectiveness and diagnostic yield are significant when screening is applied to exposed populations not previously tested. Under such circumstances, the screening seeks to identify prevalent rather than incident cases, drawing on a pool of cases that has accumulated over many years.
Our models do not provide precise estimates of incidence rates and prevalence over time. Nevertheless, they demonstrate that risk of progression declines with time and provide useful insights into optimization of screening programs.
There are significant data gaps in the available population and clinical studies, which report divergent prevalence values for several possible reasons. Case definitions for both BeS and CBD differ among studies. The populations are heterogeneous in the length and magnitude of exposure, which affects prevalence because the risk of BeS and CBD is dose related (Henneberger et al. 2001; Viet et al. 2000; Yoshida et al. 1997). Prevalence is also affected by inclusion of retirees (Cummings et al. 2007; Stange et al. 1996). The study populations are also heterogeneous in latency: cross-sectional studies include individuals with both short and long latencies. These cross-sectional studies are subject to survivor and ascertainment bias, because those with severe CBD and those who have left the worksite would not appear in several of the studies. Adequate long-term cohort studies are absent.
We simplified the calculations by using a 1-year “time slice” for applying transition probabilities. A person developing BeS at the beginning of a year would not be considered part of the pool at risk of progressing to CBD until the end of the year. Similar considerations apply to calculating the diagnostic yield of triennial in-depth evaluation, which is based on the average size of the BeS population over that interval. Such errors are likely to be relatively small.
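The size of this time-slice error can be gauged by comparing annual updates with finer sub-year updates of the same two-step chain. The probabilities below are hypothetical; the point is only that the discrepancy introduced by the 1-year slice is modest.

```python
# Sketch of the time-slice approximation: comparing annual with monthly
# updates of the chain BeE -> BeS -> CBD. Annual probabilities are
# hypothetical placeholders, not values from the fitted models.

def cumulative_cbd(p_bes, p_cbd, years, slices_per_year):
    """Fraction with CBD after `years`, using equal sub-year time slices."""
    q_bes = 1.0 - (1.0 - p_bes) ** (1.0 / slices_per_year)
    q_cbd = 1.0 - (1.0 - p_cbd) ** (1.0 / slices_per_year)
    bee, bes, cbd = 1.0, 0.0, 0.0
    for _ in range(years * slices_per_year):
        new_bes = bee * q_bes
        new_cbd = bes * q_cbd  # newly sensitized wait one slice before risk
        bee -= new_bes
        bes += new_bes - new_cbd
        cbd += new_cbd
    return cbd

annual = cumulative_cbd(0.05, 0.08, 20, 1)    # 1-year time slice
monthly = cumulative_cbd(0.05, 0.08, 20, 12)  # finer monthly slices
```

Because the finer slices let newly sensitized persons progress within the same year, the monthly estimate is slightly higher, but the gap is small relative to the cumulative prevalence, consistent with the claim that the 1-year simplification introduces little error.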
The cost data are somewhat arbitrary, and the cumulative cost models incorporate neither cost inflation nor discounting of later versus earlier expenditures. However, the modeling effectively demonstrates the relative changes in cost-effectiveness. Similar approaches have been applied to occupational asthma (Wild et al. 2005) and to selecting workers for spirometry screening (Schwartz et al. 1988). The analysis includes only the direct cost of the testing (e.g., cost per subject tested) and does not include fixed program costs (e.g., program administration) or indirect costs (e.g., lost work time during testing). Furthermore, the approach treated screening with blood beryllium lymphocyte proliferation tests as a single entity; alternative algorithms of test and rapid retest have been suggested (Middleton et al. 2006).
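Incorporating discounting, which the cumulative cost models omit, would be straightforward. The following sketch uses an illustrative 3% discount rate and an arbitrary annual screening cost, neither of which comes from our analysis.

```python
# Sketch of discounting later expenditures to present value, which the
# cumulative cost models do not do. The 3% rate and the $10,000 annual
# cost are illustrative assumptions only.

def present_value(annual_costs, rate=0.03):
    """Discount a stream of year-end costs back to time zero."""
    return sum(c / (1.0 + rate) ** (t + 1)
               for t, c in enumerate(annual_costs))

# Ten years of a constant $10,000 annual screening cost:
pv = present_value([10_000] * 10)  # less than the undiscounted $100,000
```

Discounting would make later-year screening cheaper in present-value terms, so it would partially offset, but not reverse, the decline in cost-effectiveness with latency.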
In summary, combining published observational data and several possible progression models suggests that the risk of developing BeS is greatest in the first few years after exposure and then declines, and that the annual risk of progressing from BeS to CBD declines over time. However, there is a persistent risk of developing new BeS and new CBD even with long latency, so screening intensity should be adjusted according to years of latency in order to optimally use resources. Screening is also useful for exposed workers who have not been previously tested.