Patients with inflammatory bowel disease undergo many radiological studies employing ionizing radiation for diagnosis, assessment of disease activity, evaluation of complications and extraintestinal manifestations, and monitoring of response to therapy. However, the extent of radiation exposure in these patients is unknown.
A population-based inception cohort of 215 patients with inflammatory bowel disease from Olmsted County, Minnesota, diagnosed between 1990 and 2001, was identified. The total effective dose of diagnostic ionizing radiation was estimated for each patient. Linear regression was used to compare total effective dose since symptom onset between groups.
The number of patients with Crohn's disease and ulcerative colitis was 103 and 112, with a mean age at diagnosis of 38.6 and 39.4 yr, respectively. Mean follow-up was 8.9 yr for Crohn's disease and 9.0 yr for ulcerative colitis. Median total effective dose for Crohn's disease was 26.6 millisieverts (mSv) (range, 0–279) versus 10.5 mSv (range, 0–251) for ulcerative colitis (P < 0.001). Computed tomography accounted for 51% and 40% of total effective dose, respectively. Patients with Crohn's disease had 2.46 times higher total effective dose than ulcerative colitis patients (P = 0.001), adjusting for duration of disease.
When annualized, the radiation exposure in this inflammatory bowel disease population was equivalent to the average annual background radiation dose from naturally occurring sources in the U.S. (3.0 mSv). However, a subset of patients had substantially higher doses. Imaging management guidelines to minimize radiation dose, dose-reduction techniques in computed tomography, and faster, more robust magnetic resonance techniques are warranted.
Crohn's disease (CD) and ulcerative colitis (UC) are chronic, relapsing and remitting inflammatory diseases of the gastrointestinal tract with multiple extraintestinal manifestations. The estimated combined prevalence of the two conditions is 1.4 million people in the United States and 2.2 million in Europe (1). Patients are most frequently diagnosed with inflammatory bowel disease (IBD) in their twenties and thirties, although up to 15% of patients are diagnosed in childhood (1).
For initial diagnosis, assessment of disease activity, evaluation of complications and extraintestinal manifestations, and monitoring of response to therapy, patients with IBD often undergo numerous radiologic imaging studies, including conventional radiography, fluoroscopy, computed tomography (CT), magnetic resonance imaging (MR), and ultrasound (US) examinations (2–6). With the exception of MR and US, these studies subject patients to ionizing radiation (7, 8).
To describe the radiation dose delivered to the patient, the concept of “effective dose” was introduced in 1975 (9). Effective dose, expressed in millisieverts (mSv), takes into account the absorbed dose of radiation and the susceptibility of each organ to radiation-induced somatic or genetic effects. The most radiosensitive organs are the gonads, bone marrow, colon, lung, and stomach. Susceptibility to ionizing radiation is also affected by gender and age at exposure (10). The frequency of diagnostic X-rays in the U.S. is about one per person per year (11). The 2000 U.S. Nationwide Evaluation of X-Ray Trends (NEXT) survey estimated that there were 58 ± 9 million CT examinations performed in the U.S. in 2000 (12, 13). Although CT accounts for only 13% of all diagnostic radiology procedures in U.S. hospitals, it accounts for >70% of the diagnostic ionizing radiation exposure to patients (14).
Annually, exposure to environmental ionizing radiation in the United States is approximately 3 mSv (with regional variability) (15). There is no current estimate of the annual diagnostic ionizing radiation exposure in a broad U.S. population; however, two studies of German and Norwegian populations found mean annual effective doses per patient of 1.1 to 1.9 mSv, assuming between 910 and 1,550 X-ray examinations per 1,000 inhabitants (16, 17). At intermediate to high doses of ionizing radiation (>100 mSv), whether acute or protracted, scientific models and epidemiological studies demonstrate deleterious effects, including carcinogenesis. However, the risks of low-dose ionizing radiation, such as the levels delivered by diagnostic imaging, are highly debated (13, 15, 18–22).
Using a “linear no-threshold dose-effect model,” which assumes that any level of exposure can result in carcinogenesis, studies estimate that the cumulative cancer risk to age 75 yr attributable to diagnostic radiation is 0.9%, corresponding to approximately 2,500–5,695 cases of radiation-induced cancer per year in the U.S. (1, 18). However, these data are highly debated, as some authors suggest that doses below a “practical threshold” have a lower effectiveness for carcinogenesis and that extrapolating from high-dose exposures may therefore grossly overestimate the risk (20, 21). To address these criticisms, a separate study of nuclear industry workers from 15 countries estimated the risk of carcinogenesis after protracted exposure to low doses of ionizing radiation, modeling the potential risk of repeated diagnostic exams (23). Mortality from cancers other than leukemia was estimated at two to three times that of the general population, although the confidence intervals were wide (23). Overall, the risks of diagnostic ionizing radiation are difficult to assess given that the effects of ionizing radiation are often not seen for decades after exposure, the high natural background incidence of malignancy, and the added variables of single versus repeated exposures and age at exposure (20, 21).
The chronicity of disease, the fact that symptoms poorly reflect disease activity, and the typically young age at the time of IBD diagnosis often lead to multiple radiologic tests being performed in these patients. This fact presents a challenge to balance the benefit of these tests employing ionizing radiation against any possible risk. Our purpose was to estimate the total effective dose of ionizing radiation since the date of symptom onset in a population-based cohort of subjects with IBD. This information can then be used to inform future evaluation protocols and management guidelines, encourage efforts to minimize diagnostic radiation exposure, and further the development and use of radiologic investigations which do not utilize ionizing radiation.
Olmsted County is situated in southeastern Minnesota and had a population of approximately 124,000 at the 2000 U.S. Census, of which 89% was non-Hispanic white. Although 25% of county residents are employed in health care services (vs 8% nationwide) and the level of education is higher (30% have completed college vs 21% nationwide), the residents of Olmsted County are otherwise socioeconomically similar to the U.S. white population (27).
The Rochester Epidemiology Project (REP) is a unique medical records linkage system developed in the 1960s and funded in part by the National Institutes of Health. It exploits the fact that virtually all of the health care for the residents of Olmsted County is provided by two organizations: Mayo Medical Center (Mayo Clinic and its hospitals, Rochester Methodist and Saint Marys); and Olmsted Medical Center (a multispecialty clinic and its hospital, Olmsted Community Hospital). Diagnoses generated from all outpatient visits, emergency room visits, hospitalizations, nursing home visits, surgical procedures, diagnostic studies, autopsy examinations, and death certificates for all county residents seen since 1908 are recorded in a central diagnostic index (27, 28). In any three-year period, over 90% of county residents are examined at one of the two health care systems (27). Thus, it is possible to identify all diagnosed cases of a given disease, and all diagnostic examinations, including radiological studies, performed in these subjects. The resources of the REP have been used to identify Olmsted County residents diagnosed with CD or UC from 1940 to 2001 (24–26).
With approval from the Institutional Review Boards of Mayo Foundation and Olmsted Medical Center, we reviewed the records of 220 patients who had not denied permission for research access to their medical records and who were first diagnosed with CD or UC between January 1, 1990 and December 31, 2001. We excluded two patients with CD and three with UC who had follow-up of less than 1 yr. Demographic information, including date of birth, date of symptom onset, date of diagnosis, and disease type and extent, had been determined previously (24–26). Beginning with the date of symptom onset, medical records were reviewed to identify all diagnostic radiologic studies performed (Tables 1 and 2). The duration of follow-up for each patient was measured from the date of symptom onset to the last date of follow-up, through March 2006.
Mean entrance surface dose to an average-sized patient was estimated using standard X-ray technique parameters and fluoroscopy dose rates at our institution. Entrance surface dose describes the dose to the region of skin exposed to the incident primary beam. The same mean doses were used for all exams, regardless of the year or the facility within Olmsted County where the exam was performed. Organ doses estimated by the National Radiological Protection Board (NRPB) (29) were then used to calculate the effective dose delivered by fluoroscopic and CT studies (Tables 1 and 2). The mean effective dose for skeletal X-rays was determined by averaging the effective dose for a subset of 50 examinations received by the patient group. Entrance surface doses for fluoroscopy exams were determined from the recorded fluoroscopy exposure time and the number of acquired images for each exam. The established effective dose for bone mineral density testing was used for those calculations (30). CT doses were estimated using our institution's current computed tomography dose index (CTDI) values. CTDI indicates the magnitude of dose delivered to the patient as a function of the particular CT scanner model and operational specifics, such as slice thickness and the number of scan slices obtained. The total effective dose for each patient was estimated by multiplying the total number of each type of exam by the effective dose for that exam and summing across exam types. Each patient's annualized radiation dose was estimated by dividing the estimated total effective dose by the follow-up time.
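The per-patient dose arithmetic described above (total effective dose as the count of each exam type multiplied by its per-exam effective dose, then annualized by follow-up time) can be sketched as follows. The exam names and per-exam doses here are illustrative assumptions, not the study's calibrated values.

```python
# Sketch of the study's dose arithmetic; per-exam effective doses (mSv)
# below are hypothetical placeholders, not the institution's measured values.
EFFECTIVE_DOSE_MSV = {
    "abdominopelvic_ct": 10.0,          # assumed per-exam effective dose
    "chest_xray": 0.1,                  # assumed
    "small_bowel_follow_through": 3.0,  # assumed
}

def total_effective_dose(exam_counts: dict) -> float:
    """Sum over exam types of (number of exams) x (effective dose per exam)."""
    return sum(EFFECTIVE_DOSE_MSV[exam] * n for exam, n in exam_counts.items())

def annualized_dose(exam_counts: dict, followup_years: float) -> float:
    """Total effective dose divided by duration of follow-up."""
    return total_effective_dose(exam_counts) / followup_years

# Example: 2 abdominopelvic CTs and 3 chest X-rays over 8 years of follow-up
total = total_effective_dose({"abdominopelvic_ct": 2, "chest_xray": 3})  # 20.3 mSv
```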
Categorical data were reported as counts and proportions and compared with chi-square tests or Fisher's exact tests. Continuous data were reported as means with 95% confidence intervals and compared with t-tests, with the exception of radiation dose and time-related variables. Time-related variables were reported as medians and ranges and compared with the Wilcoxon rank sum test. Because dose was skewed toward high values, the univariate assessment of dose was likewise reported with medians and ranges and compared with the Wilcoxon rank sum test, and dose was natural log-transformed for inclusion in multivariate models. The overall comparison of total effective dose between CD and UC patients used linear regression on the log-transformed dose, adjusted for years since symptom onset; annualized dose was compared between CD and UC patients in the same manner. Linear regression was also used to compare the proportion of the cumulative dose attributable to CT scans between CD and UC patients. All tests were 2-sided with a 5% type I error rate. Statistical analyses were performed with SAS version 9.1 (SAS Institute, Cary, NC).
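The log-transform step above can be illustrated on synthetic data. On the log scale, the difference in group means exponentiates to a ratio of geometric mean doses; this simplified sketch omits the study's adjustment for years since symptom onset, and the dose distributions are assumed log-normal for illustration only.

```python
import math
import random

# Synthetic illustration of comparing log-transformed doses between groups.
# The log-normal parameters below are assumptions chosen so that CD doses
# center about 2.5x higher than UC doses; they are not the study's data.
random.seed(0)
uc_doses = [math.exp(random.gauss(math.log(10.0), 0.8)) for _ in range(100)]
cd_doses = [math.exp(random.gauss(math.log(25.0), 0.8)) for _ in range(100)]

def geometric_mean(xs):
    """exp of the mean of the natural logs: the geometric mean."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Exponentiating the difference of mean log-doses gives the dose ratio;
# the study's regression model additionally adjusts for disease duration.
dose_ratio = geometric_mean(cd_doses) / geometric_mean(uc_doses)  # roughly 2.5 here
```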
Among the 215 county residents included in this study, 118 (55%) were men and the mean age at diagnosis was 38.6 yr (95% CI, 34.8, 42.5; range 2.2–91.4) for CD and 39.4 yr (95% CI, 36.2, 42.6; range 1.2–86.9) for UC (P = 0.761). Nineteen patients (8.8%) were diagnosed under the age of 18 yr. The median time from symptom onset to diagnosis was 1 month (range, 0–62.8 months) for CD and 0.4 month (range, 0–100 months) for UC (P = 0.034). Crohn's disease was diagnosed in 103 patients (48%), with primary involvement of the colon, small bowel only, and both colon and small bowel in 35 (34%), 37 (36%), and 31 (30%) patients, respectively. Ulcerative colitis was diagnosed in 112 patients (52%), with extensive colitis, left-sided colitis, and proctitis in 59 (53%), 46 (41%), and 7 (6%) patients, respectively. Mean follow-up time was 8.9 yr (95% CI, 8.1–9.6; range, 1.8–17.8 yr) for CD and 9.0 yr (95% CI, 8.2, 9.9; range, 1.2–23.6 yr) for UC (P = 0.757).
The total number of imaging exams was 3,782, of which 456 (12%) were CT scans. These included CT scans of the abdomen and pelvis (45%), nontrunk regions (i.e., head, sinuses, extremities) (22%), chest (16%), abdomen (6%), coronary arteries (4%), and pelvis (2%); the remaining 5% were CT enterographies. There were 7 abdominal MRs, 5 pelvic MRs, and 117 abdominal ultrasounds. A total of 15 nontrunk CT scans and 86 non-GI X-ray exams (including mammography) were excluded from the total effective dose determination because their effective doses were small, because too few patients underwent the exam to permit calculation of an effective dose, or because they were research-protocol scans with unavailable scanning parameters.
The median total effective dose was 26.6 mSv (range, 0–279.2 mSv; upper quartile range, 47.9–279.2 mSv) for CD and 10.5 mSv (range, 0–251.4 mSv; upper quartile range, 26.8–251.4 mSv) for UC (P < 0.001) (Table 3). After adjusting for time since symptom onset, CD patients had 2.46 times greater total effective dose compared to UC patients (95% CI, 1.5, 4.1; P = 0.001). The annual median effective dose for CD and UC was 3.1 mSv/yr and 1.2 mSv/yr, respectively (Table 3). The annualized effective dose was 2.26 times greater for CD than for UC patients (95% CI, 1.4, 3.6; P = 0.001). CT scans accounted for 51% (95% CI, 43, 58) of the total effective dose in CD and 40% (95% CI, 33, 47) in UC (P = 0.046). The difference in effective dose was primarily due to twice the number of abdominopelvic CT scans in the CD patients. Over our study period, the use of CT enterography, which had 1.6 times the effective dose of abdominopelvic CT as performed at the time of this study, increased.
In this population-based cohort of IBD patients, the annualized median total effective dose from diagnostic radiation was equivalent to the average dose of naturally occurring ionizing radiation received annually by the average American or European. With the greater risk of extraluminal complications that are often assessed by CT, CD patients had a median total effective dose more than twice that of UC patients. The upper quartile of patients with IBD had annualized total effective doses 2–11 times above annual natural background exposure.
Computed tomography accounted for 51% of the effective dose in our patients with CD and 40% in UC, and practice trends demonstrate the increasing use of computed tomography (13). While the risks of ionizing radiation at the levels associated with CT imaging are extremely low (the exact magnitude remains a topic of debate), numerous studies have shown the usefulness of CT imaging for the evaluation of IBD patients (2, 31–35). Computed tomography has been shown to change the management of CD in 28% of patients, while computed tomography enteroclysis changed the management in 62% of patients with symptomatic CD (31, 35). In a pediatric population, CT for the diagnosis of IBD, compared to barium studies utilizing less ionizing radiation, was equally effective in 72% of patients and superior in 28%, identifying skip lesions and better visualizing the terminal ileum (34).
Over the course of our study, we observed the increasing use of CT enterography, particularly in CD patients (36). At our institution, small bowel follow-through exams decreased by 65%, from 2,800 studies per year in 2003–2004 to 975 studies in 2007, while CT enterography increased 840% from 375 studies in 2003 to 3,166 studies in 2007 (Fig. 1) (37). Several studies have demonstrated superior specificity and sensitivity of CT enterography for detecting inflammatory CD compared to conventional CT and fluoroscopic exams (38, 39) and superior specificity compared to capsule endoscopy (CE) (40). Recently, Higgins et al. showed that clinical assessments correlated poorly with CT enterography findings, and CT enterography changed clinicians' perceptions of the likelihood of steroid benefit in more than 60% of patients (41).
Over the period of our study, CT enterography delivered approximately 1.5–2 times the effective dose of conventional abdominopelvic CT scanning. The radiation doses employed in CT enterography arise from extending acquisition parameters used in other types of CT exams that require narrow slice thicknesses and visualization of soft tissue attenuation differences. In our own practice, after the preliminary results of this study were known, we lowered our tube current (milliamperes [mA]) or increased our pitch (depending on scanner type) to reduce radiation dose by over 30%. Further reductions can be achieved by similar changes in CT image acquisition parameters, but even more dramatic dose reduction will likely be possible using newer image reconstruction methods or denoising algorithms applied in image space or after image reconstruction. Dose reduction is particularly important for younger patients with IBD.
Magnetic resonance imaging, US, and capsule endoscopy are alternatives to radiologic studies employing ionizing radiation. In addition to the absence of ionizing radiation, MR offers evaluation of stenotic lesions with MR fluoroscopy techniques and good conspicuity of abnormal bowel wall signal (6, 42, 43). The benefits of CT over MR, however, include greater availability and robustness, shorter exam times, and higher spatial resolution. At this time, MR is used as a first-line test to evaluate perianal disease but is generally not used to evaluate small bowel disease because of long exam times, problems with patient tolerance and motion, and a lack of subspecialized clinical expertise (33). Because of its increased bowel wall signal and lack of ionizing radiation, MR may become the ideal modality for determining response to therapy in refractory inflammatory CD (44).
In European and Canadian centers, ultrasonography is used to evaluate the thickness, stratification, and vascularity of the bowel wall to diagnose CD and UC. In those studies, the diagnostic accuracy of US was comparable to X-ray examinations, including CT (45–48). Yet, widespread use of US in the United States has been limited in part by operator-dependent quality, the inability to visualize large portions of the bowel due to bowel gas, and the inability to evaluate the mesentery.
Lastly, capsule endoscopy has emerged as a relatively noninvasive method for direct evaluation of the small bowel mucosa (49–51). Yet, individual study sizes have been small, and the yield of CE for the initial diagnosis of CD versus evaluation of recurrent, known disease remains unknown. In addition, capsule retention remains a concern in patients with stricturing CD (52).
The strengths of this study include its population-based inception cohort, with a broad distribution of disease subtypes and ages and substantial follow-up (mean, 9 yr). In addition, the resources of the Rochester Epidemiology Project permitted accurate accounting of all imaging studies obtained by the patients at the medical centers in Olmsted County, where the vast majority of their health care is delivered. A final strength is the availability of accurate technique data on effective dose values from our institution's medical physics team.
However, our study has limitations. First, with the resources of a large referral center and a community-based hospital, our cohort may have had greater access to health care, which could either increase or decrease the number of scans a patient receives. Second, effective doses were estimated from a subset of actual exams and from standard exams performed at our institution, and these values were used to represent all scans for all years of the study. With improvements in dose modulation, lower tube voltages, and changes in slice thickness, the effective doses delivered by CT have changed, and these dose reductions are not reflected in the estimated values (19, 53, 54).
Finally, this study examined a population-based cohort. In a referral cohort, containing patients with more severe illness who undergo repeated radiological imaging studies more often, one would expect significantly higher total effective doses, as suggested by the upper quartile range seen in our study.
For most patients in our cohort, annualized exposure to diagnostic radiation was not significantly greater than naturally occurring background radiation. However, for this upper quartile of patients, the degree of exposure was significantly greater, potentially increasing the risk of cancer induction. Much like gastroenterologists inform patients about the risks of biologic therapy (55), clinicians should inform patients about potential risks of ionizing radiation (particularly those patients undergoing multiple exams), and weigh these risks against the clinical benefit in each situation (56). Further study is needed to identify patients who require more scans and who may be at higher risk, even if only theoretical. Practice guidelines, which take into account patient age, clinical scenario, and potential benefit and risk of alternative imaging strategies, should be developed. The development of dose reduction techniques in CT, faster, more robust MR examinations, refinement of US, and the further study of the clinical usefulness of capsule endoscopy for IBD patients will decrease radiation exposure from cross-sectional small bowel imaging.
In conclusion, in this population-based IBD cohort, patients with CD were exposed to 2.46 times more diagnostic radiation than patients with UC. Overall, the annualized exposure to diagnostic ionizing radiation was equivalent to natural background radiation exposure. However, a subset of patients had substantially higher levels of exposure. Particularly in such patients, it is important to consider dose reduction techniques for CT imaging, as well as further research of imaging modalities that do not employ ionizing radiation and the development of imaging practice guidelines to limit diagnostic radiation exposure. Finally, clinicians caring for patients with IBD need to consider the radiation exposure associated with radiologic studies, particularly repeated CT scanning, and should only use these tests when the results will have an impact on the patient's management.
Financial support: Joanna M. Peloquin was supported by the American Gastroenterological Association Student Research Fellowship Award, 2006–2007. The Rochester Epidemiology Project is supported in part by AR30582, National Institutes of Health.
Guarantor of the article: Darrell S. Pardi, M.D.
Specific author contributions: Contributed to the study design: Dr. Peloquin, Dr. Pardi, Dr. Sandborn, Dr. Fletcher, Dr. McCollough, Dr. Schueler, Dr. Kofler, Dr. Enders, and Dr. Loftus, Jr.; Collection, analysis, and interpretation of data: Dr. Peloquin, Dr. Pardi, Dr. Fletcher, Dr. McCollough, Dr. Schueler, Dr. Kofler, Dr. Enders, Ms. Achenbach, and Dr. Loftus, Jr.; and Writing of the manuscript: Dr. Peloquin, Dr. Pardi, Dr. Sandborn, Dr. Fletcher, Dr. McCollough, Dr. Schueler, Dr. Kofler, Dr. Enders, Ms. Achenbach, and Dr. Loftus, Jr.
Potential competing interests: None.