The increasing exposure to low-dose radiation from diagnostic testing has prompted renewed interest in evaluating its carcinogenic risk, but quantifying the health risk from low-dose radiation exposure remains controversial. The current approach is to take the linear no-threshold model, which is commonly applied to high-dose exposure, and use it to assess risk from low-dose exposure. However, existing data are conflicting and limited to epidemiological studies and/or in vitro analyses. In this article, we discuss the potential cancer risk from low- and high-dose radiation, their effects on DNA repair response pathways, and the best course of action for patients and providers to minimize risk.
Radiation exposure from medical procedures is a potential carcinogen affecting millions of people worldwide. A major concern is that total exposure to ionizing radiation in the USA has nearly doubled over the past 20 years, based on a recent report by the National Council on Radiation Protection and Measurements (NCRP). Unfortunately, radiation exposure from imaging will continue to rise rapidly for several reasons. First, advances in imaging technology have enabled physicians to evaluate both anatomy and function using x-ray and nuclear medicine-based techniques, both of which are significant sources of radiation. Second, physicians place greater reliance on imaging tests for patient management. Finally, patients are demanding more testing for reassurance of accurate diagnosis and treatment.
Imaging procedures such as computed tomography (CT), single-photon emission computed tomography (SPECT) and positron emission tomography (PET) are major sources of ionizing radiation. In a study of 952,420 non-elderly adults, approximately 75% of the cumulative effective dose was accounted for by CT and nuclear imaging procedures (i.e., SPECT and PET). The annual mean (± standard deviation) effective dose from imaging procedures was 2.4 ± 6.0 millisieverts (mSv) per subject. By comparison, an average US citizen receives approximately 3.6 mSv of background radiation annually.
Similarly, there has been a dramatic rise in the number of cardiac imaging tests ordered in recent years. For example, in 1990 fewer than 3 million nuclear medicine studies were performed in the USA, compared with 9.9 million in 2002. Between 2002 and 2003, the number of cardiac CT scans doubled. In addition, the number of cardiac catheterizations increased from 2.45 million in 1993 to 3.85 million in 2002. Each cardiac imaging test that uses x-rays or radioactive agents can increase exposure to radiation. Estimated exposures range from approximately 10–20 mSv per procedure, depending on the type of imaging test, and multiple tests can result in cumulative exposures of more than 100 mSv. A recent study showed that among patients who underwent more than one cardiac imaging procedure, the mean cumulative effective dose over 3 years was 16.4 mSv (range: 1.5–189.5 mSv). Of note, 3.3% of patients received more than 20 mSv per year, which is the maximal annual occupational dose limit for radiation workers recommended by the International Commission on Radiological Protection (ICRP).
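As a rough illustration of how a patient's cumulative exposure could be tallied against the 20 mSv/year occupational reference cited above, the sketch below sums per-procedure doses; the procedure names and dose values are illustrative mid-range figures, not measurements from the cited studies.

```python
# Sketch: tracking cumulative effective dose against the ICRP
# occupational reference of 20 mSv/year. All per-procedure doses
# below are illustrative, not taken from any particular study.
ANNUAL_LIMIT_MSV = 20.0

# Hypothetical one-year imaging history for a single patient.
procedures = [
    ("SPECT myocardial perfusion", 12.0),
    ("coronary CT angiography", 10.0),
    ("cardiac catheterization", 7.0),
]

cumulative_msv = sum(dose for _, dose in procedures)
print(f"Cumulative effective dose: {cumulative_msv:.1f} mSv")
if cumulative_msv > ANNUAL_LIMIT_MSV:
    print("Exceeds the 20 mSv/year occupational reference level")
```

A registry of this kind, however simple, makes it straightforward to identify the small fraction of patients (3.3% in the study above) who exceed the occupational reference.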
Not only are the radiation dose per procedure and the cumulative radiation exposure a concern; the rate of exposure must also be considered in estimating cancer risk. As with the effects of dose on cancer risk, estimates of cancer risk from low and moderate dose-rate exposures are based on risk coefficients derived from atomic bomb survivors, who received high dose-rate exposure. These risk coefficients are combined with a dose and dose-rate effectiveness factor, which is deduced from experiments with laboratory animals and from radiobiological measurements [8,9]. The Biological Effects of Ionizing Radiation (BEIR) VII Committee of the US National Research Council divides the corresponding risk value for atomic bomb survivors by a dose and dose-rate effectiveness factor of 2.0 to estimate risk from low dose-rate exposure. However, this estimate of risk may not be accurate. For example, in a recent meta-analysis of 12 epidemiological studies, the cancer risk from occupational low and moderate dose-rate exposure was not lower than that in atomic bomb survivors with high dose-rate exposure.
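The BEIR VII adjustment described above amounts to a single division by the dose and dose-rate effectiveness factor (DDREF); a minimal sketch, with the high dose-rate risk coefficient chosen purely for illustration:

```python
# BEIR VII-style adjustment: risk at low dose rate is estimated by
# dividing the risk coefficient derived from high dose-rate (atomic
# bomb survivor) data by a DDREF of 2.0. The example coefficient
# (0.10 excess risk per Sv) is illustrative, not a published value.
DDREF = 2.0

def low_dose_rate_risk(high_dose_rate_risk: float, ddref: float = DDREF) -> float:
    """Scale a high dose-rate risk estimate down by the DDREF."""
    return high_dose_rate_risk / ddref

print(low_dose_rate_risk(0.10))  # halves the illustrative coefficient
```

The meta-analysis cited above questions exactly this step: if low dose-rate risk is not lower than high dose-rate risk, a DDREF of 2.0 would understate risk by a factor of two.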
Despite the growing concern of the public and federal regulators, it remains unclear whether low-dose radiation causes an increased risk of cancer. By contrast, it is generally accepted that exposure to high-dose radiation increases the risk of solid cancers and leukemias, based on evidence from epidemiological studies of atomic bomb survivors and radiation workers [11–14]. These data suggest that the risk of cancer from high-dose radiation is proportional to the dose, following the linear no-threshold (LNT) model.
Currently, the LNT model is used to extrapolate risk from low-dose radiation, an approach that is endorsed by the BEIR report of the US National Academy of Sciences and by the ICRP. Under this model, even the lowest dose of radiation poses an increased risk that is proportional to the dose, and there is no safe exposure level. Epidemiologic studies of atomic bomb survivors have shown an increased cancer risk even in those exposed to low-dose radiation (5–100 mSv) [11–13]. Studies in radiation workers also support this premise. An international study of over 400,000 radiation workers, with an average dose of approximately 20 mSv and cumulative doses of less than 150 mSv, showed increased cancer mortality. Consistent with these findings, a second study found that radiation workers followed in a national registry had increased cancer mortality associated with low-dose radiation.
Recent studies have also applied the LNT model to estimate cancer risk from low-dose radiation from imaging tests [16–19]. Using the LNT model, the lifetime attributable risk (LAR) of cancer is calculated based on age, sex and radiation dose, and is then scaled proportionally by the dose of radiation to obtain the excess cancer risk from a given low-dose exposure. For example, the LAR of lung cancer after a 100-mSv exposure is 240/100,000 in a 40-year-old woman. Thus, a single exposure of 74 mSv to the lungs during CT angiography gives her a 0.178% LAR of lung cancer (calculated as 74 mSv/100 mSv multiplied by 240/100,000, multiplied by 100 for the percentage risk). This means that approximately 1 in 562 women who undergo CT angiography at 40 years of age would be expected to develop lung cancer attributable to the exposure, which is not negligible.
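The worked example above can be reproduced directly. The reference value (LAR of 240 per 100,000 at 100 mSv for lung cancer in a 40-year-old woman) is taken from the text; the linear scaling with dose is the LNT assumption itself:

```python
# LNT scaling of lifetime attributable risk (LAR).
# Reference value from the text: LAR of lung cancer after a 100-mSv
# exposure is 240 per 100,000 for a 40-year-old woman.
LAR_REF = 240 / 100_000     # risk at the reference dose
DOSE_REF_MSV = 100.0

def scaled_lar(dose_msv: float) -> float:
    """Under LNT, LAR scales linearly with dose."""
    return LAR_REF * dose_msv / DOSE_REF_MSV

risk = scaled_lar(74.0)                           # 74-mSv CT angiography
print(f"{risk * 100:.3f}% excess lifetime risk")  # prints 0.178%
print(f"about 1 in {round(1 / risk)}")
```

The "1 in 562" figure quoted in the text follows from rounding the percentage to 0.178% before inverting; inverting the unrounded risk gives about 1 in 563, an immaterial difference.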
Epidemiologic studies, however, may not adequately control for other cancer risk factors such as smoking, ultraviolet radiation exposure and genetic susceptibility. Furthermore, it may be difficult to discern a small excess in cancer incidence or mortality above the natural incidence of cancer in the population. Studies that extrapolate cancer risk from the LNT model assume that it accurately estimates cancer risk at low doses [16–19]. Finally, other data suggest that the biological effects of low-dose radiation are more complex than the predictions of the LNT model. Some studies indicate that the LNT model may be overprotective, causing unnecessary concern, while others imply it may be underprotective, prompting the need for more stringent regulation. Alternative models include the threshold, hormesis (adaptive response) and hypersensitivity models, as shown in Figure 1.
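The four competing dose-response shapes (LNT plus the three alternatives of Figure 1) can be sketched schematically. The functional forms and all parameter values below are purely illustrative, chosen only to reproduce the qualitative shapes described in the following sections, not fitted to any data:

```python
# Schematic excess-risk curves for the four dose-response models.
# Slopes (k) and the low-dose breakpoint (d_c / d_t) are illustrative.
def lnt(dose, k=1.0):
    """Linear no-threshold: excess risk proportional to dose at all doses."""
    return k * dose

def threshold(dose, k=1.0, d_t=0.1):
    """No excess risk below a threshold dose d_t; linear above it."""
    return 0.0 if dose <= d_t else k * (dose - d_t)

def hormesis(dose, k=1.0, d_c=0.1):
    """Protective (negative excess risk) at low doses, linear above d_c."""
    return k * (dose - d_c) if dose > d_c else -k * (d_c - dose)

def hypersensitivity(dose, k=1.0, k_low=3.0, d_c=0.1):
    """Steeper-than-LNT response at low doses (e.g., bystander effects)."""
    return k_low * dose if dose <= d_c else k_low * d_c + k * (dose - d_c)

for d in (0.05, 0.5):  # doses in arbitrary units
    print(d, lnt(d), threshold(d), hormesis(d), hypersensitivity(d))
```

At a low dose (0.05 units) the models disagree qualitatively, zero, negative, or amplified excess risk, while at higher doses all four converge toward linearity, which is why high-dose epidemiology cannot distinguish them.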
Unlike the LNT model, the threshold model suggests that there may be a level of radiation exposure below which there is no harm. This is supported by analyses of studies in atomic bomb survivors, which cannot exclude a threshold effect at 60 mSv, although data at higher doses suggest linearity. In addition, mice lacking an enzyme that excises single base mutations, a common injury from low-dose radiation exposure, display only a moderate increase in spontaneous mutation rate and are not prone to malignancies or other pathological findings. Finally, in vitro studies show that exposing cells to radiation doses of less than 100 milligray (mGy; equivalent to mSv for γ- and x-rays) resulted in a base-mutation rate of only approximately 10%. These findings suggest that alternative repair systems may minimize the effects of low-dose radiation, and that below a threshold of radiation exposure there may be no measurable increase in health risk [20,23].
An alternative model, termed hormesis (also referred to as adaptive response), postulates that low-dose radiation may actually be protective [20,24]. In this model, exposure to low-dose radiation conditions DNA repair systems so that they can respond more effectively to a second, higher dose of radiation. In vitro and in vivo studies using various indicators of cellular damage (i.e., cell lethality, chromosomal aberrations, mutation induction, radiosensitivity and DNA repair) have demonstrated reduced damage after priming with low-dose radiation. For example, in a previous study, mice that received a priming dose of 10 mGy had fewer recombination events after exposure to a 1-Gy challenge than those without prior exposure. Furthermore, long-term exposure to low-dose radiation in mice appears to prevent the development of lymphoma after high-dose exposure. The effects of hormesis, however, show a high degree of variation, which may depend on factors such as dose rate, the time lapse between doses, genetic variation and experimental conditions.
In contrast to the threshold and adaptive response models, the hypersensitivity model suggests that the LNT model may underestimate rather than overestimate radiation risk. In the hypersensitivity model, radiation induces DNA damage and repair responses in nonirradiated cells in addition to irradiated cells, thereby increasing the number of damaged cells. Thus, this model predicts that the degree of harm from radiation exceeds that predicted by the LNT model. Previous studies have shown an increase in the frequency of mutations, apoptosis, DNA damage and DNA repair induction in vitro after co-culture of nonirradiated and irradiated cells, and after transfer of medium from irradiated cells to nonirradiated cells [28,29]. A recent study also demonstrated that human fibroblasts irradiated with doses as low as 1.2 mGy displayed a stronger than expected induction of genes involved in DNA repair. Although the mechanisms underlying the hypersensitivity model are still uncertain, data support a role for cell–cell signaling, possibly mediated by macrophages. In one study, chromosomal instability was induced in nonirradiated hematopoietic cells after transfer of macrophages from mice that received 4 Gy of total body radiation.
The conflicting data supporting each of the four models underscore the need for a better understanding of the interactions between radiation damage at low doses, DNA repair response pathways and the development of cancer. It is well known that ionizing radiation causes DNA damage through base modification and strand breaks. In particular, exposure to radiation leads to DNA double-strand breaks (DSBs), the most serious and potentially lethal type of cellular damage, which can result in carcinogenesis. A recent in vitro study using x-rays showed that the number of DSBs in cultured cells is linear with dose from 1 mGy to 1 Gy. Similar findings have been observed in vivo in mice and humans exposed to x-ray doses of less than 100 mSv [33–35]. In addition, even low-dose radiation (i.e., 50 mSv) can cause loss of heterozygosity and telomere impairment, resulting in chromosomal damage that can lead to cancer [36,37].
To protect against the development of mutations, DNA damage is detected by sensors, the signal is transmitted by transducers, and the response is controlled by various effector pathways, resulting in possible outcomes that include DNA repair, cell cycle arrest and apoptosis.
Failure of the DNA damage response pathway to detect and repair mutations can lead to the accumulation of genetic damage and the development of cancer. For example, ataxia telangiectasia mutated (ATM) kinase is crucial in signaling DNA damage. Phosphorylation of ATM activates ATM-dependent signaling that is critical for the phosphorylation of H2AX, p53 and checkpoint kinases, which are all involved in DNA repair, cell cycle arrest and chromatin remodeling. Mutations in ATM result in increased radiosensitivity and cancer susceptibility in affected patients. Similarly, patients with mutations in Artemis, an endonuclease required for the repair of DSBs, have severe immunodeficiency and an increased predisposition to developing lymphomas.
An important question to consider is whether a threshold dose of radiation is required to activate DNA damage response pathways. Initial evidence from an in vitro study suggested that ATM-dependent effector pathways, including p53, Chk1 and Chk2, were not threshold dependent, but only radiation doses higher than 200 mGy were evaluated. More recently, inefficient and even absent DNA repair, measured by phosphorylation of H2AX (an ATM-mediated pathway), was noted at very low doses of radiation (<5 mGy). Furthermore, activation of the cell cycle checkpoints (specifically, the G2/M checkpoint) that monitor chromosomal integrity before progression to replication and mitosis requires the presence of at least 10–20 DSBs, which entails exposure to at least 200 mGy of radiation. Failure of checkpoint activation increases radiosensitivity and thus cancer susceptibility. Apoptosis, however, appears to be activated even at doses as low as 2 mGy. By contrast, a recent study in patients undergoing CT suggests that complete DNA repair occurs in vivo even at very low doses (~5 mSv), as measured by phosphorylation of H2AX. Further in vivo studies are needed to evaluate whether a threshold for the activation of DNA repair exists.
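The checkpoint argument above implies a simple back-of-the-envelope calculation: if G2/M checkpoint activation requires roughly 10–20 DSBs and that corresponds to about 200 mGy, the implied yield is on the order of 0.05–0.1 DSBs per mGy per cell. The sketch below uses only these figures from the text, together with the linearity of DSB induction reported in the in vitro data, and is an illustrative derivation rather than a measured yield:

```python
# Back-of-the-envelope DSB arithmetic using only figures quoted in
# the text: 10-20 DSBs per cell at ~200 mGy, and linear induction.
DSBS_AT_200_MGY = (10, 20)   # range quoted for checkpoint activation

# Implied DSB yields per mGy per cell (illustrative derivation).
yields = [n / 200.0 for n in DSBS_AT_200_MGY]   # 0.05 and 0.1 DSB/mGy

def expected_dsbs(dose_mgy: float, yield_per_mgy: float) -> float:
    """Assume DSB induction is linear with dose, as the in vitro data suggest."""
    return dose_mgy * yield_per_mgy

# A 5-mGy exposure would then produce well under one DSB per cell,
# consistent with checkpoints not being engaged at very low doses.
for y in yields:
    print(f"5 mGy -> {expected_dsbs(5, y):.2f} DSBs per cell")
```

On this arithmetic, typical diagnostic doses fall far below the 10–20 DSB checkpoint requirement, which is one way to rationalize the inefficient repair observed below 5 mGy.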
It will also be essential to compare the efficiency of the DNA damage response activated at low versus high doses. In vitro studies have shown that dose and dose rate affect radiation-induced gene-expression profiles [48–51]. One study found that of 208 genes whose expression changed in lymphocytes after exposure to 100, 250 or 500 mGy, only 34 were modulated in common across all three doses. Changes in gene expression have been noted at doses as low as 20 mGy [49,51]. Finally, gene expression is not only dose dependent but may also be dose-rate dependent: one study found that the induction of one group of genes was dose-rate dependent while that of another set was dose-rate independent. These findings suggest that the response to damage may vary at different doses and different delivery rates. Whether the DNA damage response is more or less effective at low versus high doses thus remains unclear and warrants further investigation.
Although not currently regulated, the increasing exposure to low-dose radiation (<100 mSv) associated with diagnostic testing has been met with growing concern among medical professionals and patients. Exposure to low-dose radiation may not cause immediate harm to patients but may have long-term biological effects, which has prompted the US FDA to announce a three-point initiative to ensure radiation protection for patients. The program's objectives, discussed in turn below, are to promote the safe use of medical imaging devices, to support informed clinical decision making, and to increase patients’ awareness of their own exposure.
In addition to the FDA's initiative, the US Department of Energy has also started the Low-Dose Radiation Research Program to promote investigation into this important area.
The first objective involves promoting the safe use of medical imaging devices. The FDA intends to issue requirements for manufacturers to include safeguards in their machines that minimize radiation risk and to provide appropriate training to support safe use by practitioners. The FDA also encourages providers to develop diagnostic radiation dose reference levels and registries for radiation dose. Another strategy to minimize dose is to modify existing scanning protocols, which has been achieved in several studies using coronary computed tomographic angiography (CTA). In a recent prospective, controlled, nonrandomized study, a best-practice protocol for coronary CTA, combining a minimized scan range, heart rate reduction, ECG-gated tube current modulation and reduced tube voltage, decreased the estimated median radiation dose in the follow-up period by 53.3% without compromising image quality. Importantly, a second study found no difference in diagnostic accuracy compared with invasive angiography after using a standardized radiation-reduction coronary CTA protocol. A radiation dose reduction of 16% was also achieved by using the calcium scoring images instead of the scout view to plan acquisition. Another study found that using minimal padding (i.e., the x-ray beam-on time surrounding the acquisition window) reduced radiation exposure without compromising image quality; increased padding was associated with greater radiation dose (45% increase per 100-ms increase in padding; p < 0.001). In addition, a recent study showed that tube current adaptation based on anterior–posterior diameter, rather than stepwise adaptation based on BMI, improved radiation dose optimization in patients with diverse body habitus and significantly improved image quality. This is a simple and practical way to maintain constant image quality, irrespective of body habitus.
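When several dose-reduction strategies are applied together, their effects combine multiplicatively rather than additively. The sketch below uses the 53.3% (best-practice protocol) and 16% (calcium-scoring-based planning) figures quoted above; treating the two effects as independent is an assumption made purely for illustration:

```python
# Combining dose-reduction strategies multiplicatively.
# Percentages are from the studies cited in the text; their
# independence is an assumption for this illustration.
reductions = [0.533, 0.16]

remaining = 1.0
for r in reductions:
    remaining *= (1.0 - r)   # each strategy scales the residual dose

print(f"Residual dose fraction: {remaining:.3f}")
print(f"Combined reduction: {1 - remaining:.1%}")   # ~60.8%, not 69.3%
```

The point of the multiplicative model is that stacked strategies yield less than the sum of their individual percentages: 53.3% plus 16% gives roughly a 60.8% combined reduction, not 69.3%.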
The addition of a 320-multidetector row CT may further reduce radiation dose, although further study is needed.
The second objective advocates informed clinical decision making, which can be achieved by notifying clinicians of the dose administered to patients at the time of acquisition and by encouraging provider adherence to published guidelines. The FDA has proposed that all manufacturers provide devices that display, record and transmit the dose to the patient's electronic record, with an alarm to alert providers when the optimal dose is exceeded. The NIH has already mandated that manufacturers of scanners used at its clinics provide software to track a patient's radiation dose and log it into the medical record, which will support informed clinical decision making.
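The alarm mechanism described above amounts to comparing each administered dose with a reference level at acquisition time. A minimal sketch follows; the 15-mSv reference level and the procedure names are hypothetical, chosen only for illustration:

```python
# Sketch of a dose alert: compare the administered dose with a
# diagnostic reference level (DRL). The 15-mSv DRL is hypothetical.
DRL_MSV = 15.0

def check_dose(procedure: str, dose_msv: float, drl_msv: float = DRL_MSV) -> str:
    """Return a log line, flagging doses above the reference level."""
    status = "ALERT: above reference level" if dose_msv > drl_msv else "ok"
    return f"{procedure}: {dose_msv:.1f} mSv ({status})"

print(check_dose("coronary CTA", 18.2))
print(check_dose("chest CT", 7.0))
```

In practice the reference level would come from the provider-developed dose registries mentioned above, varying by procedure and patient size rather than being a single fixed number.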
Cardiac imaging studies should also be ordered only after evaluating the risks and benefits to the patient, as defined by the ‘appropriateness criteria’ [59,60]. Unfortunately, recent studies have shown that providers do not always follow these guidelines. For example, one recent study found that 12% of nuclear SPECT tests were inappropriate. Similarly, another study found that 46% of CT studies were ordered for indications not established by these criteria. Encouragingly, a recent study showed that institution of these criteria has had a positive effect: the proportion of appropriate CT examinations increased from 69.5 to 78.5%, whereas the proportion of inappropriate examinations decreased from 11.5 to 4.6%. Of note, cardiologists were more likely than noncardiologists to order appropriate CT examinations.
Furthermore, the increasing use of radiation-based imaging tests for screening asymptomatic individuals should be discouraged. The risks often outweigh the benefits in these patients, who are typically younger, have a low likelihood of disease and would require testing at regular intervals. In addition, the utility of these tests remains contentious. For example, the efficacy of lung cancer screening by CT and its effect on mortality are debatable, although results from a recent NIH-funded study showed a mortality benefit in smokers screened by spiral CT compared with chest x-rays. For atherosclerosis screening, coronary CT is useful only in those patients whose management would change (i.e., more aggressive treatment) if significant coronary calcium were found.
The third objective is to increase patients’ awareness of their own exposure. Patients should be encouraged to keep a medical imaging history card that allows them to track their own medical imaging history and share it with their providers. The FDA is also collaborating with the joint task force of the American College of Radiology and the Radiological Society of North America to develop and disseminate a patient medical imaging record card, which will be available on their website.
Epidemiological and experimental data suggest that the relationship between dose and cancer risk may not be adequately explained by the LNT model. It is likely that cellular and tissue responses to radiation, including damage and subsequent repair, are also modulated by specific trigger thresholds, hypersensitivity and hormesis. Furthermore, individual genetic susceptibility is an important factor that is not incorporated into current models of radiation-related cancer risk. Overall, at present there is insufficient evidence to jettison the LNT model, which remains in use because it follows the precautionary principle. For the time being, the best and safest recommendation is to use caution when ordering any imaging test and to be cognizant of the risks and benefits to the patient. Radiation exposure, however, is an unavoidable risk of imaging and should not be considered in isolation when ordering imaging procedures. Both providers and patients should understand the reasons for the test and its potential risks and benefits.
Without changes in federal regulation and/or the medical reimbursement system, radiation exposure from diagnostic testing will continue to increase over the next decade. Unfortunately, our current understanding of the carcinogenic risk from low-dose radiation is limited by conflicting data from epidemiological and in vitro studies. These inconsistencies may diminish if future studies detail such factors as the radiation source, dose, dose rate, dose frequency, tissue type/cells irradiated and time of analysis postirradiation, so that valid comparisons of data can be performed. Furthermore, a better understanding of DNA repair response pathways, and of the difficulties in detecting mutations, will enable a more accurate estimate of radiation risk. Improvements in the sensitivity and specificity of molecular and cellular techniques will facilitate the identification of biomarkers of radiation injury that may indicate increased carcinogenic risk. Although estimating the health risk from low-dose radiation remains controversial, the most prudent recommendation is to minimize exposure to all sources of radiation, especially unnecessary medical imaging tests. In general, decisions regarding the use of imaging tests should be based on an individualized risk–benefit analysis for each patient. Radiation exposure should not be considered a contraindication for testing if the procedure is clinically indicated and appropriate for the management of the patient.
This work was supported by grants from the ACC-GE Healthcare Career Development Award (PKN), and NIH grants EB009689 and HL093172 (Joseph C Wu). The authors have no other relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript apart from those disclosed.
No writing assistance was utilized in the production of this manuscript.