Current guidelines for limiting human exposure to ionizing radiation are based on the linear-no-threshold (LNT) hypothesis for radiation carcinogenesis, under which cancer risk increases linearly as the radiation dose increases. Under the LNT model even a very small dose could cause cancer, and the model is used in establishing guidelines for limiting radiation exposure of humans. A slope change at low doses and dose rates is implemented using an empirical dose and dose rate effectiveness factor (DDREF). This imposes a usually unacknowledged nonlinearity, but not a threshold, in the dose-response curve for cancer induction. In contrast, under the hormetic model, low doses of radiation reduce cancer incidence while high doses elevate it. Based on a review of epidemiological and other data for exposure to low radiation doses and dose rates, it was found that the LNT model fails badly. Cancer risk after ordinarily encountered radiation exposure (medical X-rays, natural background radiation, etc.) is much lower than projections based on the LNT model and is often less than the risk of spontaneous cancer (a hormetic response). Understanding the mechanistic basis for hormetic responses will provide new insights about both the risks and the benefits of low-dose radiation exposure.
Since Roentgen’s discovery of X-rays in 1895 and Becquerel’s discovery of radioactivity in 1896, radiation protection standards have changed repeatedly. At the end of the 19th century and early in the 20th century, it was generally believed that radiation exposure had a number of health benefits, and radiation therapy for cancer was among its earliest applications. The first recorded X-ray treatment of a cancer patient was performed in 1896. More recently, the use of radionuclides to treat both malignant and nonmalignant tumors has become established (Prekeges 2003). “Mild radium therapy,” involving the oral or parenteral administration of microgram quantities of radium and its daughter products, was once commonly used (Macklis 1990; Wolff 1992). Over 400,000 bottles of “Radithor” (a popular and expensive mixture of radium-226 and radium-228 in distilled water) were sold between 1925 and 1930. The death in 1932 of the prominent millionaire Eben M. Byers from radium poisoning, after consuming a large quantity of Radithor, helped bring about an end to the era of “mild radium therapy.” The event alerted the public and much of the medical profession to the potential harmful effects of internal radium (Macklis 1990).
During the early decades of the 20th century a consensus emerged that the most fundamental dose-response relationship was of the threshold type. Under the threshold model there is no harm below a certain “threshold” dose. The threshold model was previously used in establishing limits for radiation exposure of humans (Calabrese 2009a). In the 1920s Hermann J. Muller, in his pioneering experiments on the fruit fly Drosophila, found that ‘treatment of the sperm with relatively heavy doses of X-rays induces the occurrence of true “gene mutations” in a high proportion of the treated germ cells’ (Muller 1927). Muller later, surprisingly and incorrectly, concluded that mutation frequency “is exactly proportional to the energy of the dosage absorbed” (Muller 1930), despite the limited data and the lack of consistent findings in the low-dose range (for a detailed critical review, see Calabrese 2009b). Despite the uncertainty regarding Muller’s data, this was one of the factors that drove the early use of the linear-no-threshold (LNT) model, according to which any dose of radiation can cause detrimental health effects such as cancer or hereditary (genetic) effects. Until 1957 the concept of a maximum permissible dose (MPD) for occupational radiation exposures was used, implying a threshold for carcinogenicity. NCRP report No. 17 (1954) defined the MPD as “that radiation dose which should not be exceeded without careful consideration of the reasons for doing so.” After health studies of the Japanese atomic-bomb survivors indicated that ionizing radiation causes ill effects, the LNT model of radiation carcinogenesis was accepted in 1958 as dogma in radiation protection, though not without criticism (Brues 1958). The LNT model of radiation carcinogenesis is based on the following assumptions: every dose, no matter how low, carries with it some cancer risk; risk per unit dose is constant and independent of dose rate; and risk is additive and increases as dose increases.
This model reflects the simplified assumption that each radioactive particle or photon can cause a DNA mutation, which, in turn, can lead to a cancer.
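The LNT assumptions listed above (linearity, no threshold, additivity) can be written down in a minimal numerical sketch. The risk coefficient used here is an illustrative assumption, not a value taken from this review:

```python
# Minimal sketch of the LNT risk model: excess cancer risk is strictly
# proportional to dose, with no threshold and no dose-rate dependence.
# RISK_PER_SV is an assumed illustrative value, not an endorsed coefficient.

RISK_PER_SV = 0.05  # assumed risk of radiation-induced cancer per sievert

def lnt_excess_risk(dose_sv: float) -> float:
    """Excess cancer risk under the LNT model: linear, no threshold."""
    return RISK_PER_SV * dose_sv

# Additivity: the risk from two exposures equals the risk from their summed dose.
assert abs(lnt_excess_risk(0.01) + lnt_excess_risk(0.02)
           - lnt_excess_risk(0.03)) < 1e-12

# No threshold: even 1 mSv yields a nonzero predicted risk.
assert lnt_excess_risk(0.001) > 0
```

Under these assumptions the model has no mechanism for a protective (negative) response at any dose, which is exactly the property the hormetic model disputes.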
The LNT model originated as a prudent operational guideline; however, it later acquired the aura of a valid scientific theory without sufficient evidence (Strzelczyk et al. 2007; Cohen 2008; Jaworowski 2008; Tubiana et al. 2009). It neglects the fact that life evolved on Earth under natural background radiation levels that 3.5 billion years ago were about three times higher than they are now (Jaworowski 1997; Karam and Leslie 1999). All living beings have therefore survived via evolutionarily derived adaptive responses to that harsher natural radiation environment (Draganić et al. 1990; Jaworowski 2008).
Although background radiation levels are much lower today, these adaptive-response phenotypes can probably be evoked by ionizing radiation doses and dose rates somewhat higher than current natural levels. The debate on the scientific validity of the LNT hypothesis has intensified dramatically in recent years. The validity of this hypothesis was supported by the discovery that a proto-oncogene can be converted into an active oncogene by a point mutation. When the LNT model was introduced, the proportionality between dose and DNA damage was already known, but the existence of cellular defense mechanisms was largely unknown at that time. There was already some criticism when the LNT model was first introduced; see, for example, Brues (1958), cited above. After the discovery of these defense mechanisms, however, the issue became even more controversial.
There is a growing body of experimental and epidemiological evidence that does not support the use of the LNT model for estimating cancer risks at low doses of low linear-energy-transfer radiation (Cohen 1995; Azzam et al. 1996; Redpath et al. 2001; Boreham et al. 2006; Sakai et al. 2006; Day et al. 2007; Mitchel 2007a,b; Portess et al. 2007; Sanders and Scott 2008; Tubiana et al. 2006, 2009). Instead, the results support the existence of hormetic-type dose-response relationships, with low doses and dose rates being protective and high doses causing harm (Calabrese 2008, 2009b; Pollycove and Feinendegen 2008; Scott 2007, 2008). The literature showing health benefits from increased exposure to ionising radiation comprises more than 3000 reports (Luckey 2008a). The average level of natural radiation for the Earth is 2–3 mSv/y (UNSCEAR 2000, Annex B). According to T.D. Luckey, ‘we would have abundant health for any increased level up to the threshold, almost 3000 times the ambient level’, with dose rates from 3 mSv/y to 8 Sv/y (Luckey 2008a, figure 3).
The radiation hormesis model, unlike the LNT model, assumes that adaptive/protective mechanisms can be stimulated by low-dose radiation and that they can prevent both spontaneous and toxicant-related cancers as well as other adverse health effects (Calabrese et al. 2007). Such stimulated adaptive protection can thereby improve health (Prekeges 2003). The LNT model assumes, for example, a constant carcinogenic effect per unit dose irrespective of dose and dose rate, implying that the ability of the body to compensate for damage is constant. However, a number of experimental studies have provided strong evidence that this is not correct, as DNA repair is stimulated after low-dose irradiation and inhibited after high-dose irradiation (Cohen 1994). Such differences in response as a function of dose can also be associated with an elevated level of apoptosis after low-dose radiation exposures (Krueger et al. 2007). Rothkamm and Lobrich (2003) found that the level of DNA double-strand breaks (DSBs) in human fibroblast cultures exposed to very low radiation doses (≈1 mGy) decreases to that of unirradiated cell cultures if the cells are allowed to proliferate after irradiation. The authors presented evidence that this effect may be caused by the elimination of cells carrying unrepaired DSBs. In addition, low radiation doses can stimulate immunity against cancer (Cheda et al. 2004, 2009; Nowosielska et al. 2006a,b; Hayase et al. 2008).
It has been repeatedly shown that low doses and dose rates of low-linear-energy-transfer (LET) radiation activate a system of cooperative protective processes in the body. They have been found to stimulate intracellular and intercellular signaling pathways, leading to activated natural protection against cancer and other genomic-instability-associated diseases. The key components of the radiation-induced hormetic response are elimination of preneoplastic and other aberrant cells by apoptosis, induction of DNA-repair pathways, activation of immune functions, production of stress proteins, scavenging of free radicals, activation of membrane receptors, secretion of cytokines and growth factors, and compensatory cell proliferation (for reviews see Luckey 2003; Prekeges 2003; Schollnberger et al. 2004; Feinendegen 2005; Scott and Di Palma 2007; Feinendegen et al. 2007; Liu 2007; Scott et al. 2007, 2009). Low-dose irradiation (gamma or X-rays) has been proposed to activate widespread protective epigenetic signaling among normal cells (up-regulation of adaptive-response genes) that facilitates the management of future genetic hazards (Scott et al. 2009).
The cell defence strategy depends on the dose, the dose rate and the amount of damage in neighbouring cells (Tubiana 2008). The overall effect results from the confrontation between the initial DNA damage (which increases linearly with the dose) and the cell defense mechanisms, which are more effective at low doses and less effective as the dose increases. The system response to low-level radiation exposure can evolve both from damage at the basic molecular level and from adaptive responses that may occur at the whole-body level (Feinendegen et al. 2007). When the protection operates, the dose-carcinogenic effect relationship is not linear, and a threshold or hormetic effect can be observed (Calabrese 2004). The balance between damage and protection depends on tissue dose; at single doses below 0.1 Gy, net benefit tends to outweigh detriment (Feinendegen et al. 2007). Based on research observations and mathematical modeling, radiation doses above an individual-specific stochastic threshold have been interpreted to protect against spontaneous neoplastic transformation both in vitro and in vivo, and to suppress spontaneous cancers in humans (Redpath 2005; Sakai et al. 2003; Scott 2007; Scott and Di Palma 2007).
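The confrontation described above, linearly increasing damage versus protection that dominates at low doses and fades at high doses, can be caricatured in a toy model. All coefficients here are illustrative assumptions chosen only so that the crossover falls near the 0.1 Gy figure cited in the text; they are not fitted to any data:

```python
import math

# Toy hormetic dose-response: net effect = linear damage minus a
# protection term that is strong at low doses and decays at high doses.
# All three coefficients are illustrative assumptions, not fitted values.

DAMAGE_SLOPE = 1.0      # assumed damage per Gy (arbitrary units)
PROTECTION_GAIN = 3.0   # assumed strength of the induced protection
DECAY_DOSE_GY = 0.1     # assumed dose scale over which protection fades

def net_effect(dose_gy: float) -> float:
    """Negative values = net benefit (hormesis); positive = net harm."""
    damage = DAMAGE_SLOPE * dose_gy
    protection = PROTECTION_GAIN * dose_gy * math.exp(-dose_gy / DECAY_DOSE_GY)
    return damage - protection

# Below the crossover dose (here 0.1 * ln(3) ≈ 0.11 Gy) the model predicts
# net benefit; well above it, the damage term dominates and the response
# becomes essentially linear.
assert net_effect(0.05) < 0   # low dose: protection outweighs damage
assert net_effect(0.5) > 0    # high dose: damage dominates
```

This kind of J-shaped curve cannot be produced by any rescaling of a linear model, which is why the hormetic and LNT frameworks lead to qualitatively different low-dose predictions.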
The LNT hypothesis is testable in epidemiological studies; however, until recently, such tests have been hampered by the lack of sensitive methods to detect biological responses to low doses and, more importantly, by faulty study methods (Scott 2008; Scott et al. 2008). One faulty method is discarding some of the radiation dose (called dose lagging), which can abolish thresholds. Another is averaging doses over wide dose intervals and including irradiated persons in the unirradiated group, which can abolish hormetic responses. A number of epidemiological studies provide direct evidence of the effects of radiation exposure on human populations. However, there are problems in interpreting the evidence that they provide in the context of radiation protection in general. One of the most important of these problems arises because many of the studies involved populations which received relatively high radiation exposures in a relatively short time. At high doses and dose rates, there is evidence that the effects of radiation exposure are proportionately greater than at the low doses and dose rates which are usually more relevant to occupational, medical and public exposures (Wall et al. 2006).
Under the LNT risk assessment paradigm, the cancer frequency at low doses and dose rates is reduced by a subjective dose and dose rate effectiveness factor (DDREF). Thus, the slope of the dose-response curve observed at high doses and dose rates is reduced at low doses, or after low-dose-rate exposure, by the DDREF, leading to a different LNT function for use in these cases. The DDREF approach therefore effectively prevents a negative slope of the dose-response curve, since any positive slope (determined by high-dose, high-dose-rate data) divided by a fixed positive number yields another positive, nonzero slope. The DDREF is also applied to fractionated exposure, yielding a positive slope for the resulting LNT function (Hall and Giaccia 2006).
Application of the LNT model to data from animal studies has led to the conclusion that the DDREF is in the range of 2–10 (UNSCEAR 2000, Annex G); however, animal data showing a large threshold for cancer induction by protracted exposure to low-LET radiation (e.g., Tanooka 2000; Yamamoto et al. 1998, 2000; Brooks et al. 2007; Mitchel 2009) were apparently not taken into consideration in such evaluations. The summary reports by scientific committees (NRPB 1995; UNSCEAR 2000, Annex G; NCRP 2001; EGIS 2007) all suggest a DDREF of about 2 as a reasonable “judgment” to be used in evaluating cancer risks “for radiation protection purposes.” Consequently, the LNT-model-based quantitative estimates of the risks per unit dose derived from epidemiological studies at high doses and high dose rates are halved in order to estimate probabilities of radiation-induced cancer following diagnostic medical exposures (Wall et al. 2006). However, there is mounting evidence that challenges the concept of dose as an indicator of risk for low-dose and low-dose-rate irradiation, and suggests that the uniform application of a fixed DDREF for doses below a specified value may be inappropriate (Azzam et al. 1996; Redpath et al. 2001; Mitchel 2007a,b; Cohen 2008; Jaworowski 2008; Scott 2008; Tubiana et al. 2009).
With application of the DDREF to high dose-rate data, there is an imposed discontinuity (steep drop from the high-dose, high-dose-rate line to the lower “low-dose line”) at the highest dose for which the DDREF is employed. This leads to a hockey-stick type dose-response curve (with two disconnected linear arms with different slopes and the same zero-dose intercept) which is not supported by any real data.
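The discontinuity just described can be made concrete in a small sketch. The DDREF of 2 is the committee value cited above; the slope and the cutoff dose are assumptions chosen only for illustration:

```python
# Sketch of the DDREF 'hockey stick': above a cutoff dose the high-dose
# LNT slope applies; below it, the same slope divided by the DDREF.
# Both arms pass through zero at zero dose, so at the cutoff the predicted
# risk jumps abruptly by roughly the factor DDREF.
# The slope and cutoff values are illustrative assumptions.

HIGH_DOSE_SLOPE = 0.10  # assumed risk per Sv from high-dose, high-dose-rate data
DDREF = 2.0             # dose and dose rate effectiveness factor (see text)
CUTOFF_SV = 0.2         # assumed dose below which the DDREF is applied

def ddref_risk(dose_sv: float) -> float:
    """Risk under the two-armed LNT/DDREF construction."""
    if dose_sv < CUTOFF_SV:
        return (HIGH_DOSE_SLOPE / DDREF) * dose_sv  # low-dose arm
    return HIGH_DOSE_SLOPE * dose_sv                # high-dose arm

# Dividing a positive slope by a positive DDREF can never yield a negative
# (hormetic) slope, and the two arms meet the cutoff with an abrupt jump:
just_below = ddref_risk(CUTOFF_SV - 1e-9)
at_cutoff = ddref_risk(CUTOFF_SV)
assert at_cutoff > just_below * 1.9  # jump of ~factor DDREF at the cutoff
```

The assertion makes the point of this paragraph explicit: the construction rescales the low-dose arm but cannot change its sign, and it leaves an unphysical step in the curve at whatever dose the DDREF stops being applied.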
In addition, where low-dose or low-dose-rate exposure to ionizing radiation has resulted in a reduction in cancer risk rather than an increase among radiation worker populations, the decrease has usually been automatically credited to a presumed healthy worker effect rather than being a hormetic response; however, the claimed healthy worker effect is not justifiable in many cases (Fornalski and Dobrzynski 2009). For such cases, radiation hormesis is a more plausible explanation (Fornalski and Dobrzynski 2009).
The EGIS report (2007) concluded that “the utility of the DDREF depends upon the assumption that, for exposure to low doses at low dose-rate, the dose-response is linear with a slope that is less than that for high dose, high dose rate exposures. However, in contrast to the situation with low LET radiation, fractionated doses of high LET often produce either a similar number, or even more late effects, than high acute doses. In this case fractionation may reduce the effectiveness of cell killing as a mechanism for reducing effects. In contrast, some low dose and low dose rate studies using low LET radiation in cells and in adult animals have shown that below a threshold dose (about 100 mGy in human cells, rodent cells and normal mice) the detrimental effects of a radiation exposure disappear and are replaced by protective effects, manifested in cells by decreases in transformation frequency and in animals by increases in cancer latency. Moreover, different tissues seem to display different dose thresholds. Observations also suggest that the risk induced by exposure to low doses of high LET radiation, and even low LET radiation, may be underestimated when risk coefficients have been derived from observations at high doses. All of these observations suggest that the biological processes occurring in cells in response to low doses and dose rates or to fractionated doses can be fundamentally different from those that result from exposure to high doses, calling the concept of DDREF and its universal application into question” (EGIS 2007).
Radiologists were one of the earliest occupational groups exposed to ionizing radiation. Among these physicians, an excess of skin cancer and leukaemia, together with an increased risk of cancer deaths as well as deaths from all causes, was detected prior to 1950. The detailed review of the historical development of radiation safety standards for radiation workers in the United States and other Western countries presented in the report by Yoshinaga et al. (2004) illustrates the remarkable improvement in radiologic protection and the concomitant reduction in radiation exposure during the 20th century. The first radiologic protection standard for occupational exposure, introduced in 1902, was equivalent to 0.1 Gy per day (30 Gy per year). In 1928, the U.S. Advisory Committee on X-ray and Radium Protection proposed the first formal standard of 0.1 R per day (0.3 Sv per year). In 1957, the International Commission on Radiological Protection recommended a dose limit of 0.05 Sv per year, which led to much lower levels of radiation exposure.
By reviewing epidemiologic data on cancer risks from eight cohorts of over 270,000 radiologists and radiologic technologists in various countries, Yoshinaga et al. (2004) revealed increased mortality due to leukemia among early workers employed before 1950, when radiation exposures were high. After implementation of the recommendations of the International Commission on Radiological Protection in 1930 (dose limit of 500 mGy per year), that excess decreased, and it disappeared among physicians who started working after 1950 (Matanoski et al. 1987; Berrington et al. 2001; Cameron 2002; Doll et al. 2005). Mortality of British radiologists who had registered since 1954 was remarkably low in comparison with that of medical practitioners as a whole; this was true both for cancer and for all other causes of death combined (Berrington et al. 2001). Cameron (2002) stressed that ‘the British radiology data show that moderate doses of radiation are beneficial rather than a risk to health.’
Taken together, the epidemiological data regarding the risk of cancer mortality in radiologists and radiotherapists suggest that doses of about 20 mGy per fraction can have cumulative effects, and that the lowest potentially cancer-inducing dose is about 500 mGy (Tubiana 2008).
The large numbers of nuclear industry workers around the world present a possibility of deriving risk coefficients of direct relevance to radiological protection; however, it would appear that there are some problems with these studies that require attention before reliance can be placed upon the results (Wakeford 2009).
The largest nuclear worker cohort study to date found little evidence for an association between low doses of external ionizing radiation and chronic lymphocytic leukemia mortality (Vrijheid et al. 2008). In a cohort study of mortality among 954 Canadian military personnel exposed to low-dose ionizing radiation during nuclear reactor clean-up operations in Chalk River, Ontario, no excess of death from leukemia or thyroid cancer was found in the exposed groups. A survival analysis by recorded gamma radiation dose also did not show any effect of radiation dose on mortality (Raman et al. 1987). A Canadian study examining the effect of occupational radiation exposure among nuclear industry workers found that cancer mortality in this population was 58% of the national average (Abbatt et al. 1983). In the United Kingdom, a negative association between radiation exposure and mortality from cancers, in particular leukemia (excluding chronic lymphatic leukemia) and multiple myeloma, was found for radiation workers (Kendall et al. 1992). Cardis et al. (2005), in a study of 407,391 nuclear plant workers in 15 countries, concluded that 1–2% of cancer deaths among the cohort could be due to radiation. A study of cancer mortality among 85,000 nuclear industry workers in three countries did not detect any increase in solid tumour incidence, but found a small increase in leukaemia for those exposed to doses over 400 mSv (Cardis et al. 1995). Subsequently, in a study of a larger cohort (410,000 workers), no cancer excess for cumulative doses below 150 mSv was observed (Cardis et al. 2007; Vrijheid et al. 2007). Zablotska et al. (2004), in a study of 45,468 Canadian nuclear power industry workers, found a much lower cancer mortality rate than in the general population. For all solid cancers combined, a significant reduction in risk was observed in the 1–49 mSv category compared to the lowest category (< 1 mSv), with a relative risk of 0.699 (95% CI = 0.548, 0.892).
Above 100 mSv the risk appeared to increase (Zablotska et al. 2004). In their recent meta-analysis of epidemiological studies on cancer incidence and mortality risks from low-dose-rate, moderate-dose (LDRMD) radiation exposures, Jacob et al. (2009) concluded that the analysis does not confirm that the cancer risk per dose for LDRMD exposures is lower than for acute, high-dose exposures. Participation in the United Kingdom’s atmospheric nuclear weapon tests had no detectable effect on expectation of life or on subsequent risk of developing cancer or other fatal diseases (Darby et al. 1993). In the Kostyuchenko and Krestina (1994) study, changes in sex and age structure, mortality and reproductive function of 7854 persons exposed to radiation from a thermal explosion in a radioactive waste storage facility in the former Soviet Union (Siberia) in 1957 were investigated. It was determined that the exposure groups of 496 mGy, 120 mGy, and 40 mGy had 28%, 39%, and 27% lower cancer mortality rates, respectively, compared to those living in nearby villages that were not exposed. No carcinogenic effect was found for doses below 100 mSv in individuals irradiated during the Chernobyl accident (Cardis et al. 2006; UN Chernobyl Forum 2006). However, some studies, for example those conducted by Ivanov (2007) and Ivanov et al. (2004, 2009), have indicated an increased risk of cancer among Chernobyl cleanup workers. Ivanov et al. (2009), studying the risk of solid cancers among 59,770 persons who stayed in the radiation exposure zone (the 30-km zone around the Chernobyl nuclear power plant) in 1986–1987, found that the excess relative risk per unit dose was 0.96 (95% CI: 0.3–1.7) and the minimum latent period for induction of solid tumors was 4.0 years (95% CI: 3.3–4.9). However, the methodology of this research seems questionable, because persons exposed to a very wide range of cumulative doses (0.001–0.3 Gy) were all included in the study (Ivanov et al. 2004).
The effective dose values in computed tomography (CT) generally range from 1 to 20 mSv. An excess of cancers has never been detected for doses below 100 mSv. However, some authors stress the risk of cancer after diagnostic radiation exposure by using the LNT model. For example, in their recent report Brenner and Hall (2007) concluded, based on the LNT hypothesis, that in a few decades about 1.5–2 percent of all cancers in the U.S. population might be caused by CT scan usage. Some data do confirm that low doses of radiation, delivered weekly, can have cumulative effects. A breast cancer excess was detected in girls and young women after repeated chest fluoroscopic procedures for chronic tuberculosis or scoliosis (Miller et al. 1989; Boice et al. 1991; Morin Doody et al. 2000; Ron 2003). However, after repeated X-ray examinations, a cancer excess was found only for cumulative doses greater than about 0.5 Gy (Tubiana et al. 2008).
Many data contradict the assumption that diagnostic X-ray examinations are dangerous. Several studies have not detected any increase in either leukemia or solid tumors in X-ray-examined patients (Boice et al. 1991; Kleinerman 2006). Scott et al. (2008) even point out that CT scans may reduce rather than increase the risk of cancer. Radiation exposure associated with diagnostic X-rays, mammograms, and CT scans can probably reduce cancer risk by stimulating the removal of precancerous neoplastically transformed cells and other genomically unstable cells, and may also prevent metastasis of existing cancer (Liu 2003, 2007; Cheda et al. 2004; Nowosielska et al. 2006b; Bauer 2007). For high-energy gamma-ray photons, the protective zone includes doses in the range 1 mGy to 100 mGy (Scott and Di Palma 2007). Doses currently associated with routine diagnostic X-ray procedures fall within the protective zone and therefore are likely to be protective against cancer and some other diseases.
Ionizing radiation is a powerful tool to treat cancer. One limitation in radiotherapy is the unavoidable damage delivered to normal, non-cancer cells, which can give rise to side effects (Löbrich and Kiefer 2006). In radiotherapy, the irradiation is fractionated, and the dose rate is similar to that of radiodiagnosis (from 1 Gy/min for beam radiotherapy to 1 Gy/h for brachytherapy). During a course of iodine-131 therapy, the extrathyroidal absorbed doses range from a minimum of 0.15 to 0.5 rad/mCi for breast and gonads to a maximum of 1.5 to 2 rad/mCi for stomach and salivary glands (Zanzonico 1997). An excess of solid tumours and leukaemia was observed in thyroid cancer patients who received several hundred mCi of radioactive iodine-131 (Rubino et al. 2005), whereas no cancer excess was detected among the large number of patients treated with 10–20 mCi of radioiodine-131 for hyperthyroidism (Franklyn et al. 1999). At a low dose rate (up to 15 mGy/min), the carcinogenic effect appears to be reduced in patients; a dose per fraction of 120–160 mGy, cumulating to about 3 Gy, caused less carcinogenesis than higher doses per fraction (Rubino et al. 2003; Moon et al. 2006; Suit et al. 2007; Tubiana et al. 2009). The main factors are the dose and the dose rate. The great impact of the dose per fraction has been demonstrated (Rubino et al. 2003). According to the authors, the influence of the dose rate and fractionation is probably related to DNA repair, occurring either during irradiation (brachytherapy) or between sessions (beam therapy). The impact of the dose rate probably explains why, for the same dose, the incidence of cancer is lower following medical irradiation than for A-bomb survivors (Tubiana 2008).
The Biological Effects of Ionizing Radiation VI Report of the National Academy of Sciences, ‘The Health Effects of Exposure to Indoor Radon’ (BEIR VI 1998), is the most definitive accumulation of scientific data on the health effects of radon to date. On the basis of the epidemiologic evidence from miners and understanding of the genomic damage caused by alpha particles, the BEIR VI committee concluded that exposure to radon in homes is expected to be a cause of lung cancer in the general population. The report confirms that radon is the second leading cause of lung cancer after smoking, with an estimated 15,000 to 22,000 lung cancer deaths each year in the United States, and that it is a serious public health problem. In the report, it is stated that smoking increases the lung cancer risk by a factor of 10–20 and radon by a factor of 0.2–0.3, which implies a smoking risk about 50 times higher than the radon risk (BEIR VI 1998). The BEIR VI committee selected an LNT relationship relating exposure to risk for the relatively low exposures at issue for indoor radon. This assumption has significant implications for risk projections.
However, in extrapolating the miner data to the situation of radon in homes, considerable uncertainty exists in estimating the risks from radon exposure. In the BEIR VI report, domestic radon risk estimates were extrapolated from miner data obtained at both higher doses and higher dose rates. Moreover, extrapolation of risk from miner data is problematic because miners are known to have smoked more than the average population. In most of the miner populations that have been studied, the majority of the men would have been cigarette smokers. For example, in the Schneeberg area, the miners’ smoking rate in the early uranium-mining period, also called the “wild years” (1946 to 1954), has been estimated to have been above 90% (Becker 2003). In most cases, this will have had an even greater effect on their lung cancer risk than their radon exposure (Darby and Hill 2003).
Numerous studies have demonstrated that environmental exposures to ionizing radiation can suppress cancer and other diseases, and environmental radiation hormesis associated with elevated levels of indoor radon and with elevated background radiation appears to be preventing many cancer deaths (for reviews, see Luckey 2003, 2008a,b; Cameron 2005; Scott and Di Palma 2007; Scott et al. 2008). Both early (Cohen 1993, 1994) and recent (Thompson et al. 2008; Scott et al. 2009) data for inhalation exposure to radon and its progeny suggest a hormetic dose-response relationship for lung cancer. Based on an analysis of the data of Thompson et al. (2008), lifetime exposure to residential radon at the United States Environmental Protection Agency’s action level of 4 pCi/L of air appears to be associated with a > 60% reduction in lung cancer cases, rather than an increase (Scott et al. 2009).
Large-scale epidemiological studies have been conducted to analyse the frequency of health effects in high natural background radiation areas (for review, see Hendry et al. 2009). These studies mainly considered the risk of cancer on the basis of mortality data or of incidence data. Some studies also considered the risks of non-cancer diseases or of congenital malformation. Overall, the studies examining health effects of background radiation have reported that populations in areas with high background radiation rates show no adverse health effects when compared to low-dose populations.
In a Chinese study that compared an area with an average radiation exposure of 2.31 mSv/y with a similar area with an average exposure of 0.96 mSv/y, the cancer mortality rate was lower in the high-background group (High Background Radiation Research Group 1980). However, this difference was statistically significant only in the oldest age group (40–70 years of age), i.e., in those individuals with the greatest lifelong exposure to high background levels of radiation. In another large-scale Chinese study, the cancer mortality rate was lower in an area with relatively high background radiation (74,000 people), while the control group living in an area with low background radiation (78,000 people) had a higher rate of cancer mortality (Wei et al. 1990). In a similar Indian study, the cancer incidence/mortality rates were significantly lower in areas with a high background radiation level than in similar areas with a low background radiation level (Nambi and Soman 1987).
Chromosome aberrations in the peripheral blood lymphocytes of residents of high-background radiation areas, e.g. in Kerala (India) and Yangjiang (China), have been measured (Tao et al. 2000; Nair et al. 2009). It is important to recognize that chromosome aberrations are good biomarkers of the radiation dose received by exposed individuals, but do not seem to reflect radiation risk (Brooks 1999; Brooks et al. 2003). The population living in Kerala, India, experiences lifetime terrestrial irradiation of up to 70 mSv a year, on average about 7.5 times higher than other populations in India (Nair et al. 2009). In Yangjiang, China, the concentrations of natural radionuclides in soil differ considerably between areas, especially that of thorium, which is 5.2 to 7.6 times higher in high-background radiation areas than in the neighboring control areas (Tao et al. 2000). Although high background radiation has been repeatedly shown to increase the frequency of chromosome aberrations in the circulating lymphocytes of exposed persons, in both areas there was no increase in cancer incidence or mortality associated with the high radiation background (Tao et al. 2000; Nair et al. 2009). Cohen (1993) found a significant negative correlation between average radon levels and mortality rates from lung cancer in 1600 US counties. A comparison of three Rocky Mountain states (Idaho, Colorado, New Mexico) with three Gulf states (Louisiana, Mississippi, Alabama) in the USA showed a strong negative correlation between natural radon levels and estimated lung cancer mortality (Jagger 1998). In a large-scale study in the USA, the mortality rate due to all malignancies was lower in states with a higher annual radiation dose (Frigerio and Stowe 1976). A lower cancer mortality was reported among inhabitants of the Misasa spa area (Japan), where there is a high radon background, than in the whole Japanese population (Mifune et al. 1992).
Urban residents in Misasa are exposed to much greater levels of radiation than rural residents due to the presence of many radon spas. Kondo (1993), comparing cancer mortality rates of residents of the urban vs. rural areas of Misasa, showed that the urban population (with spas) has significantly lower cancer mortality rates than those living in the surrounding rural areas.
Thus, epidemiological studies have failed to show any adverse health effects in populations living in terrestrial high-background radiation areas. Summarizing these points, Cameron (2005) stated that ‘we need increased background radiation to improve our health.’ The accumulated evidence suggesting that background ionizing radiation may be essential for life provides a strong argument in favor of a beneficial effect of low-level radiation.
Most population-based estimates of cancer risk following radiation exposure are based primarily on data from the Japanese atomic bomb survivor cohort, because of the wide and well-characterised range of doses received. Doses were estimated from each person’s position and shielding at the time of the bombings. A total of 86,543 persons were in the exposed cohorts of the two cities; 45,148 received up to 10 mSv and served as “in-city controls”. Over 90% of the exposed cohorts received less than 500 mSv (Luckey 2008b).
A-bomb survivor studies generally indicate that acute irradiation with absorbed doses of more than 0.1–0.2 Gy increases cancer risk in the exposed groups proportionally with dose (Pierce et al. 1996; Pierce and Preston 2000). Leukaemia was the first cancer to be associated with A-bomb radiation exposure, with preliminary indications of excess among the survivors within the first five years after the bombings; an excess of solid cancers became apparent approximately ten years after radiation exposure (Little 2009). However, the dose-effect relationship for the A-bomb survivors is not linear (Preston et al. 2004). There is a threshold for leukaemia among the A-bomb survivors (Little and Muirhead 2000). No evidence of increased risk was found for absorbed doses below 0.1 Gy (Heidenreich et al. 1997; Preston et al. 2004). According to the UNSCEAR report (1994), among A-bomb survivors from Hiroshima and Nagasaki who received doses lower than 200 mSv, there was no increase in the number of deaths due to cancers. Mortality caused by leukemia was even lower in this population at doses below 100 mSv than in age-matched control cohorts. Moreover, there is some evidence for improved health in Japanese A-bomb survivors, including decreased mutation, leukemia and solid-tissue cancer mortality rates, as well as increased average lifespan (Luckey 2008b).
A number of studies done on the atomic bomb survivors have shown no demonstrable genetic or hereditary effects among those exposed to the radiation (UNSCEAR 2000, Annex G). Although low dose radiation-induced genetic effects have been observed in laboratory animals (Dubrova et al. 2000; Barber et al. 2002; 2006), no evidence of genetic effects has been found among the children born to atomic bomb survivors from Hiroshima and Nagasaki (Neel et al. 1990). These data suggest that humans are probably less sensitive to the genetic effects of radiation than has been assumed on the basis of extrapolations from experiments with mice.
It should be pointed out that studies based on A-bomb survivors ignore the fact that the population was exposed to multiple stressors, including radiation, blast and thermal waves. No corrections are made for synergistic effects of the combined exposures, making it questionable whether cancer risk estimates based on A-bomb survivor data should be applied to low-dose radiation exposure in other populations (e.g., persons exposed in a clinical setting to diagnostic X-rays).
Current recommendations for limiting exposure to ionizing radiation in humans are based on the LNT model of radiation carcinogenesis, which states that any radiation dose, no matter how small, can cause cancer. For example, the latest BEIR report on the effects of low-level radiation, the BEIR VII Phase 2 report (2005), reconfirmed that the LNT model is the most practical model for estimating radiation risks, especially for radiation protection purposes. However, many scientists are questioning whether the logic used in the existing guidelines for cancer risk assessment, based on a mode-of-action approach, is the best way to safeguard and promote public health (Cuttler and Pollycove 2009). The proposed guidelines ignore the fact that mechanisms of natural radiation protection can reduce or even eliminate minor damage caused by low-dose radiation, and in ignoring these mechanisms they can foster radiation phobia among the media, politicians and the public. Recent data support the non-LNT or hormesis hypothesis, which has increased the controversy over the LNT model (Aleta 2009; Hoffmann 2009; Little et al. 2009).
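The contrast between the LNT model, with its DDREF slope adjustment, and a hormetic, J-shaped dose response can be sketched numerically. The short Python fragment below uses purely hypothetical coefficients (`slope`, `ddref`, `benefit`, `scale`), chosen only to illustrate the qualitative shapes under discussion; they are not fitted risk estimates from any of the studies cited.

```python
import math

# Hypothetical coefficients for illustration only; not fitted risk values.

def lnt_excess_risk(dose_sv, slope=0.05, ddref=2.0, acute=False):
    """Excess relative risk under the LNT model.

    For low doses and dose rates, the DDREF divides the high-dose
    slope; this changes the slope but leaves risk positive at every
    dose above zero, i.e. there is no threshold.
    """
    effective_slope = slope if acute else slope / ddref
    return effective_slope * dose_sv

def hormetic_excess_risk(dose_sv, slope=0.05, benefit=0.01, scale=0.1):
    """A J-shaped (hormetic) model: a small protective term dominates
    at low doses, while the linear harmful term dominates at high doses."""
    protective = benefit * math.exp(-dose_sv / scale) * (dose_sv / scale)
    return slope * dose_sv - protective

for d in (0.01, 0.1, 1.0):  # doses in Sv
    print(f"{d:5.2f} Sv  LNT: {lnt_excess_risk(d):+.4f}  "
          f"hormetic: {hormetic_excess_risk(d):+.4f}")
```

With these illustrative numbers, the LNT estimate is positive at every nonzero dose, whereas the hormetic curve dips below zero (a net reduction in risk) at low doses before turning positive at high doses, which is the qualitative pattern the epidemiological data above are argued to show.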
Some authors have concluded that an excess of cancer mortality exists among persons exposed to low-level ionizing radiation at doses below 100 mGy (Brenner et al. 2003; Cardis et al. 2005; Darby et al. 2005). However, the statistical methodology of these studies is debatable because they assume that the process of cancer induction is the same after low- or high-dose irradiation (Breckow 2006). All such studies included people who had received doses much higher than 100 mSv. When these studies were restricted to persons receiving less than 100 mSv, the cancer excess was not significant (Tubiana 2008).
The debate still rages, with some data suggesting that low doses and low dose rates are even more hazardous than previously projected (Cardis et al. 2005; Jacob et al. 2009), while the bulk of the data suggest that there is little or no increased risk in the low-dose region. The conclusion from the evidence reviewed in this paper and elsewhere is that the LNT hypothesis fails very badly in the low-dose, low-dose-rate region. This means that the cancer risk from ordinarily encountered radiation exposures (medical X-rays, background radiation, etc.) is much lower than LNT-based estimates suggest. Among humans, there is no evidence of a carcinogenic effect for acute irradiation at doses less than 100 mSv or for protracted irradiation at doses less than 500 mSv (Tubiana et al. 2009).
Moreover, there is abundant evidence that low doses and dose rates of low-LET radiation may be beneficial rather than detrimental. The finding of a hormetic dose-response relationship at low exposure levels means that a carcinogen’s mode of action is expected to be non-linear at low radiation doses. Further understanding of this relationship can provide new insights into the mechanisms of radiation carcinogenesis.