The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was due mainly to a deeply rooted fear of radiation induced by the linear non-threshold (LNT) assumption. It was a historic event that provided invaluable lessons for the nuclear industry and for risk philosophy. One of them is the demonstration that, counted per unit of electricity produced, early Chernobyl fatalities amounted to 0.86 deaths/GWe-year, 47 times lower than the rate for hydroelectric stations (~40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limits was counterproductive, and led to the suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT conflict with observations: in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit of solid cancer incidence among the population of the most contaminated areas.
Ten days after two steam and hydrogen explosions blew up the Chernobyl nuclear reactor, the fire that melted its core died out spontaneously. But the drama of this catastrophe still flourishes, nourished by politics, authorities, the media and the interest groups of ecologists, charitable organizations and scientists. It lives in the collective memory of the world and propagates real health, social and economic harm to millions of people in Belarus, Russia and the Ukraine. It is exploited in attempts to strangle the development of atomic energy, the cleanest, safest and practically inexhaustible means to meet the world's energy needs. The world's uranium resources alone will suffice for the next 470,000 years (IAEA 2008). Chernobyl was indeed a historic event, but it is the only nuclear power station disaster that ever resulted in an occupational death toll, albeit a comparatively small one. A vast environmental dispersion of radioactivity occurred that did not cause any scientifically confirmed fatalities in the general population. The worst harm to the population was caused not by radiation, and not to flesh, but to minds.
This catastrophe provided many invaluable lessons. One of them is a recognition of the absurdity of the linear non-threshold hypothesis (LNT), which assumes that even near-zero radiation doses can lead to cancer death and hereditary disorders. Chernobyl was the worst possible catastrophe: it happened in a dangerously constructed nuclear power reactor, with a total meltdown of the core and ten days of free emission of radionuclides into the atmosphere. Probably nothing worse could happen. Yet the resulting human losses were minute in comparison with catastrophes involving other energy sources.
Highly sensitive monitoring systems that had been developed in many countries for the detection of fallout from nuclear weapons enabled easy detection of minute amounts of Chernobyl dust even in remote corners of the world. This added to the global epidemic of fear induced by the accident. Radioactive debris was dispersed into the troposphere and stratosphere of the Northern Hemisphere up to at least 15 km altitude (Jaworowski and Kownacka 1994). During the first few days after the accident the concentration of radiocesium measured at this altitude over Poland (maximum 36.1 mBq/m3 STP) was 2 to 6% of that at ground level. Such extensive vertical distribution and mixing enabled a small portion of the Chernobyl debris to pass over the equatorial convergence into the Southern Hemisphere and on to the South Pole (Dibb et al. 1990; Philippot 1990). This disagreed with computer models of nuclear accidents that projected a maximum uplift of fission products to below 3000 m altitude (ApSimon et al. 1985; ApSimon and Wilson 1987).
Enormous amounts of radionuclides entered the air from the burning reactor. Yet the total emission was 200 times smaller than that from all of the 543 nuclear warheads exploded in the atmosphere since 1945. The highest estimated radiation dose to the world population from these explosions was 0.113 mSv, recorded in 1963 (UNSCEAR 1988). The radiation doses from Chernobyl dust were estimated and compared with natural doses by UNSCEAR (2000a). During the first year after the accident the average individual dose received by inhabitants of the Northern Hemisphere was estimated by UNSCEAR at 0.045 mSv, i.e., less than 2% of the average annual global natural dose (2.4 mSv per year). Over the next 70 years the global population will be exposed to a total Chernobyl dose of approximately 0.14 mSv, or 0.08% of the natural lifetime dose of 170 mSv. People living in the most contaminated areas of the former Soviet Union received average annual whole-body radiation doses in 1986–1995 of 0.9 mSv in Belarus, 0.76 mSv in Russia, and 1.4 mSv in Ukraine (UNSCEAR 2000b). Average doses estimated for the period 1986–2005 are 2.4 mSv in Belarus, 1.1 mSv in Russia, and 1.2 mSv in Ukraine (UNSCEAR 2008). All these doses are dwarfed by natural radiation doses in some parts of the world, which reach >400 mSv/year in Ramsar, Iran (Mortazavi et al 2006) and up to more than 700 mSv per year in Brazil and southwestern France (UNSCEAR 2000b) (Figure 1).
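The dose ratios quoted above follow from simple arithmetic; the following sketch, using only the UNSCEAR figures cited in the text, makes the comparison explicit:

```python
# Checking the dose comparisons above against the UNSCEAR figures
# cited in the text (all doses in mSv).
first_year_chernobyl = 0.045        # average Northern Hemisphere dose, first year
annual_natural = 2.4                # average global natural dose per year
lifetime_chernobyl = 0.14           # projected 70-year global Chernobyl dose
lifetime_natural = annual_natural * 70   # ~170 mSv natural lifetime dose

print(f"First-year Chernobyl dose: {first_year_chernobyl / annual_natural:.1%} of one natural year")
print(f"Lifetime Chernobyl dose: {lifetime_chernobyl / lifetime_natural:.2%} of the natural lifetime dose")
```

The first ratio comes out below 2% and the second at 0.08%, matching the figures in the text.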
Comparison of these doses with epidemiological observations, rather than risk factors based on LNT, should be the basis of realistic estimates of the latent medical consequences of the Chernobyl accident. This, and the comparatively minute health consequences, were apparent soon after the catastrophe (Jaworowski 1988), but this information was not shared with the public. Recently the well-known environmentalist James Lovelock took pains to dispel the usual myths that surround the Chernobyl accident, and stated that for many years the scientists who could have challenged the nonsense about the catastrophe chose to keep quiet (Murphy 2009).
No harmful health effects have ever been detected in areas with high natural background radiation. This is consistent with other studies of cancer incidence in exposed populations. In the United States and in China, for example, the incidence of cancers was found to be lower in regions with high natural radiation than in regions with low natural radiation (Frigerio et al. 1973; Frigerio and Stowe 1976; Wei et al. 1990). Among British radiologists exposed mainly to x-rays, all-cause and cancer mortality are about 50% lower than in the average male population of England and Wales (Berrington et al 2001). Also, in other population groups exposed to low doses of ionizing radiation (i.e., patients diagnosed with 131I and x-rays; dial painters, chemists and others exposed to ingested or inhaled radium or plutonium; persons exposed to higher levels of indoor radon; and A-bomb survivors) a lower percentage of neoplastic malignancies was observed (Cohen 2000; Luckey 2003; UNSCEAR 1994). A Taiwan study of several thousand residents of apartments contaminated with cobalt-60, who had been chronically exposed to gamma rays for up to 20 years with total doses estimated to range from 120 to 4000 mSv, revealed that cancer mortality and congenital malformations among these residents substantially decreased rather than increased (Chen et al 2004), suggesting a stimulating or hormetic effect of low doses of low linear-energy-transfer (LET) ionizing radiation. This finding was partially confirmed by a later study of cancer incidence in a similar Taiwan cohort, in which a deficit of incidence relative to the unexposed population was found in the groups comprising all cancers, all cancers except leukemia, and solid cancers, with the number of cancer cases ranging from 119 to 190.
Such a deficit, however, was not found in the groups comprising all types of leukemia and some solid cancers of particular organs, in which the number of cases was 1 to 2 orders of magnitude smaller than in the first three groups (Hwang et al 2006). About 3000 reports on radiation hormesis were recently reviewed (Luckey 2003).
Among approximately 200,000 American, British and Canadian nuclear workers exposed to radiation, total cancer deaths ranged from 27% to 72% of total cancer deaths in control workers (Luckey 2003). Such a hormetic deficit invalidates LNT, because the concept of hormesis transcends the difficulty of demonstrating a dose threshold for excess cancers. In the absence of hormesis, the existence of a true threshold for excess cancers might be impossible to demonstrate rigorously because of the statistical problems of proving an absolute equality of effect in an epidemiological study at a very low dose level. If, however, a deficit is observed in a population irradiated at a relatively high dose level, as in hormesis, there is often a statistically significant difference at an acceptable confidence level (Webster 1993). This remark of Webster, a UNSCEAR member, reflects discussions in the Committee during the preparation of its “hormetic report” (UNSCEAR 1994).
A more recent study, based on collective doses for about 400,000 nuclear workers, concluded that the cancer death data are consistent with the LNT relationship, even though the authors found a 31% decrease in relative cancer mortality (Cardis et al 2007). This conclusion rested on an ad hoc assumption of a confounding healthy worker effect in the studied cohort. However, the existence of this effect was not supported by their data or by any other factual evidence. The effect could be correctly assumed only if cancer marker diagnostics (ACS 2009) and genetic tests had been used in pre-employment screening and selection of these workers. But these procedures were not applied in the Cardis et al (2007) cohort, and even now they are not recommended by the ICRP, the directives of the European Union, or the IAEA International Basic Safety Standards. Thus the assumption is invalid and explains nothing. Indeed, a statistical reanalysis of the Cardis et al (2007) data clearly documents that the assumption of a healthy worker effect was incorrect, and that the data indicate a hormetic effect of low doses of ionizing radiation in the exposed nuclear workers (Fornalski and Dobrzynski 2009).
In terms of human losses (there were 31 early deaths) the accident at the Chernobyl nuclear power plant was a minor event compared with many other major industrial catastrophes. In the 20th century more than ten such catastrophes occurred, each with several hundred to many thousands of fatalities. For example, coal smog killed approximately 12,000 people in London, UK between December 1952 and February 1953 (Bell and Davis 2001). The annual death toll from accidents in Chinese coal mines reached 70,000 deaths in the 1950s and 10,000 in the 1990s (WNA 2009). In 1984 about 20,000 people perished due to a toxic gas release from a pesticide factory in Bhopal, India (Dhara and Dhara 2002), and the collapse of a hydroelectric dam on the Banqiao river in China in 1975 caused 230,000 fatalities (Altius 2008; McCully 1998; Yi 1998).
The world does not celebrate the anniversaries of these enormous man-made disasters, but year after year we do so for the Chernobyl accident, which was hundreds to thousands of times less deadly. Ten years ago I discussed the possible causes of this paranoiac phenomenon (Jaworowski 1999). Measured as early deaths per unit of electricity produced by the Chernobyl facility (9 years of operation, total electricity production of 36 GWe-years, 31 early deaths), the rate is 0.86 deaths/GWe-year. This rate is lower than the average for the majority of other energy sources. For example, the Chernobyl rate is 9 times lower than the death rate from liquefied gas (Hirschberg et al 1998) and 47 times lower than that from hydroelectric stations (40.19 deaths/GWe-year, including the Banqiao disaster). But the political, economic, social and psychological impact of Chernobyl was enormous. Let's examine what happened, starting with my personal experience.
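The fatality-rate comparison can be verified directly from the figures given above (31 early deaths, 36 GWe-years of electricity produced, and the hydroelectric rate of 40.19 deaths/GWe-year):

```python
# Early-death rate per unit of electricity produced, using the figures in the text.
chernobyl_deaths = 31
chernobyl_output = 36.0            # GWe-years over 9 years of operation
hydro_rate = 40.19                 # deaths/GWe-year, including the Banqiao disaster

chernobyl_rate = chernobyl_deaths / chernobyl_output
print(f"Chernobyl: {chernobyl_rate:.2f} deaths/GWe-year")
print(f"Hydro rate is {hydro_rate / chernobyl_rate:.0f} times higher")
```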
At about 9 A.M. on Monday, April 28, 1986 at the entrance to my institute in Warsaw I was greeted by a colleague with a statement, “Look, at 7:00 we received a telex from a monitoring station in northern Poland saying that the beta radioactivity of the air there is 550,000 times higher than the day before. I found a similar increase in the air filter from the station in our backyard, and the pavement here is highly radioactive.”
This was a terrible shock. My first thought was: A NUCLEAR WAR! It is curious that all my attention was concentrated on this enormous rise in “total beta activity” in air, the quantity used to monitor radiation emergencies from nuclear test fallout. Many years spent during the Cold War on preparations to defend the Polish population against the effects of a nuclear attack had conditioned my colleagues and me to such an exaggerated reaction. We reacted that way even though we knew that on this first day of “Chernobyl in Poland” the dose rate of external gamma radiation penetrating our bodies was only three times higher than the day before, and similar to the average natural radiation doses that we have received from ground and cosmic radiation since time immemorial. At 11 A.M., after we had collected enough dust from the air for gamma spectrometry measurements, we discovered that it contained cesium-134, and thus that its source was not an atomic bomb but a nuclear reactor. This was tranquilizing news, which did not, however, calm our frantic behavior.
In 1986 the impact of a dramatic increase in atmospheric radioactivity dominated my thinking and everybody else's. This state of mind led to immediate consequences. First there were various hectic actions, such as the ad hoc coining of different limits for radionuclides in food, water and other things. From country to country these limits varied by factors of many thousands, reflecting various political and mercenary factors and the emotional states of the decision makers. For example, Sweden allowed 30 times more activity in imported vegetables than in domestic ones, and Israel allowed less radioactivity in food from Eastern than from Western Europe. The cesium-137 concentration limit for vegetables imposed in the Philippines was 22 Bq per kg, 8600 times lower than in the more pragmatic United Kingdom (Salo and Daglish 1988). In Poland a group of nuclear physicists and engineers proposed a cesium-137 limit of 27 Bq per kilogram for any kind of food, but, fortunately, the authorities decided more soberly and imposed a 1000 Bq limit.
Behind these restrictions, meaningless from the point of view of human health, stood three factors: (1) emotion; (2) the LNT mindset and the international recommendations based on it; and (3) a social need to follow an old medical rule, “Ut aliquid fecisse videatur” (to make it appear that something is being done). The third factor was a placebo used by the authorities to dodge the worst kind of criticism, i.e., accusations of inactivity in the face of a “monstrous disaster”. This led to overreaction in Europe and in some other countries, but at the greatest scale and with the most severe consequences in the Soviet Union. The costs of these regulations were enormous. For example, Norwegian authorities introduced a cesium-137 concentration limit of 6000 Bq/kg for reindeer meat and game, and a 600 Bq/kg limit for sheep (Henriksen and Saxebol 1988). A Norwegian eats an average of 0.6 kg of reindeer meat per year. The radiation dose from this meat would be 0.047 mSv per year. Thus this measure was aimed at protecting Norwegians against a radiation dose about 200 times lower than the natural dose in some regions of Norway (11 mSv per year) (UNSCEAR 1982). The costs of this “protection” climbed to over $70 million in 1986, and in the 1990s it was still about $4 million per year (Christensen 1989; Idas and Myhre 1994). This means that unnecessary and wasteful restrictions, once implemented under the influence of the above three factors, have a long lifetime.
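The 0.047 mSv figure for reindeer meat can be reproduced with the standard ICRP ingestion dose coefficient for cesium-137 in adults (about 1.3 × 10^-8 Sv/Bq); the other numbers are those quoted in the text:

```python
# Annual dose from reindeer meat eaten at the Norwegian Cs-137 limit.
meat_per_year_kg = 0.6           # average Norwegian consumption
limit_bq_per_kg = 6000           # Norwegian limit for reindeer meat and game
dose_coeff_sv_per_bq = 1.3e-8    # ICRP ingestion dose coefficient, Cs-137, adult

annual_dose_msv = meat_per_year_kg * limit_bq_per_kg * dose_coeff_sv_per_bq * 1000
print(f"Annual dose at the limit: {annual_dose_msv:.3f} mSv")
```

The result, about 0.047 mSv, is roughly 200 times below the 11 mSv/year natural dose in some Norwegian regions.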
The hysterical reaction of authorities, further excited by extremely exaggerated media reports, is well exemplified by the Japanese government's cancellation of a contract, worth several hundred million US dollars, for shipping Polish barley for the production of Japanese beer. This happened in May 1986, a few days after completely false information about extreme contamination of Poland by Chernobyl fallout appeared on the front page of the biggest Japanese daily, Asahi Shimbun. It screamed in block letters, “DUST OF DEATH IN POLAND”, and cited my name as the source of the information. I was asked by the Polish government to write a text in English which might be used to avert this loss of money. I did this during a weekend spent with my wife in our cottage on the banks of the Vistula together with John Davis, the American ambassador to Poland, and his charming wife Helene. When I finished my writing assignment I asked John to correct the language. He said that the English was almost OK, but not exactly in proper diplomatic style. He then proceeded to change the text completely. On Monday a spokesman for the communist government asked me to read the text at his press conference. I presented the talk, and after I finished he distributed copies of it to the waiting flock of journalists. He was totally unaware that it had been prepared by the US ambassador. A visit by the Japanese ambassador to our institute managed to salvage the contract. A few days later ambassador Davis arranged an international deal for shipment by air of large quantities of powdered milk for Polish children, to replenish strategic reserves that were rapidly being depleted. This was not an easy task, because other European countries, in a similar position to ours, refused to sell their milk. As we now know, during the next four years the Davises played a delicate but pivotal role in realizing a major goal for the people of Poland: Solidarity's victory over communism (Davis 2009; Davis et al 2006).
As explained below Solidarity’s triumph was related to the Chernobyl accident.
A classic example of wastefully applying the LNT principle to the Chernobyl emergency was provided by the Swedish radiation protection authorities. When farmers near Stockholm discovered that the Chernobyl accident had contaminated their cows' milk with cesium-137 above the limit of 300 Bq per liter imposed by the authorities, they wrote asking whether their milk could be diluted with uncontaminated milk from other regions to bring it below the limit. This would be done by mixing 1 liter of contaminated milk with 10 liters of clean milk. To the farmers' surprise and disappointment the answer was “no”, and the milk had to be discarded. This was a strange ruling, since it has always been possible to reduce pollutants to safer levels by dilution. We do this for other pollutants in foodstuffs, and we dilute fumes from fireplaces or ovens with atmospheric air in the same way that nature dilutes volcanic emissions or forest fire fumes. The Swedish authorities explained that even though the individual risk could be reduced by diluting the milk, the number of consumers would be increased at the same time. Thus the risk would remain the same, but spread over a larger population (Walinder 1995).
This was a faithful application of the ICRP recommendations based on the LNT assumption and its offspring, the concept of “collective dose”, i.e., reaching terrifyingly great numbers of “man-sieverts” by multiplying tiny, innocuous individual radiation doses by the large number of exposed people. In an earlier paper I exposed the senselessness and negative consequences of the LNT assumption and of the collective dose and dose commitment concepts (Jaworowski 1999). The application of these principles caused the costs of the Chernobyl accident to exceed $100 billion in Western Europe (Becker 1996), and much more in post-Soviet countries, where it led to untold suffering and the pauperization of millions of people. The international institutions standing behind this assumption and these concepts certainly will not admit responsibility for their disastrous consequences. They should.
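The Swedish milk ruling is a direct consequence of collective-dose arithmetic: dilution lowers every individual dose but leaves the total activity ingested, and hence the LNT-computed collective risk, unchanged. A minimal sketch with hypothetical numbers:

```python
# Why dilution cannot reduce a "collective dose" under LNT.
# Hypothetical numbers: 1 L of milk at 330 Bq/L diluted into 10 L of clean milk.
total_activity_bq = 330.0
consumers_before, consumers_after = 1, 11

intake_before = total_activity_bq / consumers_before   # 330 Bq for the lone consumer
intake_after = total_activity_bq / consumers_after     # 30 Bq each, now under a 300 Bq/L limit

# Under LNT, risk is proportional to intake, so the summed risk is identical:
assert intake_before * consumers_before == intake_after * consumers_after
print("Collective intake unchanged; only its distribution differs.")
```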
The linear no-threshold hypothesis was accepted in 1959 by the International Commission on Radiological Protection (ICRP 1959) as a philosophical basis for radiological protection. This decision was based on the first report of the newly established United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR 1958). A large part of this report was dedicated to a discussion of linearity and of the threshold dose for adverse radiation effects. Fifty years ago UNSCEAR’s stand on this subject was formed after an in-depth debate that was not without influence from the political atmosphere and issues of the time. The Soviet, Czechoslovakian and Egyptian delegations to UNSCEAR strongly supported the LNT assumption and used it as a basis for recommendation of an immediate cessation of nuclear test explosions. LNT was also supported by the Soviet Union during the later years of the Cold War (Jaworowski 2009), and this was consistent with the thinking of American authorities. The target theory prevailing in the 1950s and the then new results of genetic experiments with fruit flies irradiated with high doses and dose rates strongly influenced this debate. In 1958 UNSCEAR stated that contamination of the environment by nuclear explosions increased radiation levels all over the world and thus posed new and unknown hazards for present and future generations. These hazards cannot be controlled and “even the smallest amounts of radiation are liable to cause deleterious genetic, and perhaps also somatic, effects”. This sentence had an enormous impact in subsequent decades and has been repeated in a plethora of publications. Even today it is taken as an article of faith by the public. However, throughout the entire 1958 report the original UNSCEAR view on LNT remained ambivalent. 
As an example, UNSCEAR accepted as a threshold for leukemia a dose of 4000 mSv (page 42), but at the same time the committee accepted a risk factor for leukemia of 0.52% per 1000 mSv, assuming LNT (page 115). The committee quite openly presented this difficulty and showed its consequences in a table (page 42). Continuation of nuclear weapons tests in the atmosphere was estimated to cause 60,000 leukemia cases worldwide if no threshold were assumed, and zero leukemia cases if a threshold of 4000 mSv were in place. In its final conclusions UNSCEAR pinpointed this dilemma: “Linearity has been assumed primarily for purposes of simplicity”, and “There may or may not be a threshold dose. The two possibilities of threshold and no-threshold have been retained because of the very great differences they engender”. After half a century we still discuss the same problem. In 1958 UNSCEAR had no doubts about major genetic defects in the world population that could be caused by nuclear test fallout, and estimated them to number as many as 40,000. But since then the Committee has learned that even among the children of highly irradiated survivors of the atomic bombings no statistically significant genetic damage could be demonstrated (UNSCEAR 2001).
However, in the ICRP document of 1959 no such controversy and no hesitations appeared. LNT was arbitrarily assumed, and serious epistemological problems related to the impossibility of finding harmful effects at very low levels of radiation were ignored. Over the years the working assumption of ICRP of 1959 came to be regarded as a scientifically documented fact by the mass media, public opinion and even many scientists. The LNT assumption, however, belongs in the realm of administration and is not a proved scientific principle (Jaworowski 2000).
The absurdity of the LNT was brought to light in 1987, when minute doses of Chernobyl radiation were used to calculate that 53,000 people would die of Chernobyl-induced cancer over the next 50 years (Goldman et al 1987). This frightening death toll was derived simply by multiplying the trifling Chernobyl doses in the US (0.0046 mSv per person) by the vast number of people living in the Northern Hemisphere and by a cancer risk factor based on epidemiological studies of 75,000 atomic bomb survivors in Japan. But the A-bomb survivor data are irrelevant to such estimates because of the difference in individual doses and dose rates. A-bomb survivors were flashed within less than a second by radiation doses at least 50,000 times higher than any dose that US inhabitants will ever receive over a period of 50 years from the Chernobyl fallout. We have reliable epidemiological data for dose rates of perhaps 1000 to 6000 mSv per second in Japanese A-bomb survivors. But there are no such data for human exposure at a dose rate of 0.0046 mSv over 50 years, nor will there ever be any. The dose rate in Japan was greater than the Chernobyl dose rate in the US by a factor of about 10^12. Extrapolating over such a vast span is neither scientifically justified nor epistemologically acceptable. It is also morally suspect (Walinder 1995). Indeed, Lauriston Taylor, the late president of the US National Council on Radiological Protection and Measurements, deemed such extrapolations to be a “deeply immoral use of our scientific heritage” (Taylor 1980).
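The structure of such a projection is nothing more than a triple product: an individual dose, a population, and an LNT risk coefficient. The sketch below uses the US dose quoted above together with hypothetical values for the other two factors; it is illustrative only, not a reconstruction of the Goldman et al calculation:

```python
# Anatomy of an LNT death projection: tiny dose x huge population x risk factor.
individual_dose_sv = 0.0046e-3     # 0.0046 mSv per person, the US figure in the text
population = 4.0e9                 # hypothetical Northern Hemisphere population
risk_per_sv = 0.05                 # hypothetical fatal-cancer risk per sievert (LNT)

collective_dose = individual_dose_sv * population   # person-sieverts
projected_deaths = collective_dose * risk_per_sv
print(f"Collective dose: {collective_dose:.0f} person-Sv")
print(f"Projected cancer deaths: {projected_deaths:.0f}")
```

However large the product, it carries no epidemiological meaning when the individual doses are this far below anything ever observed to cause harm.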
In its document on protection of the public in a major radiation emergency, the ICRP recommended administration of stable iodine in the form of tablets, to be taken before or as soon as possible after the start of exposure to radioactive iodine-131 (ICRP 1984). The commission advised applying this prophylactic measure to everybody, pregnant women, neonates, young infants and adults alike, starting at a projected thyroid dose of 50 mSv. This recommendation was based on the LNT dogma. We followed it in Poland.
In the late afternoon of April 28, 1986 we learned from the BBC that there had been a reactor accident at Chernobyl. We had seen the radioactive cloud flowing over Poland from east to west, and we had the first data on concentrations of radioiodine in grass and soil in eastern Poland and in Warsaw. Using these data I calculated that the thyroid dose to Polish children might reach the 50 mSv limit, and much more if conditions at Chernobyl and the weather aggravated the situation further. In our institute we had no information from the Soviet Union on the current state of affairs or any projections regarding the behavior of the destroyed reactor. Therefore we assumed that in the next few days the radioactivity in the air would increase and cover the whole country. We prepared a portfolio of countermeasures to be implemented by the government. I presented this project at a meeting with the deputy prime minister, several ministers and high-ranking secretaries of the Central Committee of the PZPR (Polish United Workers' Party) at about 4 A.M. on April 29th. The most important measure recommended, and accepted after a short discussion by this mixture of government and party officials, was stable iodine prophylaxis to protect the thyroid glands of children against iodine-131 irradiation. Administration of stable iodine in liquid form (as Lugol's solution) was initiated in the northeastern part of Poland approximately 38 hours after we discovered the Chernobyl fallout (at approximately midnight on April 28th). Treatment continued for the next three days, and about 18.5 million people, including adults, received the stable iodine drug.
We were able to perform this action successfully because we had already made plans for implementing nuclear war emergency measures. In the 1960s our institute had recommended that the government prepare for such an event by distributing strategic stores of stable iodine at sites all over the country, as the only reasonable measure against body contamination from fission products. The program was implemented in the early 1970s, and each Polish pharmacy, hospital and various other institutions had large supplies of iodine. At the time of the Chernobyl accident Poland had more than enough iodine ready for use: approximately 100 doses for each Polish citizen. A few years after the catastrophe it was estimated that in the more contaminated parts of the country the average thyroid radiation dose in the 1 to 10 year old age group was about 70 mSv, and that in about 5% of children the maximum dose was about 200 mSv (Krajewski 1991). A decade later we learned that among those of the more than 34,000 Swedish patients who were not suspected of thyroid cancer, and whose thyroids were irradiated with iodine-131 up to a dose of 40,000 mSv (average dose 1,100 mSv), there was no statistically significant increase in thyroid cancers, but rather a 38% decrease in their incidence (Dickman et al. 2003; Hall et al. 1996; Holm et al. 1988). Had I known then what I know today, I would not have recommended such a vast prophylactic action to the Polish government, not because of its allegedly adverse medical effects (there were none; Nauman 1989), but because its practical positive health effect was negligible.
The most nonsensical, expensive and harmful action, however, was the evacuation of 336,000 people from contaminated regions of the former Soviet Union, where the radiation dose from Chernobyl fallout was about twice the natural dose. Later the intervention level was decreased to below even the natural level, some five times lower than the radiation dose rate of 5.25 mSv/year at Grand Central Station in New York City (Benenson et al 2006). “Contaminated areas” were defined as those where the average cesium-137 ground deposition density exceeded 37 kBq per m2. In the Soviet Union these areas covered 146,100 km2. Chernobyl fallout of about 185 kBq/m2 or more also covered large areas of Austria, Bulgaria, Finland, Norway and Sweden (UNSCEAR 2000b). Small areas with Chernobyl fallout reaching up to about 185 kBq/m2 were also found in other countries (Great Britain, Greece, Romania, Switzerland and Turkey) (EUR 1996). Radiation doses received in areas with a cesium-137 deposition density of about 37 kBq/m2 were about 1.6 mSv during the first year after the Chernobyl accident, and the lifetime dose (after 70 years) was predicted to reach 6 mSv (UNSCEAR 1988). This activity level is ten times lower than the average amount (400 kBq per m2) of about 50 natural radionuclides present in a 10 cm thick layer of soil (Jaworowski 2002). The corresponding Chernobyl lifetime radiation dose is 28 times lower than the average natural lifetime dose of about 170 mSv. But the annual dose from 37 kBq of cesium-137 per m2 was similar to the 1 mSv/year dose limit recommended by the ICRP for the general population, and this is why it was accepted by the Soviet authorities as a yardstick for remedial measures.
The evacuation caused great harm to the populations of Belarus, Russia and the Ukraine. It led to mass psychosomatic disturbances, great economic loss and traumatic social consequences. According to Academician Leonid A. Ilyin, the leading Russian authority on radiation protection, the mass relocation was implemented by the Soviet government under the pressure of populists, ecologists and self-appointed “specialists”, and it was done against the advice of the best Soviet scientists (Ilyin 1995; Ilyin 1996). The really dangerous air radiation dose rate of 1 Gy/h on 26 April 1986 (0.01 Gy/h 2 days later) covered an uninhabited area of only about 0.5 km2 in two patches reaching up to a distance of 1.8 km southwest of the Chernobyl reactor (UNSCEAR 2000b).
Based on these data there was no valid radiological reason for the masterly evacuation of 49,614 residents from the city of Prypyat and the village of Yanov, situated about 3 km from the burning reactor. In these settlements the radiation dose rate in the air on 26 April 1986 was 1 mSv/h (UNSCEAR 2000b), and two days later it was only 0.01 mSv/h. Thus, with steadily decreasing fallout radioactivity, the dose rate was not dangerous at all. However, according to L.A. Ilyin, one of the leaders of the Chernobyl rescue team, there was a danger that the “corium” (the melted core of the reactor, with a total volume of ~200 m3, a mass of ~540 tons and a temperature of about 2000°C) might penetrate down through the concrete floor and spread to the rooms below. The team suspected that these rooms could have held a great volume of water with which the corium could come into contact. This would have led to a much more powerful explosion than the initial one and caused a vastly greater emission of radioactivity that could have covered Prypyat and Yanov with lethal fallout. Therefore, the evacuation of the whole population of these localities was a correct precautionary measure, carried out in an orderly manner in only two hours. But the evacuation and relocation of the remaining approximately 286,000 people, of whom about 220,000 were relocated after 1986 (UNSCEAR 2000b), was an irrational overreaction induced in part by the influence of the ICRP and IAEA recommendations based on the LNT (Ilyin 1995). The current reluctance of the Ukrainian authorities to resettle residents back in Prypyat (now a slowly decaying ghost town and tourist attraction) does not seem rational. The radiation dose rate measured on April 10, 2008 in the streets of this city ranged from 2.5 to 8.4 mSv/year, i.e., more than 10 times lower than natural radiation levels in many regions of the world (Fornalski 2009) (Figure 2).
Besides the 28 fatalities among rescue workers and employees of the power station due to very high doses of radiation (2.9 – 16 Gy), and 3 deaths due to other causes (UNSCEAR 2000b), the only real adverse health consequences of the Chernobyl catastrophe among the approximately five million people living in the contaminated regions were epidemics of psychosomatic afflictions that appear as diseases of the digestive and circulatory systems and other post-traumatic stress disorders such as sleep disturbance, headache, depression, anxiety, escapism, “learned helplessness”, unwillingness to cooperate, overdependence, alcohol and drug abuse, and suicides. These diseases and disturbances could not have been due to the minute radiation doses from the Chernobyl fallout (an average dose rate of about 1 – 2 mSv/year); they were caused by radiophobia (a deliberately induced fear of radiation) aggravated by wrongheaded administrative decisions and even, paradoxically, by increased medical attention, which led to diagnoses of subclinical changes that persistently hold the attention of the patient. Bad administrative decisions made several million people believe that they were “victims of Chernobyl”, although the average annual dose they received from “Chernobyl” radiation was only about one third of the average natural dose. This was the main factor responsible for the economic losses caused by the Chernobyl catastrophe, estimated to have reached $148 billion by 2000 for the Ukraine and to reach $235 billion by 2016 for Belarus.
Psychological factors and the failure to teach radiological protection in medical school curricula might have led to abortions of wanted pregnancies in Western Europe soon after the accident, when physicians wrongly advised patients that Chernobyl radiation posed a health risk to unborn children. However, numerical estimates of this effect (Ketchum 1987; Spinelli and Osborne 1991) cast doubt on this assumption. Similarly uncertain are estimates of the number of decisions against conception probably taken in Europe during the first few months after the accident (Trichopoulos et al 1987). This problem was discussed in 1987 by an IAEA Advisory Group, which concluded that medical practitioners in direct contact with the population at large are among the most important people who might foster a correct perception of risks in nuclear emergencies, prevent social panic and overreaction, and help to ensure rational behavior in society. After the Chernobyl accident the public very often turned for help to medical practitioners, but physicians were unable to provide realistic advice even on minor problems. This was because medical curricula at that time did not prepare doctors for nuclear emergencies. In none of the nine countries represented at the meeting were the principles of radiobiology and radiation protection included in medical school curricula (IAEA 1987). Lack of knowledge in this important group was among the factors that increased public anxiety and stress. It seems that now, two decades later, the situation in this respect is very much the same.
In 2000 the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR 2000b) and in 2006 the United Nations (UN) Chernobyl Forum (a group composed of representatives from 8 UN organizations, the World Bank and the governments of Belarus, Russia and the Ukraine) stated in their documents that, except for thyroid cancers in the population of highly contaminated areas, no increase in the incidence of solid tumors or leukemia, and no increase in genetic diseases, was observed. An increase in registration of thyroid cancers in children under 15 years old was first found in 1987, one year after the accident, in the Bryansk region of Russia, and the greatest incidence, 0.027%, was found in 1994. Both of these findings came too early to be in agreement with what we know about radiation-induced cancers: the mean latency period for malignant thyroid tumors in adults and children exposed to external and internal medical irradiation with <20 to >40 Gy is about 28 years (Kikuchi et al 2004; UNSCEAR 2000b). Kikuchi et al (2004) tried to explain the discrepancy between the clinical experience and the Chernobyl findings by some exotic ideas, such as, for example, “radiation leakage or other environmental conditions, exposure to carcinogens that occurred near Chernobyl prior to the nuclear accident, or that the population is genetically predisposed to thyroid cancer”. However, mass screening and diagnostic suspicion, already flourishing in 1987, offer a more parsimonious explanation.
The approximately 4000 new thyroid cancers registered among children from Belarus, Russia and the Ukraine should be viewed against the extremely high prevalence of dormant, subclinical malignant thyroid tumors, which contain transformed tumor cells and are quite common in the general population (Akslen and Naumov 2008; Weinberg 2008). This is exemplified by occult thyroid cancers, the incidence of which varies from 5.6% in Colombia, 9.0% in Poland, 9.3% in Minsk, Belarus, and 13% in the United States, to 28% in Japan and 35.6% in Finland (Harach et al 1985; Moosa and Mazzaferri 1997). In Finland occult thyroid cancers are observed in 2.4% of children (Harach et al 1985), i.e., some 90 times more than the maximum observed in the Bryansk region. In Minsk, Belarus the normal incidence of occult thyroid cancers is 9.3% (Furmanchuk et al 1993). The “Chernobyl” thyroid cancers are of the same histological type and are similar in invasiveness to the “occult cancers” (Moosa and Mazzaferri 1997; Tan and Gharib 1997). Since 1995 the number of registered cancers has tended to decline. This does not agree with what we know about radiation-induced thyroid cancers, whose latency period is about 5 – 10 years after exposure (Inskip 2001) and whose risk increases until 15 – 29 years after exposure (UNSCEAR 2000a). In the United States the incidence rate of thyroid tumors detected between 1974 and 1979 during a screening program was 21 times higher than before the screening (Ron et al 1992), an increase similar to that observed in the three former Soviet countries. It appears that the increased registration of thyroid cancers in contaminated parts of these countries is a classical screening effect.
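The screening-effect argument above rests on simple ratios between the quoted prevalence figures. A quick sanity check, using only the percentages cited in the text and nothing external, reproduces them:

```python
# Sanity check of the screening-effect ratios quoted in the text.
# All inputs are the percentages cited above; no external data are assumed.

occult_finnish_children = 2.4   # % occult thyroid cancer in Finnish children (Harach et al 1985)
occult_finnish_adults = 35.6    # % occult thyroid cancer in Finland, autopsy series
bryansk_peak_incidence = 0.027  # % peak registered childhood incidence, Bryansk region, 1994

# Ratio for children: matches the text's "some 90 times more"
print(f"Children: ~{occult_finnish_children / bryansk_peak_incidence:.0f}x")  # ~89x

# Ratio for the highest occult prevalence: matches "more than 1300 times"
print(f"Adults:   ~{occult_finnish_adults / bryansk_peak_incidence:.0f}x")    # ~1319x
```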
According to the regulations of the Belarusian Ministry of Health, the thyroids of all people who were younger than 18 in 1986, and those of every inhabitant of “contaminated areas”, must be examined every year (Parshkov et al 2004). More than 90% of children in contaminated areas are now screened for thyroid cancer every year with ultrasonography (USG) and other methods. It is obvious that screening on such a vast scale, probably the greatest in the history of medicine, resulted in the discovery of thousands of “occult” cancers, or “incidentalomas”, that had grown to forms detectable by modern diagnostic methods which were not in routine use in the Soviet Union before 1986.
Data for the past 20 years published by Ivanov et al (2004) and cited in the UNSCEAR and Chernobyl Forum documents (Forum 2005; Forum 2006; Ivanov et al 2004; UNSCEAR 2008) show, in comparison with the Russian general population, a 15% to 30% lower mortality from solid tumors among the Russian Chernobyl emergency workers and a 5% lower average solid tumor incidence among the population of the Bryansk district, the most contaminated in Russia (Figures 3 and 4). In the most exposed group of these people (with an estimated average radiation dose of 40 mSv) a 17% decrease in the incidence of solid tumors of all kinds was found. In the Bryansk district the leukemia incidence is not higher than in the Russian general population. According to UNSCEAR (2000b), no increase in birth defects, congenital malformations, stillbirths or premature births could be linked to radiation exposures caused by the Chernobyl fallout. The final conclusion of the UNSCEAR 2000b report is that the population of the three main contaminated areas with a cesium-137 deposition density greater than 37 kBq/m² “need not live in fear of serious health consequences”; it forecasts that “generally positive prospects for the future health of most individuals should prevail”.
The publications of the UN Chernobyl Forum (2005, 2006) present a rather balanced overview of the Chernobyl health problems, but with three important exceptions. The first (following mainly Cardis et al 2005) is ignoring or downplaying the effect of screening about 90% of the population for thyroid cancers (see the discussion above), and interpreting the results with a linear no-threshold dose-response model. The Cardis et al (2005) paper, however, was criticized by Scott (2006) for this interpretation, which was not confirmed by the data presented. Both the Chernobyl Forum (2005, 2006) and the Cardis et al (2005, 2006) papers ignore the aforementioned fundamental problem of occult thyroid cancers in the former Soviet Union and elsewhere in Europe. The incidence of occult thyroid cancers increased rapidly after the advent of new USG diagnostics (Topliss 2004). Reaching up to 35.6% (see above), this incidence is more than 1300 times higher than the maximum thyroid cancer incidence found in the Bryansk region, Russia in 1994 (UNSCEAR 2000b), which implies a vast potential for bias. It seems that no epidemiological study of temporal changes in the intensity of thyroid screening in the former Soviet Union has yet been performed; conclusions from epidemiological studies that do not take these changes into account may be invalid. In the Bryansk region, Russia, the thyroid cancer incidence was found to be 45% higher in males and 90% higher in females than for the whole Russian population. However, when dose-response analyses were performed using external and internal comparisons, no positive association of thyroid cancers with radiation dose was observed, but rather a negative one, i.e., a hormetic effect (Ivanov et al 2004). These results strongly suggest that the increased cancer rates in Bryansk (and by implication in other contaminated regions) compared with general population rates are due to thyroid cancer screening and better reporting rather than radiation exposure (Ron 2007).
In her interpretation of the thyroid cancer data, Ron also did not take the occult thyroid cancer issue into account. Perhaps even more important, both her paper and that of Cardis et al (2006) ignored a decrease of up to 38% in thyroid cancer incidence after diagnostic irradiation with iodine-131 of many thousands of non-cancer Swedish patients, who received doses similar to or higher than those received from the Chernobyl fallout by inhabitants of the post-Soviet countries (Dickman et al 2003; Hall et al 1996; Holm et al 1991; Holm et al 1988).
The second problem with the Chernobyl Forum (2005, 2006) reports is the estimation of deaths among the patients with acute radiation disease. From among the 134 persons with this disease, who had been exposed to extremely high radiation doses, 31 died soon after the accident. Among the 103 survivors, 19 died before 2004. Most of these deaths were due to such disorders as lung gangrene, coronary heart disease, tuberculosis, liver cirrhosis, fat embolism and other conditions that can hardly be ascribed to ionizing radiation. But the Chernobyl Forum (2005, 2006) presents them as resulting from high irradiation and sums them up to a total of approximately 50 victims of acute irradiation. After many summers all 103 survivors will eventually die; the Chernobyl Forum (2005, 2006) philosophy would then count them all, yielding a round total of 134 victims of high irradiation. In fact, the mortality rate among these 103 survivors was 1.08% per year, i.e., less than the average mortality rate of 1.5% in the three affected countries in 2000 (GUS 1991).
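The quoted 1.08% per year can be checked from the figures given above. The 17-year observation window (1987 through 2003) is an assumption made here to reproduce the quoted rate; the source does not state the averaging period explicitly:

```python
# Reproduce the quoted ~1.08%/year mortality rate among the 103 ARS survivors.
# NOTE: the 17-year observation window (1987-2003) is an assumption made to
# match the quoted figure; the source does not state the period explicitly.

survivors = 103
deaths_before_2004 = 19
years = 17  # assumed: 1987 through 2003 inclusive

annual_rate = deaths_before_2004 / survivors / years
print(f"{annual_rate:.2%} per year")  # close to the quoted 1.08%/year
```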
And finally, the third “Forum problem” is the projection of future fatalities caused by low-level Chernobyl radiation, ranging from 4000 up to exactly 9935 deaths. These numbers are not based on the epidemiological data on cancer mortality observed during the past 20 years by Ivanov et al (2004), which demonstrated no such increase but rather a decrease of solid tumor and leukemia deaths among exposed people. These epidemiological data, rather than the LNT assumption, should be used as the basis for a realistic projection of the future health of the millions of people officially labeled “victims of Chernobyl”. However, the Chernobyl Forum (2005, 2006) instead chose to use the LNT radiation risk model (ICRP 1991) and performed a simplistic arithmetical exercise: multiplying small doses by a great number of people and applying a radiation risk factor deduced from the Hiroshima and Nagasaki studies. People living in areas highly contaminated by the Chernobyl fallout were irradiated over a protracted time; the dose rates in Hiroshima and Nagasaki were higher by a factor of about 10¹¹ than the average dose rate of the “Chernobyl victims” used in the Forum’s projections. The result of this exercise is nothing more than a fibbing fantasy. Several scientific and radiation protection bodies, including UNSCEAR, the Health Physics Society (Mossman et al 1996), the French Academy of Science (Tubiana 1998), and even the chairman of the International Commission on Radiological Protection (Clarke 1999), advised against making such calculations. Merely publishing these numbers is harmful and petrifies the Chernobyl fears. Any effort to explain the intricacies of radiation risk assessment to the public, or to compare these numbers with the much higher level of spontaneous cancer deaths, will be a futile exercise; the past twenty years proved that such efforts are worthless. Making such calculations keeps a lot of people busy and well paid, but it has no relation to reality or honesty.
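The “simplistic arithmetical exercise” criticized above amounts to multiplying a collective dose by a nominal risk coefficient. A minimal sketch follows; the 0.05/Sv fatal-cancer coefficient is indeed the ICRP (1991) nominal value, but the population size and average dose below are purely illustrative placeholders, not the Chernobyl Forum’s actual inputs:

```python
# Sketch of an LNT-style projection: deaths = collective dose x risk factor.
# risk_per_sv = 0.05 is the ICRP (1991) nominal fatal-cancer coefficient;
# population and avg_dose_sv are illustrative assumptions only, NOT the
# Chernobyl Forum's actual inputs.

risk_per_sv = 0.05        # fatal cancers per person-sievert (ICRP 60)
population = 5_000_000    # illustrative: people in "contaminated" areas
avg_dose_sv = 0.02        # illustrative lifetime dose of 20 mSv per person

collective_dose = population * avg_dose_sv          # 100,000 person-Sv
projected_deaths = collective_dose * risk_per_sv
print(f"Projected deaths: {projected_deaths:.0f}")  # 5000 with these inputs
```

The arithmetic shows why the method is so easily inflated: any tiny per-person dose, multiplied over millions of people, yields thousands of notional deaths regardless of whether any excess mortality is actually observed.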
The Chernobyl Forum (2005, 2006) elucubrations pale in the face of recent estimates by other bodies (Greenpeace 2006; Vidal 2006) predicting millions of Chernobyl cancers and hundreds of thousands of deaths.
It is reassuring, however, that sixteen years after the Chernobyl catastrophe another group, composed of four UN organizations (the United Nations Development Programme – UNDP; the United Nations International Children’s Emergency Fund – UNICEF; the World Health Organization – WHO; and the United Nations Office for the Coordination of Humanitarian Affairs – UN-OCHA), dared to state in its 2002 report, based on UNSCEAR studies, that a great part of the billions of dollars used to mitigate the consequences of the Chernobyl accident was spent incorrectly. The dollars spent in these efforts did not improve but actually worsened a deteriorating situation for the 7 million so-called “victims of Chernobyl” and petrified the psychological effects of the catastrophe and the wrong decisions of the authorities. The report (UNDP 2002) recommended that the three post-soviet countries and the international organizations abandon the current policy, whose misguided basis, i.e., the expectation of mass radiation health effects, was responsible for the enormous resources uselessly expended on remediation efforts. The report presented 35 practical recommendations needed to stop the vicious cycle of Chernobyl frustrations, social degradation, pauperization and the epidemic of psychosomatic disorders. The recommendations call for a reversal of the policy of concentrating attention on nonexistent radiation hazards, and for allowing the relocated individuals to return to their old settlements, i.e., for removing essentially all of the restrictions.
But here we enter a political minefield. How well will people accept losing the mass benefits (equivalent to about $40 a month) that they poetically call a “coffin bonus”? How can it be explained to them that they were made to believe they were the “victims” of a non-existent hazard, that the mass evacuations were an irresponsible error, that for twenty years people were unnecessarily exposed to suffering and need, that vast areas were unnecessarily barred from use, and that their countries’ resources were incredibly squandered? One can read in many publications that the Chernobyl catastrophe had serious political implications, becoming an important factor in the dismantling of the Soviet Union and in attempts to control nuclear arms. As Mikhail Gorbachev stated: “The nuclear meltdown at Chernobyl 20 years ago … even more than my launch of perestroika, was perhaps the real cause of the collapse of the Soviet Union five years later. … Chernobyl opened my eyes like nothing else: it showed the horrible consequences of nuclear power … One could now imagine much more clearly what might happen if a nuclear bomb exploded … one SS-18 rocket could contain a hundred Chernobyls. Unfortunately, the problem of nuclear arms is still very serious today.” (Gorbachev 2006).
Would fulfilling the recommendations of the UNDP 2002 report again result in a political catharsis, and perhaps induce violent reactions? Probably not in Russia, where a more rational approach to Chernobyl prevails. But the political classes of Belarus and Ukraine have for years demonstrated a much more emotional approach. When the UNSCEAR (2000a) report, documenting the low incidence of serious health hazards resulting from the Chernobyl accident, was presented to the UN General Assembly, the Belarus and Ukraine delegations lodged a fulminating protest. This protest in 2002 set the stage for the Chernobyl Forum and helped to focus its agenda.
The Chernobyl rumble and emotions are beginning to settle down. In the centuries to come the catastrophe will be remembered as proof that nuclear power is a safe means of energy production. It might even change the thinking of the ICRP.
This paper was greatly improved by comments and editorial help from Dr. Norman Kelker, Enso Life Sciences, Farmington, N.Y.