Can Low Doses of Radiation Cause More Damage than High Doses?
The New York Times’ Matthew Wald reports today:
The Bulletin of the Atomic Scientists’ May-June issue carries seven articles and an editorial on the subject of low-dose radiation, a problem that has thus far defied scientific consensus but has assumed renewed importance since the meltdown of the Fukushima Daiichi reactors in Japan in March 2011.
This month a guest editor, Jan Beyea – who received a PhD in nuclear physics from Columbia, has served on a number of committees at the National Research Council of the National Academies of Science, and worked on epidemiological studies at Three Mile Island – takes a hard look at the power industry.
The bulletin’s Web site is generally subscription-only, but this issue can be read at no charge.
Dr. Beyea challenges a concept adopted by American safety regulators about small doses of radiation. The prevailing theory is that the relationship between dose and effect is linear – that is, that if a big dose is bad for you, half that dose is half that bad, and a quarter of that dose is one-quarter as bad, and a millionth of that dose is one-millionth as bad, with no level being harmless.
The idea is known as the “linear no-threshold hypothesis,” and while most scientists say there is no way to measure its validity at the lower end, applying it constitutes a conservative approach to public safety.
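The scaling the hypothesis implies can be sketched in a few lines (a minimal illustration; the 5%-per-sievert slope is an assumed placeholder chosen for this sketch, not a regulatory value):

```python
# Illustration of the linear no-threshold (LNT) hypothesis described
# above: predicted harm is strictly proportional to dose, with no
# dose considered completely harmless. The slope is an assumed
# placeholder, not an official regulatory value.

RISK_PER_SIEVERT = 0.05  # assumed slope, for illustration only

def lnt_risk(dose_sv: float) -> float:
    """Excess risk under LNT: linear in dose, zero only at zero dose."""
    return RISK_PER_SIEVERT * dose_sv

# Halving the dose halves the predicted harm, all the way down:
print(lnt_risk(1.0))   # 0.05
print(lnt_risk(0.5))   # 0.025
print(lnt_risk(1e-6))  # tiny, but never exactly zero under LNT
```

The point of the model is the last line: however small the dose, the predicted harm never reaches zero.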
Some radiation professionals disagree, arguing that there is no reason to protect against supposed effects that cannot be measured. But Dr. Beyea contends that small doses could actually be disproportionately worse.
Radiation experts have formed a consensus that if a given dose of radiation delivered over a short period poses a given hazard, that hazard will be smaller if the dose is spread out. To use an imprecise analogy, if swallowing an entire bottle of aspirin at one sitting could kill you, consuming it over a few days might merely make you sick.
In radiation studies, this is called a dose rate effectiveness factor. Generally, a spread-out dose is judged to be half as harmful as a dose given all at once.
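In sketch form, the conventional adjustment works like this (the slope and function names are invented for illustration, using the factor-of-two reduction the text describes):

```python
# Sketch of a dose rate effectiveness factor (DREF): the conventional
# assumption that the same total dose, spread out over time, is
# judged half as harmful (DREF = 2). Slope and function names are
# invented for illustration.

RISK_PER_SIEVERT = 0.05  # assumed LNT slope, for illustration only
DREF = 2.0               # conventional dose rate effectiveness factor

def acute_risk(dose_sv: float) -> float:
    """Predicted excess risk for a dose delivered all at once."""
    return RISK_PER_SIEVERT * dose_sv

def protracted_risk(dose_sv: float) -> float:
    """Same total dose spread over time: risk divided by the DREF."""
    return acute_risk(dose_sv) / DREF

print(acute_risk(2.0))       # 0.1
print(protracted_risk(2.0))  # 0.05 -- judged half as harmful
```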
Dr. Beyea, however, proposes that doses spread out over time might be more dangerous than doses given all at once. He suggests two reasons: first, some effects may result from genetic damage that manifests itself only after several generations of cells have been exposed, and, second, a “bystander effect,” in which a cell absorbs radiation and seems unhurt but communicates damage to a neighboring cell, which can lead to cancer.
One problem in the radiation field is that little of the data on hand addresses the problem of protracted exposure. Most of the health data used to estimate the health effects of radiation exposure comes from survivors of the Hiroshima and Nagasaki bombings of 1945. That was mostly a one-time exposure.
Scientists who say that this data leads to the underestimation of radiation risks cite another problem: it does not include some people who died from radiation exposure immediately after the bombings. The notion here is that the people studied in ensuing decades to learn about the dose effect may have been stronger and healthier, which could have played a role in their survival.
Still, the idea that the bomb survivor data is biased, or that stretched-out doses are more dangerous than instant ones, is a minority position among radiation scientists.
Dr. Beyea writes:
Three recent epidemiologic studies suggest that the risk from protracted exposure is no lower, and in fact may be higher, than from single exposures.
Conventional wisdom was upset in 2005, when an international study, which focused on a large population of exposed nuclear workers, presented results that shocked the radiation protection community—and foreshadowed a sequence of research results over the following years.
It all started when epidemiologist Elisabeth Cardis and 46 colleagues surveyed some 400,000 nuclear workers from 15 countries in North America, Europe, and Asia—workers who had experienced chronic exposures, with doses measured on radiation badges (Cardis et al., 2005).
This study revealed a higher incidence for protracted exposure than that found in the atomic-bomb data, representing a dramatic contradiction to expectations based on expert opinion.
A second major occupational study appeared a few years later, delivering another blow to the theory that protracted doses were not so bad. This 2009 report looked at 175,000 radiation workers in the United Kingdom ….
After the UK update was published, scientists combined results from 12 post-2002 occupational studies, including the two mentioned above, concluding that protracted radiation was 20 percent more effective in increasing cancer rates than acute exposures (Jacob et al., 2009). The study’s authors saw this result as a challenge to the cancer-risk values currently assumed for occupational radiation exposures. That is, they wrote that the radiation risk values used for workers should be increased over the atomic-bomb-derived values, not lowered by a factor of two or more.
In 2007, one study—the first of its size—looked at low-dose radiation risk in a large, chronically exposed civilian population; among the epidemiological community, this data set is known as the “Techa River cohort.” From 1949 to 1956 in the Soviet Union, while the Mayak weapons complex dumped some 76 million cubic meters of radioactive waste water into the river, approximately 30,000 of the off-site population—from some 40 villages along the river—were exposed to chronic releases of radiation; residual contamination on riverbanks still produced doses for years after 1956.
Here was a study of citizens exposed to radiation much like that which would be experienced following a reactor accident. About 17,000 members of the cohort have been studied in an international effort (Krestinina et al., 2007), largely funded by the US Energy Department; and to many in the department, this study was meant to definitively prove that protracted exposures were low in risk. The results were unexpected. The slope of the LNT fit turned out to be higher than predicted by the atomic-bomb data, providing additional evidence that protracted exposure does not reduce risk.
In a 2012 study on atomic-bomb survivor mortality data (Ozasa et al., 2012), low-dose analysis revealed unexpectedly strong evidence for the applicability of the supralinear theory. From 1950 to 2003, more than 80,000 people studied revealed high risks per unit dose in the low-dose range, from 0.01 to 0.1 Sv.
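The arithmetic gap between the conventional assumption and the pooled estimate quoted above can be made explicit (a sketch with an arbitrary baseline figure; only the ratios matter):

```python
# The same protracted dose, under two views. The baseline acute-risk
# number is an arbitrary placeholder; only the ratios matter.

acute_risk = 0.05  # placeholder risk per dose, from acute-exposure data

# Conventional view: protracted exposure is half as harmful (DREF = 2).
conventional = acute_risk / 2.0

# Jacob et al. (2009) pooled estimate: protracted exposure is about
# 20 percent MORE effective at raising cancer rates than acute exposure.
pooled_estimate = acute_risk * 1.2

print(round(conventional, 6))                    # 0.025
print(round(pooled_estimate, 6))                 # 0.06
print(round(pooled_estimate / conventional, 6))  # 2.4
```

The two views thus differ by a factor of about 2.4 for the same protracted dose, which is why the study's authors argued the occupational risk values should be raised rather than halved.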
We pointed out last year:
The Bulletin of the Atomic Scientists reported that one of the best-known scientists of the 20th century – Dr. John Gofman – also believed that chronic low-level radiation is more dangerous than acute exposure to high doses. Gofman was a doctor of nuclear and physical chemistry and a medical doctor who worked on the Manhattan Project. He co-discovered uranium-232 and -233 and other radioactive isotopes and proved their fissionability, and helped discover how to extract plutonium. He led the team that discovered and characterized lipoproteins in the causation of heart disease, served as Professor Emeritus of Molecular and Cell Biology at the University of California, Berkeley, and as Associate Director of the Livermore National Laboratory, was asked by the Atomic Energy Commission to undertake a series of long-range studies on potential dangers that might arise from the “peaceful uses of the atom,” and wrote four scholarly books on radiation health effects.
Other experts have made the same claim:
Even low level radiation can cause big problems. Columbia provides an illustration:
Radiation can sicken or kill us by directly damaging cells:
Or indirectly … by producing free radicals:
Scientists from the Institute of Nuclear Science claim in the Archive of Oncology:
Chronic exposure to low radiation doses could be much more harmful than high, short-term doses because of lipid peroxidation initiated by free radicals.
Peroxidation of cell membranes increases with decreasing dose rate (Petkau effect).
(See this for more on the Petkau effect.)
Low Doses Can Cause Big Problems
Whether or not low doses of radiation are more dangerous than high doses, one thing is clear: repeated exposure to low doses of radiation can cause cancer.
Yet governments worldwide are raising acceptable radiation levels based upon politics.
Indeed, the Department of Energy is trying to replace the widely accepted model of the dangers of low-dose radiation with one based on voodoo science. Specifically, DOE’s Lawrence Berkeley Labs recently used a mutant line of human cells in a petri dish that was able to repair damage from low doses of radiation, and extrapolated to the unsupported conclusion that everyone is immune to low doses of radiation: