LNT, UNSCEAR and the NRC “State-of-the-Art Reactor Consequence Analyses”

UNSCEAR 2012 “Therefore, the Scientific Committee does not recommend multiplying very low doses by large numbers of individuals to estimate numbers of radiation-induced health effects within a population exposed to incremental doses at levels equivalent to or lower than natural background levels;”

The main NRC SOARCA page indexes the definitive 2012 NRC severe accident study. The study is large, so I’ll rely on the NRC’s own summary:

SOARCA’s main findings fall into three basic areas: how a reactor accident progresses; how existing systems and emergency measures can affect an accident’s outcome; and how an accident would affect the public’s health. The project’s preliminary findings include:

  • Existing resources and procedures can stop an accident, slow it down or reduce its impact before it can affect public health;
  • Even if accidents proceed uncontrolled, they take much longer to happen and release much less radioactive material than earlier analyses suggested; and
  • The analyzed accidents would cause essentially zero immediate deaths and only a very, very small increase in the risk of long-term cancer deaths.

Rod Adams posted his thorough analysis of SOARCA here, which he summarizes thus:

  • The individual early fatality risk from SOARCA scenarios is essentially zero.
  • Individual LCF risk from the selected specific, important scenarios is thousands of times lower than the NRC Safety Goal and millions of times lower than the general cancer fatality risk in the United States from all causes, even assuming the LNT dose-response model.

If I may underscore that last point: even assuming the LNT dose-response model. For more plain English, here’s UK environmentalist Mark Lynas in Why Fukushima death toll projections are based on junk science:

As the Health Physics Society explains[1] in non-scientific language anyone can understand:

…the concept of collective dose has come under attack for some misuses. The biggest example of this is in calculating the numbers of expected health effects from exposing large numbers of people to very small radiation doses. For example, you might predict that, based on the numbers given above, the population of the United States would have about 40,000 fatal cancers from background radiation alone. However, this is unlikely to be true for a number of reasons. Recently, the International Commission on Radiological Protection issued a position statement saying that the use of collective dose for prediction of health effects at low exposure levels is not appropriate. The reason for this is that if the most highly exposed person receives a trivial dose, then everyone’s dose will be trivial and we can’t expect anyone to get cancer. [my emphasis]

The HPS illustrates this commonsensical statement with the following analogy:

Another way to look at it is that if I throw a 1-gram rock at everyone in the United States then, using the collective dose model, we could expect 270 people to be crushed to death because throwing a one-ton rock at someone will surely kill them. However, we know this is not the case because nobody will die from a 1-gram rock. The Health Physics Society also recommends not making risk estimates based on low exposure levels.
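The arithmetic behind the HPS analogy is easy to check. As a sketch, the 270 figure corresponds to a U.S. population of roughly 270 million at the time the analogy was written:

```python
# Collective-dose-style arithmetic applied to the HPS rock analogy.
population = 270_000_000  # approximate U.S. population when the analogy was written
rock_grams = 1            # each person is hit by a 1-gram rock
lethal_kg = 1000          # a one-ton (1,000 kg) rock is assumed to be lethal

# "Collective mass" summed over the whole population, in kilograms
collective_kg = population * rock_grams / 1000

# The collective-dose logic divides the summed mass by the lethal mass
predicted_deaths = collective_kg / lethal_kg
print(predicted_deaths)  # 270.0 -- the absurd prediction; actual deaths: zero
```

The same summing-then-dividing move is what turns trivially small radiation doses into headline-grabbing cancer counts.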

James Conca explains the UNSCEAR 2012 report, which finally drove a stake into the heart of LNT:

The United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) (UNSCEAR 2012) submitted the report that, among other things, states that uncertainties at low doses are such that UNSCEAR “does not recommend multiplying low doses by large numbers of individuals to estimate numbers of radiation-induced health effects within a population exposed to incremental doses at levels equivalent to or below natural background levels.” (UNDOC/V1255385)

You know, like everyone’s been doing since Chernobyl. Like everyone’s still doing with Fukushima.

Finally, the world may come to its senses and not waste time on the things that aren’t hurting us and spend time on the things that are. And on the people that are in real need. Like the infrastructure and economic destruction wrought by the tsunami, like cleaning up the actual hot spots around Fukushima, like caring for the tens of thousands of Japanese living in fear of radiation levels so low that the fear itself is the only thing that is hurting them, like seriously preparing to restart their nuclear fleet and listening to the IAEA and the U.S. when we suggest improvements.

The advice on radiation in this report will clarify what can, and cannot, be said about low dose radiation health effects on individuals and large populations. Background doses going from 250 mrem (2.5 mSv) to 350 mrem (3.5 mSv) will not raise cancer rates or have any discernible effects on public health. Likewise, background doses going from 250 mrem (2.5 mSv) to 100 mrem (1 mSv) will not decrease cancer rates or affect any other public health issue.

Note – although most discussions assume acute doses (all at once), the same amount delivered as a chronic dose (metered out over a longer period, such as a year) has even less effect. So 10 rem (0.1 Sv) per year, whether acute or chronic, has no observable effect, while 10 rem per month might.

UNSCEAR also found no observable health effects from last year’s nuclear accident in Fukushima. No effects.

The Japanese people can start eating their own food again, and moving back into areas only lightly contaminated with radiation levels that are similar to background in many areas of the world like Colorado and Brazil.

Low-level contaminated soil, leaves and debris in Fukushima Prefecture piling up in temporary storage areas. (Photo by James Hackett, RJLee Group)

The huge waste of money that is passing for clean-up now by just moving around dirt and leaves (NYTimes) can be focused on clean-up of real contamination near Fukushima using modern technologies. The economic and psychological harm wrought by the wrong-headed adoption of linear no-threshold dose effects for doses less than 0.1 Sv (10 rem) has been extremely harmful to the already stressed population of Japan, and to continue it would be criminal.

To recap LNT, the Linear No-Threshold Dose hypothesis is a supposition that all radiation is deadly and there is no dose below which harmful effects will not occur. Double the dose, double the cancers. First put forward after WWII by Hermann Muller, and adopted by the world body, including UNSCEAR, its primary use was as a Cold War bargaining chip to force cessation of nuclear weapons testing. The fear of radiation that took over the worldview was a side-effect (Did Muller Lie?).
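In model terms the hypothesis is a single straight line through the origin. A minimal sketch contrasting it with a threshold model (the 0.05-per-sievert risk coefficient is roughly the nominal value often quoted alongside LNT, and the 100 mSv threshold is purely illustrative):

```python
def lnt_excess_cancers(dose_sv, population, risk_per_sv=0.05):
    """LNT: predicted excess cancers scale linearly with collective dose, no threshold."""
    return dose_sv * population * risk_per_sv

def threshold_excess_cancers(dose_sv, population, threshold_sv=0.1, risk_per_sv=0.05):
    """Threshold model: doses at or below the cutoff cause no predicted effect."""
    if dose_sv <= threshold_sv:
        return 0.0
    return (dose_sv - threshold_sv) * population * risk_per_sv

# A trivial 1 mSv (0.001 Sv) dose spread over a million people:
print(lnt_excess_cancers(0.001, 1_000_000))        # LNT predicts ~50 excess cancers
print(threshold_excess_cancers(0.001, 1_000_000))  # the threshold model predicts 0.0
```

Double the dose and the LNT prediction doubles, all the way down to zero dose – which is exactly the extrapolation UNSCEAR now recommends against.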


In the end, if we don’t reorient ourselves on what is true about radiation and not on the fear, we will fail the citizens of Japan, Belarus and the Ukraine, and we will continue to spend time and money on the wrong things…

That’s just Jim’s summary – please read his complete essay for the charts, tables and implications for Japan. And did Muller Lie? The evidence seems pretty conclusive that all this enormous waste of resources was based on a lie. Not to mention the fear, and in the case of Fukushima at least a thousand unnecessary deaths due to the panic and mismanagement of the evacuation.


[1] While testing links, I found that Mark’s HPS link fails – that’s the Internet. Here’s the most recent HPS position statement I could find this morning: Radiation Risk In Perspective: Position Statement Of The Health Physics Society (updated 2010)

In accordance with current knowledge of radiation health risks, the Health Physics Society recommends against quantitative estimation of health risks below an individual dose1 of 50 millisievert (mSv) in one year or a lifetime dose of 100 mSv above that received from natural sources. Doses from natural background radiation in the United States average about 3 mSv per year. A dose of 50 mSv will be accumulated in the first 17 years of life and 0.25 Sv in a lifetime of 80 years. Estimation of health risk associated with radiation doses that are of similar magnitude as those received from natural sources should be strictly qualitative and encompass a range of hypothetical health outcomes, including the possibility of no adverse health effects at such low levels.

There is substantial and convincing scientific evidence for health risks following high-dose exposures. However, below 50– 100 mSv (which includes occupational and environmental exposures), risks of health effects are either too small to be observed or are nonexistent.
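The HPS arithmetic checks out: at about 3 mSv per year of natural background, the quoted accumulation figures follow directly:

```python
background_msv_per_year = 3  # average U.S. natural background dose (HPS figure)

dose_by_age_17_msv = background_msv_per_year * 17       # first 17 years of life
lifetime_dose_sv = background_msv_per_year * 80 / 1000  # 80-year lifetime, in Sv

print(dose_by_age_17_msv)  # 51 -- roughly the 50 mSv figure
print(lifetime_dose_sv)    # 0.24 -- roughly the 0.25 Sv figure
```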

[2] Environmentalist Stewart Brand on the retirement of LNT.

[3] Report of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) Fifty-ninth session (21-25 May 2012) [PDF]. 

[4] EPA’s decision to allow risk-based decisions to guide responses to radiological events

Kerry Emanuel: Reddit AMA on climate change and severe weather

I’m Kerry Emanuel, a Professor of Atmospheric Science at the Massachusetts Institute of Technology in Cambridge, Massachusetts. I do research on hurricanes and other types of severe weather, on climate change, and how climate change might affect severe weather. My research is mostly theoretical, but I also build computer models and occasionally participate in field experiments and build and use laboratory experiments. I have flown research aircraft into hurricanes, and wrote a book called “Divine Wind: The History and Science of Hurricanes”, aimed at a general reader and covering both the science of hurricanes and how they have influenced history, art, and literature.

We discovered this conversation after it was concluded. Kerry Emanuel is one of the four leading scientists who wrote this open letter: ‘To Those Influencing Environmental Policy But Opposed to Nuclear Power’. He also wrote a short book for the informed layman called “What We Know about Climate Change” (recommended as an efficient and readable overview of the science)

Here I have cherry-picked a few of Dr. Emanuel’s answers. The questions are my paraphrasing, as the information is largely in his replies:

Q: …claims civilization as we know it will end with that 4°C

A: In my view, the only really good way to look at this is to view it as a problem of risk. By its very definition, risk is probabilistic. The consensus view of global temperature increase over the next century is a curve with a peak in the 2-4 C range, but a non-trivial tail at higher temperatures. The most probable outcome (at least on the 100 year time scale) has risks that are probably manageable, but as Marty Weitzman at Harvard has pointed out, we need to pay attention to the tail of the risk distribution, because the economic and societal risks can be very large there. Scientists by nature are conservative and do not like to talk about what might happen in the tail, but we do need to think carefully about tail risk as part of our overall assessment of the risk.

Q: …increasing hurricane risks…

A: In my view, at the moment, we in the U.S. deal so poorly with existing hurricane risk that climate change considerations take a back seat. We actively subsidize folks to live and build in hurricane prone regions, and we bail them out massively when disaster strikes. The subsidies come in the form of state-mandated caps on insurance premiums, cheap federal flood insurance, and federal disaster relief. We need to solve these problems regardless of whether climate change results in more frequent and/or intense storms. But there are two climate-related issues that we need to consider now: rising sea level (which is already affecting the magnitude of storm surges, which in practice do much of the damage in hurricanes and other coastal storms), and projections that the incidence of very intense hurricanes should increase in the 100-year time scale. These considerations may, for example, enter into calculations of how high and how strongly we need to build sea walls in certain places.

Q: …are weather forecasts improving? 

A: Weather forecasts have demonstrably improved over the past half-century or so, but as Lorenz demonstrated, there is a fundamental limit to how far out one can make a forecast. (We think this fundamental limit is at about 2 weeks.) But faster computers have allowed us to do something we could not do just 20 years ago or so, that is quantify the uncertainty in each individual forecast. This is done by running ensembles of computer models, or ensembles within just one model but starting from slightly different, but equally plausible, initial states. These slight differences in models or initial conditions typically amplify with time, but do so at different rates at different times and places. The divergence yields a measure of uncertainty.
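The ensemble idea Dr. Emanuel describes can be illustrated with Lorenz’s own toy system: integrate two nearly identical initial states and watch them diverge. This is only a sketch (forward Euler with the classic Lorenz-63 parameters), not a forecast model:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(state, steps=2000):
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

# Two ensemble members: identical except for a one-in-a-million perturbation
a = run((1.0, 1.0, 1.0))
b = run((1.0, 1.0, 1.000001))
spread = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(spread)  # the tiny initial difference has amplified enormously
```

In a real ensemble system dozens of such perturbed runs are averaged, and the rate at which they spread apart gives the per-forecast uncertainty Emanuel mentions.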

Q: …aren’t there more hurricanes due to global warming?

A: We do see some signals in open-ocean hurricane statistics, but since only about 1 in 3 Atlantic hurricanes makes landfall in the U.S., and these do damage over a tiny fraction of their lifetimes, the record of landfalling storms is too short to see any climate signals, save perhaps for El Nino-related signals. We do not expect to see a global warming signal in U.S. hurricane damage for some decades. [highlighted because this agrees with Roger Pielke Jr.’s analysis of US damage data. Ed.]

Second, there is some indication that hurricanes (and cloud clusters in general) dry out the atmosphere, and this could have climate impacts. But this is very early, tentative work.

It is very hard to attribute individual events, or even groups of events, to climate change. This is simply a matter of statistics. We usually need long records to detect climate signals. There are also natural, long-period fluctuations of the North Atlantic climate that modulate rainfall in places like England.

Q: How drastic do you predict climate change to affect the United States in the next 20, 50 and 100 years?

A: I think we have to avoid the idea of a prediction. We know enough about climate risk to assert that the level of risk is enough to be a serious issue, more so as time goes by.

Almost all studies that I am aware of show differences in hurricane response to climate change, among the various ocean basins where hurricanes occur. But there is almost no agreement in the magnitude or even the sign of these differences.

Q: What are the chances that Earth will enter a new Ice Age in the coming decades?

A: Nil. But the chances in 30,000 years are excellent!

Q: My question is, what is the most interesting cause and effect relationship you learned about in the course of your research, where it turned out that seemingly disparate things were actually closely related?

A: For me, the most exciting and robust finding of climate research to date is the determination of the ultimate cause of the great glacial cycles of the last 3 million years or so. There is now very strong evidence that the root cause of these cycles lies in periodic variations in the earth’s rotation axis and orbit around the sun. Such cycles obey very precise mathematical relations, and we can see these signals in ice core and deep sea sediment records.

Q: …you’ve surely run into people who think climate change is a “hoax” or people who are just misinformed … What essays or books would you recommend?

A: All I can say to this is that I try to get people to look at this as a problem of risk. But most risk problems we are used to dealing with (e.g. the risk that our house might burn down) confront problems that may develop in our own lifetimes. We are less used to thinking about risk to future generations. We have to intelligently weigh climate risks (and possible benefits) against the risks (and possible benefits) of any actions we might contemplate to deal with climate change. We have to get away from binary thinking… climate change will be either an apocalypse or nothing to worry about; solutions will either be a complete panacea or not work at all. I do think this is actually the way most people think about the problem of climate change. As usual, the extreme elements are the noisiest, though….

Q: where would you say you have seen the most change in YOUR views on climate change as more evidence has stacked up?

A: Back in the 1980s, I did not feel there was enough evidence to warrant much concern about climate change. But great advances in paleoclimate, analysis of in-situ and satellite observations, my own acquisition of some basic understanding of climate physics and, yes, climate models have all added up to very compelling evidence that we are changing climate and engendering serious risks in doing so.

Q: Does the data you are seeing suggest that everything..

A: It is a great human temptation to attribute just about everything to the cause-du-jour. I remember when, in the 1980s, everything under the sun was blamed on El Nino. But we have to stand back, fight that temptation, and look at the data. This says that precipitation extremes are likely to increase (and there is some evidence that they have, in some places), and that heat waves will become more common and cold waves less so. We think hurricanes might become more intense, but we do not know much about how many other phenomena — such as tornadoes and hailstorms — might be affected by climate change.

Q: To what extent was the severity of Hurricane Katrina affected by AGW? 

A: It is virtually impossible to attribute any one event in a chaotic system to any particular cause. We can say that had that exact same storm followed that exact same track, with exactly the same environmental winds but through the thermodynamic environment of the 1980s, its winds would have been perhaps 20 MPH less. But that is a very restricted statement.

Unfortunately, like energy policy, climate policy depends upon the ability to understand long term risk – to evaluate and choose amongst imperfect options. That’s just the way it is.

The REAL reason some people hate nuclear energy

I heard Carl Sagan argue today (in a Science Friday archival interview from May 1996) that entrenched power is not motivated to encourage critical thinking in the population. I’m afraid Dr. Sagan hit the bull’s-eye on that one – the political logic is obvious.

Today I also read Martin Nicholson’s new and important article at BraveNewClimate on human misperception of risk. Martin’s essay is based on David Ropeik’s essential book How Risky Is It, Really? Human evolution did not prepare us at all for a world where we must make choices amongst imperfect alternatives that have complex future consequences. Evolution did not select for skill at making decisions with century-time-scale impacts. Nor for choosing between alternative risk-benefit pairings. The beginning of Martin’s concluding section makes this clear:

Closing the Perception Gap

Making policy decisions based on fears rather than facts can lead to decisions that feel good (e.g. no nuclear) but increase the overall risk to the population (more deaths and health risks from burning fossil fuels and climate risks from greenhouse gas emissions).

Ropeik tells us that risk perception is an intrinsic, biologically rooted, inescapable part of how the human animal behaves. We need to accept this and use what we know about the way humans respond to risk in order to help ourselves make better, healthier choices. We need to bring the risk perception factors out of the subconscious shadows and use them as practical tools to allow our rational thinking to have more influence in the process.

We need to keep an open mind and give ourselves time to get more information from neutral and reliable sources – those that have no obvious bias. We need to consider all components of our response to the risk – not just the facts. We need to consider the pros and cons of various risk-management options. Why not factor feelings and values into the equation instead of trying to factor them out? Think about which policies will do us the most good.

Poor risk communication from government or agencies that are supposed to protect us like the International Atomic Energy Agency (IAEA) or the World Health Organization (WHO) can sometimes fail to account for people’s risk perceptions. This was a key factor in the long-term social/psychological/economic consequences of Chernobyl. A similar situation may have occurred at Fukushima.

Unfortunately, “feel good” is the most salient feature of politically successful policies. How does this connect to Carl Sagan’s argument? Only our critical thinking skills can save us from “feel good”. One thing we know for sure is that it is rare in western education systems to see critical thinking encouraged. Read Martin’s essay, you’ll be glad you did.

Fukushima, radiation and risk: what is scary and what is not

Thanks to Randall XKCD http://what-if.xkcd.com/29/

The purpose of this post is to communicate why the more you know about radiation the less you worry about nuclear radiation – even the consequences of the terrible accident at Fukushima Daiichi.

To get your skeptical circuits warmed up, let's begin with the above graphic, an excerpt from Randall Munroe's What-If XKCD where Randall “answers your hypothetical questions with physics, every Tuesday”.

What if I took a swim in a typical spent nuclear fuel pool? Would I need to dive to actually experience a fatal amount of radiation? How long could I stay safely at the surface?

Randall's exploration of the question is a useful introduction to how to think about risk and radiation dose – in relation to intensity, exposure time and mediation medium (water in this example). Randall begins:

Assuming you’re a reasonably good swimmer, you could probably survive treading water anywhere from 10 to 40 hours. At that point, you would black out from fatigue and drown. This is also true for a pool without nuclear fuel in the bottom.

After you've enjoyed “Spent Fuel Pool”, I recommend Randall's Radiation Dose Chart, which has become a frequently-cited resource for an introduction to radiation dose and risk. The chart is useful for an overview of relative magnitudes. In addition to Randall's chart I recommend that you download for your archive Natural Radioactivity, published by the physics department of Idaho State University. That is “ground truth” on the details of background radiation in the oceans, or land – lots of numbers and units.

With that gentle introduction I hope you are ready to read some resources that go into Fukushima monitoring in a bit more detail. Are you worried about contamination from the Fukushima Daiichi reactors? E.g., turning the Pacific Ocean into a place too dangerous to swim? Too dangerous to eat the Blue Fin Tuna?

First you will find your hard data at Monitoring environmental radiation Nuclear Regulation Authority (NRA), Japan. In particular, you can find the weekly Sea Area Monitoring reports. As I write the latest report is for 10 December, 2013 (PDF).

To make sense out of all the Becquerels/Litre in the NRA tabulations I recommend Putting Fukushima in Perspective: A primer on radioactivity in the Ocean written by University of Victoria marine chemist Jay T. Cullen (@JayTCullen). Dr. Cullen is investing his personal time in science communication to inform the public about the real risks associated with contamination from the Fukushima site. From his primer article:

Talk of plumes of radioactivity being broadcast across the Pacific must take into account that the background radioactivity of seawater is about 14 Bq/L. It is important to note that although one can detect isotopes from the reactor in the environment, the absolute levels are very low and will be lower as the ocean mixes and the isotope decays.

Dr. Cullen is using 14 Bq/L as the global ocean radioactivity – what does that mean? One becquerel is the quantity of a radioactive material that undergoes one transformation (decay) per second. So the unit Bq/L tells us the concentration of radioactive elements: each litre of typical sea water decays at 14 transformations per second, which an ideal Geiger counter would register as 14 counts per second (cps). We don't know what the material is, but we know empirically (by swimming in the stuff, eating the tuna, etc.) that 14 Bq/L is perfectly safe – even if we don't know exactly what the number means.
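The unit itself involves no mystery; converting 14 Bq/L into decays over time is one multiplication:

```python
seawater_bq_per_litre = 14  # typical background radioactivity of sea water

decays_per_second = seawater_bq_per_litre  # for a single litre: 14 decays/s
decays_per_day = decays_per_second * 60 * 60 * 24
print(decays_per_day)  # 1209600 decays per day in one litre -- and still perfectly safe
```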

Click the thumbnail for full size graphic

So let's examine some of the extensive NRA monitoring, which publishes weekly sampling results from sites immediately around the Fukushima Daiichi breakwater, out to open ocean. The thumbnail to the left shows the worst/highest sample values for Cs-134 and Cs-137 that I could find in the open sea zone (full size).

In the next table I have compared the worst samples to typical ocean background radiation. What we see is that dilution and decay of the cesium isotopes has already reduced the radiation to levels that are insignificant in relation to normal. That indicates that US Pacific coast residents do not need to be alarmed.


Some like to use the radioactivity of a banana to make these units more familiar. A typical banana emits about 15 Bq due to the potassium isotope K-40. So radiation-wise eating a banana is similar to drinking a litre of typical ocean, ignoring retention rates. If you are comfortable with bananas and seawater, but are still concerned about the Fukushima contribution, think of it this way. Equivalent to eating that banana, you would have to drink between 3 and 6 cubic meters of pure water contaminated with the measured concentrations of Fukushima cesium. I think I prefer to get my radiation dose from the banana, but I appreciate they are equivalent.
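The banana comparison follows directly from the numbers. Assuming ~15 Bq per banana and worst-case open-sea cesium concentrations on the order of 0.0025-0.005 Bq/L (illustrative values consistent with the 3-6 cubic meter range above):

```python
banana_bq = 15  # activity of a typical banana, from the potassium isotope K-40

# Worst-case open-sea Fukushima cesium concentrations (illustrative, in Bq/L)
for cs_bq_per_litre in (0.005, 0.0025):
    cubic_meters = banana_bq / cs_bq_per_litre / 1000  # 1 cubic meter = 1000 L
    print(f"{cubic_meters:.0f} cubic meters of seawater per banana-equivalent")
```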

But what about concentration of the insignificant levels by fish and mollusks into dangerous levels if consumed? Good question. I asked the same question, which led me back to Dr. Cullen again for the analysis of that issue, titled What Controls Levels of Fukushima Radioisotopes in Marine Organisms?

Scientists normally report the amount of a radioactive element in an organism in units of concentration where the mass or activity of the radionuclide is given relative to the weight of the organism or its tissue. The units of these measurements are, therefore, either kilogram (kg) or activity in Becquerel (Bq = disintegrations per second) divided by the mass of the organism or tissue (kg/kg or Bq/kg). We want to understand how much radionuclide ends up in the organism relative to the isotope's concentration in seawater, which can be reported in either kg per liter of seawater or Bq per liter of seawater (kg/L or Bq/L). By determining the ratio of the concentration of a radionuclide in an organism to the concentration of the isotope in seawater we define the Concentration Factor (CF), which has units of L/kg:


So if the CF for an element in a given organism is a very high number then that radioisotope tends to bioaccumulate and is found at higher concentrations in the organism than in the surrounding marine environment. Conversely, if the CF is low there is little risk of bioaccumulation in the organism.
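Dr. Cullen's Concentration Factor is just a ratio; a sketch with hypothetical numbers:

```python
def concentration_factor(organism_bq_per_kg, seawater_bq_per_litre):
    """CF (L/kg) = concentration in the organism (Bq/kg) / concentration in seawater (Bq/L)."""
    return organism_bq_per_kg / seawater_bq_per_litre

# Hypothetical numbers: fish tissue at 5 Bq/kg in seawater at 0.5 Bq/L
cf = concentration_factor(5.0, 0.5)
print(cf)  # 10.0 -- the isotope is 10x more concentrated in the fish than in the water
```

A high CF flags an isotope that bioaccumulates; a low CF means the organism stays close to the concentration of the surrounding water.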

So what is the bottom line on seafood?

What can we expect on the west coast of North America?

Beginning in the new year we can expect seawater affected by the Fukushima disaster to arrive at our coast in the Pacific northwest. Peak concentrations in the heart of the plume of affected seawater are expected to be on the order of 0.001 to 0.020 Bq/L based on measurements and physical models of ocean circulation. The much lower radionuclide concentrations are the result of mixing and the decay of shorter lived isotopes. Given known CFs for marine organisms these seawater concentrations will result in much lower concentrations of radionuclides in organisms residing on the west coast compared to their Japanese cousins. The radioactive dose to these organisms or consumers of these organisms will be dominated by the naturally occurring radionuclide Po-210.

A confirming evaluation of the food chain question was published in the June 25, 2013 issue of the Proceedings of the National Academy of Sciences Evaluation of radiation doses and associated risk from the Fukushima nuclear accident to marine biota and human consumers of seafood [open access]. Excerpt from the abstract:

To link the radioactivity to possible health impairments, we calculated doses, attributable to the Fukushima-derived and the naturally occurring radionuclides, to both the marine biota and human fish consumers. We showed that doses in all cases were dominated by the naturally occurring alpha-emitter 210Po and that Fukushima-derived doses were three to four orders of magnitude below 210Po-derived doses. Doses to marine biota were about two orders of magnitude below the lowest benchmark protection level proposed for ecosystems (10 µGy⋅h−1).

My bottom line is — if you wish to monitor for any dangers developing when Fukushima seaborne contamination reaches California, then I suggest you subscribe to Dr. Cullen's blog MarineChemist. That's what we do (we subscribe to his RSS feed). If there is anything to worry about then Dr. Cullen will let you know. Or you can just subscribe to Seekerblog!

I promised to also discuss “what is scary?” My answer is the post-antibiotic world where antibiotics don't work any more. That is really, really scary, especially if you are a geezer like me. Climate change is very scary – but antibiotic resistance is spreading as I write. The big hurts from climate change will probably be after-death experiences for me.

Japanese Fisheries Agency samples fish for contamination: most OK “even in the sea near Fukushima”

Source Japan Times

The officials from the Fisheries Agency stressed that the monitoring results show that the impact of the nuclear crisis on fish is now subtle even in the sea near Fukushima.

Results from the cesium density tests in the first three months after the meltdown catastrophe started in March 2011 showed that 53 percent of fish caught around Fukushima exceeded the legal limit of 100 becquerels per kilogram, but now only 2.2 percent of fish caught top this threshold. Regardless, fish caught within 20 km of Fukushima No. 1 are not shipped to market.

As for fish caught far from Fukushima, more than 14,000 samples have been tested in the past year and only 88 exceeded the legal cesium safety limit of 100 becquerels per kilogram.
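Those counts are worth reducing to a percentage: 88 over-limit samples out of more than 14,000 is well under one percent:

```python
samples_tested = 14000  # fish samples tested far from Fukushima in the past year
over_limit = 88         # samples exceeding the 100 Bq/kg legal cesium limit

fraction = f"{100 * over_limit / samples_tested:.1f}%"
print(fraction)  # 0.6%
```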

Fukushima water leaks:“This is healthwise a big nothing”

Lake Barrett, a former head of the Department of Energy’s Office of Civilian Nuclear Waste Management, spent nearly a decade at the U.S. Nuclear Regulatory Commission and led the clean-up operations after the 1979 partial meltdown at the Three Mile Island nuclear plant. He has been brought in by Tokyo Electric Power (Tepco) to advise it on the lengthy decommissioning process at Fukushima.

He said work should begin now to pump groundwater from the plant before it reaches wrecked reactors – a measure that has been stalled by local opposition.

“They should start pumping as soon as practical,” said Barrett, adding that groundwater would have to be released into the sea along with water that had been treated to remove most radiation – by a system designed by Toshiba Corp.

“I believe in a matter of a few months … early next year … water will be cleaned up and be ready to be discharged,” he said in an interview.

But Barrett, who has said he would feed his grandchildren fish caught off the Fukushima coast if the clean-up proceeds as planned, said Tepco has lost its credibility to reassure a jittery public. “When Tepco says: ‘trust me, this water is safe,’ that’s not enough,” he said.

(…) He said concerns raised by South Korea and China over the continued leaks of radioactive water at Fukushima were “political posturing.”

“This is healthwise a big nothing,” he said.



Do we need to worry about Fukushima contamination in the ocean? (part 1)

In a word, no – though it isn't a good idea to eat bottom-feeding fish caught within a few kilometers of the Daiichi harbor. And if you made your living fishing in the ocean right around Daiichi, your livelihood has been destroyed until the cleanup is completed. While there are serious threats that deserve our intense focus, Fukushima is not anywhere on my list, which starts with antibiotic resistance, energy poverty, and climate change. But turn on a TV anywhere and you will soon see newsreaders talking about radiation leaking from Fukushima Daiichi into the Pacific Ocean. If any numbers are mentioned, they will be Very Big Numbers voiced to make it clear these are unbelievably scary.

On the other hand, talk to any scientist familiar with radiation health physics: they will be unconcerned, but monitoring. Why is it that the level of fear is inversely proportional to understanding? In brief, it is because with understanding comes the appreciation that life is adapted to the levels of ionizing radiation common around the planet. Those background levels vary by more than an order of magnitude, and surprisingly, residents of the areas with the highest background radiation do not have elevated rates of cancer. So radiation is not scary unless the dose exceeds the tolerance of our DNA repair systems. To put the numbers and units in an easy-to-grasp frame, please spend some time absorbing the brilliant relative radiation chart developed by XKCD. For reference, keep in mind an annual dose limit of roughly 50 mSv (here is some background on exposure limits at the Health Physics Society).

Since the current focus of fear is Fukushima I've gathered a few science resources that I hope will help the reader lose at least those particular fears. First we have scientist Ken Buesseler, with Woods Hole Oceanographic Institution. Ken maintains a Woods Hole website FAQ: Radiation from Fukushima. Ken's most recent update is 28 August:

On March 11, 2011, a magnitude 9.0 earthquake—one of the largest ever recorded—occurred 80 miles off the coast of Japan. The earthquake created a series of tsunamis, the largest estimated to be over 30 feet, that swept ashore. In addition to the tragic human toll of dead, injured, and displaced, the earthquake and tsunamis badly damaged the Fukushima Daiichi nuclear power plant, eventually causing four of the six reactors there to release radiation into the atmosphere and ocean.

Since mid-2011, I have worked with Japanese colleagues and scientists around the world to understand the scope and impact of events that continue to unfold today. In June 2011, I organized the first comprehensive, international expedition to study the spread of radionuclides from Fukushima into the Pacific, and I or members of my lab have participated in several other cruises and analyzed dozens of samples of water, sediment, and biota. In addition, I began my career in oceanography by studying the spread of radionuclides from Chernobyl in the Black Sea. These are a few of the most common questions that people have been asking me lately.

-Ken Buesseler, Woods Hole Oceanographic Institution.

What is the state of fisheries off Japan and along U.S. West Coast?

The coastal fisheries remain closed in Japan near Fukushima, where there is concern for some species, especially the bottom-dwelling ones; these are being tested, and many have been found to be above the Japanese government's strict limits for cesium in seafood. These contaminated fish are not being sold internally in Japan or exported. Because of the dilution that occurs even a short distance from Fukushima, we do not have a concern about the levels of cesium and other radionuclides in fish off the West Coast of the U.S.

More about the state of Japanese fisheries (pdf).

Are fish such as tuna that might have been exposed to radiation from Fukushima safe to eat?

Seawater everywhere contains many naturally occurring radionuclides, the most common being polonium-210. As a result, fish caught in the Pacific and elsewhere already have measurable quantities of these substances. Most fish do not migrate far from home, which is why fisheries off Fukushima remain closed. But some species, such as the Pacific bluefin tuna, can swim long distances and could pick up cesium in their feeding grounds off Japan. However, cesium is a salt taken up by the flesh, and it begins to flush out of an exposed fish soon after the fish enters waters less affected by Fukushima. By the time tuna are caught in the eastern Pacific, cesium levels in their flesh are 10-20 times lower than when they were off Fukushima. Moreover, the dose from Fukushima cesium is considered insignificant relative to the dose from naturally occurring polonium-210, which was 1,000 times higher in fish samples studied, and both of these are much lower than other, more common sources, such as dental x-rays.

More about the dose and associated risk (pdf) of radiation from Fukushima to marine life and humans.
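The ratios quoted above can be combined into a rough relative-dose sketch. All values below are normalized or assumed for illustration (a midpoint of 15 for the stated 10-20x flush-out), not measured doses:

```python
# Rough relative-dose comparison using only the ratios quoted above.
# Absolute values are normalized (arbitrary units); only the ratios matter.

cs_dose_near_japan = 1.0      # normalized cesium dose off Fukushima
flush_factor = 15             # stated 10-20x reduction; midpoint assumed
cs_dose_eastern_pacific = cs_dose_near_japan / flush_factor

# Natural polonium-210 dose was measured ~1000x the Fukushima cesium dose.
po210_dose = 1000 * cs_dose_eastern_pacific

print(po210_dose / cs_dose_eastern_pacific)  # roughly 1000: the natural dose dominates
```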

(…) Is radiation exposure still a concern?

I stood on a ship two miles from the Fukushima reactors in June 2011 and as recently as May 2013, and it was safe to be there (I carry radiation detectors with me) and collect samples of all kinds (water, sediment, biota). Although radioactive isotopes in the samples and on the ship were measurable back in our lab, levels were low enough that it was safe to handle samples without any precautions. In fact, our biggest problem is filtering out natural radionuclides in our samples so we can measure the trace levels of cesium and other radionuclides that we know came from Fukushima.

Where does radiation from Fukushima go once it enters the ocean? The spread of cesium once it enters the ocean can be understood by the analogy of mixing cream into coffee. At first, the two are separate and distinguishable, but as we start to stir, the cream forms long, narrow filaments or streaks. These streaks become longer and narrower as they move offshore, where diffusive processes begin to homogenize and dilute the radionuclides. In the ocean, diffusion is helped along by ocean eddies, squirts, and jets that broaden, mix, and continue to dilute the cesium as it travels across the ocean. With distance and time, radionuclide concentrations become much lower in the ocean, something that our measurements confirm.

More information about our oceanographic studies off Fukushima (pdf).

Are the continued sources of radiation from the nuclear power plants of concern?

The site of the Fukushima Dai-ichi nuclear power plant is an ongoing source of radionuclides (pdf) into the ocean, something I've seen evidence of in my data and published about since 2011. Although the numbers sound large (300,000 gallons of water leaked, or 20 trillion becquerels), we calculated in 2011, when radiation levels were much higher than today, that the dose to someone on a ship or in the ocean was not of concern. For the workers at the site, direct exposure from leaking storage tanks is of greater health concern because exposure from these concentrated sources is much higher. For the general public, it is not our direct exposure, but uptake by the food web and, hence, the potential for human consumption of contaminated fish that is the main health concern.

Will radiation be of concern along U.S. and Canadian coasts? Levels of any Fukushima contaminants in the ocean will be many thousands of times lower after they mix across the Pacific and arrive on the West Coast of North America some time in late 2013 or 2014. This is not to say that we should not be concerned about additional sources of radioactivity in the ocean above the natural sources, but at the levels expected even short distances from Japan, the Pacific will be safe for boating, swimming, etc.

Is debris washing ashore on the US/Canadian West Coast of concern? Debris washed out to sea by the tsunami does not carry Fukushima radioactive contamination; I've measured several samples in my lab. It does, however, carry invasive species, which will be of serious concern to coastal ecosystems on the West Coast.

Have there been increased deaths as a result of radiation from Fukushima?

Reports of increased deaths are simply not true. Read this reasoned response in Scientific American to the most often-cited “scientific” paper erroneously linking deaths to radiation from Fukushima. That article ends “This is not to say that the radiation from Fukushima is not dangerous (it is), nor that we shouldn't closely monitor its potential to spread (we should).” I agree with that statement.

Where can people go for reliable information?

Here are some other links I have passed to others. Fukushima's Radioactive Water Leak: What You Should Know http://news.nationalgeographic.com/news/energy/2013/08/130807-fukushima-radioactive-water-leak/

Latest Radioactive Leak at Fukushima: How Is It Different? http://news.nationalgeographic.com/news/energy/2013/08/130821-fukushima-latest-leak-how-is-it-different/

See also following article from the Woods Hole Oceanographic Institution (w/ links to many others) http://www.whoi.edu/oceanus/viewArticle.do?id=167749&sectionid=1000 From the special issue of Oceanus Magazine devoted to the cause and impacts of Fukushima: http://www.whoi.edu/oceanus/series/fukushima

Consider supporting our new Center for Marine and Environmental Radioactivity and check out CMER public education links, such as ABCs of radioactivity http://www.whoi.edu/page.do?pid=119836

Last updated: August 28, 2013

I'm working on a followup post that is intended to provide a reference set of resources to help readers get comfortable with radiation and risk.


Risk Literacy

Risk assessment

More from John Brockman’s Edge question “What scientific concept would improve everybody’s cognitive toolkit?” by Gerd Gigerenzer, Psychologist; Director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin; Author, Gut Feelings

Literacy — the ability to read and write — is the precondition for an informed citizenship in a participatory democracy. But knowing how to read and write is no longer enough. The breakneck speed of technological innovation has made risk literacy as indispensable in the 21st century as reading and writing were in the 20th century. Risk literacy is the ability to deal with uncertainties in an informed way.

Without it, people jeopardize their health and money and can be manipulated into experiencing unwarranted, even damaging hopes and fears. Yet when considering how to deal with modern threats, policy makers rarely ever invoke the concept of risk literacy in the general public. To reduce the chances of another financial crisis, proposals called for stricter laws, smaller banks, reduced bonuses, lower leverage ratios, less short-termism, and other measures.

But one crucial idea was missing: helping the public better understand financial risk. For instance, many of the “NINJAs” (no income, no job, no assets) who lost everything but the shirts on their backs in the subprime crisis didn’t realize that their mortgages were variable, not fixed-rate. Another serious problem that risk literacy can help solve is the exploding cost of health care. Tax hikes or rationed care are often presented as the only viable alternatives. Yet by promoting health literacy in patients, better care can be had for less money.

For instance, many parents are unaware that one million U.S. children have unnecessary CT scans annually and that a full body scan can deliver one thousand times the radiation dose of a mammogram, resulting in an estimated 29,000 cancers per year.

I believe that the answer to modern crises is not simply more laws, more bureaucracy, or more money, but, first and foremost, more citizens who are risk literate. This can be achieved by cultivating statistical thinking.

Simply stated, statistical thinking is the ability to understand and critically evaluate uncertainties and risks. Yet 76 percent of U.S. adults and 54 percent of Germans do not know how to express a 1 in 1,000 chance as a percentage (0.1%). Schools spend most of their time teaching children the mathematics of certainty — geometry, trigonometry — and spend little if any time on the mathematics of uncertainty. If taught at all, it is mostly in the form of coin and dice problems that tend to bore young students to death. But statistical thinking could be taught as the art of real-world problem solving: the risks of drinking, AIDS, pregnancy, horseback riding, and other dangerous things. Of all mathematical disciplines, statistical thinking connects most directly to a teenager’s world.

Even at the university level, law and medical students are rarely taught statistical thinking — even though they are pursuing professions whose very nature it is to deal with matters of uncertainty. U.S. judges and lawyers have been confused by DNA statistics and fallen prey to the prosecutor’s fallacy; their British colleagues drew incorrect conclusions about the probability of recurring sudden infant death. Many doctors worldwide misunderstand the likelihood that a patient has cancer after a positive screening test and can’t critically evaluate new evidence presented in medical journals. Experts without risk literacy skills are part of the problem rather than the solution.
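Gigerenzer's "1 in 1,000" literacy question is a one-line conversion; a minimal sketch (the helper name is mine):

```python
# The "1 in 1,000" literacy question, as a one-line conversion.

def odds_to_percent(n, d):
    """Express an n-in-d chance as a percentage."""
    return 100.0 * n / d

print(odds_to_percent(1, 1000))  # 0.1, i.e. 0.1%
```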


Shifting Baseline Syndrome

More from John Brockman's Edge question “What scientific concept would improve everybody's cognitive toolkit?” by Paul Kedrosky, Editor, Infectious Greed; Senior Fellow, Kauffman Foundation

When John Cabot came to the Grand Banks off Newfoundland in 1497 he was astonished at what he saw. Fish, so many fish — fish in numbers he could hardly comprehend. According to Farley Mowat, Cabot wrote that the waters were so “swarming with fish [that they] could be taken not only with a net but in baskets let down and [weighted] with a stone.”

The fisheries boomed for five hundred years, but by 1992 it was all over. The Grand Banks cod fishery was destroyed, and the Canadian government was forced to close it entirely, putting 30,000 fishers out of work. It has never recovered.

What went wrong? Many things, from factory fishing to inadequate oversight, but much of it was aided and abetted by treating each step toward disaster as normal. The entire path, from plenitude to collapse, was taken as the status quo, right up until the fishery was essentially wiped out.

In 1995 fisheries scientist Daniel Pauly coined a phrase for this troubling ecological obliviousness — he called it “shifting baseline syndrome”. Here is how Pauly first described the syndrome: “Each generation of fisheries scientist accepts as baseline the stock situation that occurred at the beginning of their careers, and uses this to evaluate changes. When the next generation starts its career, the stocks have further declined, but it is the stocks at that time that serve as a new baseline. The result obviously is a gradual shift of the baseline, a gradual accommodation of the creeping disappearance of resource species…”

It is blindness, stupidity, intergenerational data obliviousness. Most scientific disciplines have long timelines of data, but many ecological disciplines don't. We are forced to rely on second-hand and anecdotal information — we don't have enough data to know what is normal, so we convince ourselves that this is normal.

But it often isn't normal. Instead, it is a steadily and insidiously shifting baseline. (…snip…)

When you understand shifting baseline syndrome it forces you to continually ask what is normal. Is this? Was that? And, at least as importantly, it asks how we “know” that it's normal. Because, if it isn't, we need to stop shifting the baselines and do something about it before it's too late.




ExternE: Comparing Nuclear Health and Environmental Effects

When an anti-nuclear activist says “No to nuclear power because it isn’t safe,” I ask “compared to what?” Decisions about energy options always involve comparative risks and benefits, so to make informed choices politicians need to be able to evaluate relative risks and benefits. A staggering amount of research has been done to characterize the risks of nuclear, fossil fuels, bioenergy, hydro, wind and solar. From that research the conclusion I reach is that nuclear power is the safest available option to meet the energy demands of both developed and developing countries. Hydro can be similarly safe, but the hydro opportunities are largely already exploited. We should also keep in mind that in the 1975 to 1985 period some of the biggest man-made energy-related disasters were dam failures: the worst energy-related accident of all was the Banqiao/Shimantan dam failure in China in 1975, which killed some 30,000 people, followed by Machhu II, India (2,500 deaths) and Hirakud, India (1,000 deaths). Source: “Comparing Nuclear Accident Risks with Those from Other Energy Sources” (PDF).

In the following we will move from the severe accidents comparisons to the full life cycle long term health effects.


Diagram 2 (click for full size). Most of the health risk calculations in ExternE, presented as deaths per TWh (electricity). The diagram shows electricity production facilities in all EU states and in Norway

The graphic at left shows the deaths/TWh in all EU states + Norway for fossil fuels, bioenergy, hydro, nuclear and wind. This is from the Swedish study “Economic Analysis of Various Options of Electricity Generation – Taking into Account Health and Environmental Effects” by Nils Starfelt and Carl-Erik Wikdahl. The authors started with the exhaustive EU ExternE-Pol studies, then expressed the health/environmental effects in the readily-understood metric of “deaths per TWh (terawatt-hour)” of electrical generation.

If you examine the chart very carefully you should be able to detect the tiny Nuclear Power data points for Denmark and France.

Mean deaths per TWh for EU states

Diagram 3 (click for full size). Mean values of health effects, presented as deaths/TWh, for the respective forms of electricity generation throughout the EU. These calculations are based on the same data as in Diagram 2.


Some of the more straightforward conclusions that can be drawn from the results shown in Diagrams 2 and 3 are:

1. Coal, lignite and oil result in considerably greater external costs, and thus health effects, than do the other forms of energy. This difference becomes even greater if the greenhouse effect is also included in the results: see Diagram 7.

2. The external costs of hydro power and nuclear power are about two orders of magnitude less than those from the above-mentioned fossil fuels.

3. Among the fossil fuels, natural gas has considerably less effect on the environment than do the other forms of energy.

4. The external costs of bioenergy, as shown in the ExternE results, lie close to those for fossil fuels, but it should be noted that, in most cases, the results are based on technology for which there is a considerable potential for improvement.
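Conclusion 2's "two orders of magnitude" claim can be illustrated numerically. The deaths/TWh values below are round placeholder numbers of the magnitude such studies report, not the actual figures behind Diagrams 2 and 3:

```python
# Illustrative check of the "two orders of magnitude" gap (conclusion 2).
# All deaths/TWh values here are assumed placeholders, NOT ExternE's figures.
import math

deaths_per_twh = {
    "coal":    25.0,   # assumed, for illustration only
    "oil":     18.0,   # assumed
    "nuclear":  0.05,  # assumed
    "hydro":    0.10,  # assumed
}

for fossil in ("coal", "oil"):
    ratio = deaths_per_twh[fossil] / deaths_per_twh["nuclear"]
    print(f"{fossil}: {math.log10(ratio):.1f} orders of magnitude above nuclear")
```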

The authors use the ExternE database to model Sweden as an example case. From there they can reason about the impact of closing the Swedish nuclear reactor Barsebäck:

The risk of major effects, and the need for extensive evacuations in the event of an accident at Barsebäck, have dominated the debate in Sweden and Denmark. Using the results presented in this report, it can therefore be of interest to make a comparison between the health risks resulting from a nuclear power station accident and those from normal operating emissions from Danish coal-fired power production.

Closing the Barsebäck reactors will result in a loss of production in Sweden which, during a statistically average climate year, cannot be compensated for by energy savings or by increased production of nuclear power and/or water power in Sweden and Norway. For a number of years into the future, the only possibility is a greater import of electricity produced in coal-fired power plants, mainly in Denmark.

According to data given in ExternE, the increased pollution from operating these coal-fired power stations, which would not have been run had Barsebäck remained in operation, will amount to about 200 deaths per year, most of which will occur in Denmark but a few also in Sweden.
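The ~200 deaths/year figure is consistent with a simple product of lost generation and a coal mortality factor. Both inputs below are my assumptions for illustration (Barsebäck's two roughly 600 MW units at an assumed 75% capacity factor, and an ExternE-magnitude ~25 deaths/TWh for coal), not values taken from the report:

```python
# Back-of-envelope reconstruction of the ~200 deaths/year figure.
# Both inputs are assumptions for illustration, not values from the report.

hours_per_year = 8760
capacity_gw = 2 * 0.6        # two Barsebäck units, ~600 MW each (assumed)
capacity_factor = 0.75       # assumed
lost_twh = capacity_gw * capacity_factor * hours_per_year / 1000  # ~7.9 TWh/yr

coal_deaths_per_twh = 25     # assumed ExternE-magnitude coal mortality

print(round(lost_twh * coal_deaths_per_twh))  # 197, close to the quoted ~200
```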

This conclusion is obvious to students of energy policy, but is never-to-be-discussed in Greenpeace circles. Friends of the Earth, Greenpeace and their ilk are directly responsible for stopping nuclear plant construction, and for the dramatic cost increases driven by activist delaying tactics, and thus the pollution and related deaths from fossil fuel generation that would have been eliminated by expanded nuclear power.

Since we do not know exactly how many GW of nuclear capacity would have been built instead of coal and gas, any tally of the deaths chargeable to the anti-nuclear activists' account is a counterfactual. Still, I am comfortable with my conclusion that their account has accumulated tens of thousands of unnecessary deaths, and that they delayed by nearly half a century the development of mass-manufactured modular nuclear plants.

Lastly, for reference I’ll note the authors’ comment on new capacity costs — I’ve not had time to verify their figures:

The generating cost for new capacity in Sweden has been calculated to be in the range of 2.5 to 3.5 EUcents/kWh for hydro, nuclear and gas and about twice as much for bioenergy and wind. Taxes and subsidies are not included.