Appeals to the climate consensus can give the wrong impression

Image credit: John Cook, http://www.skepticalscience.com, Closing the Consensus Gap on Climate Change

Dr. Will Howard recently published an essay that will appeal to those of you interested in science communication, especially in the challenging and politically charged context of climate change. Dr. Howard makes the extremely important point that “scientific consensus” on climate change reflects strong consilience of evidence. I confess that I had to look up “consilience” to learn that it is indeed the perfect term to capture how we have developed confidence in our understanding of the causal connections between human-generated greenhouse gases and climate change.

In public discourse, if we had chosen “consilience of evidence” to describe the accumulation of research, then perhaps people might have understood more readily that we are not talking about the results of an opinion poll or a negotiated statement (yes, the IPCC Summary for Policymakers [PDF] is a negotiated statement, though I don’t know how else such a summary could be produced).

I thought Will’s essay captured this science communications challenge succinctly, and especially how this strong consilience of evidence is separate from the politics of what to do about it:

“Consensus” is understood differently in science compared to politics or society.

Scientists use this word to refer to consilience of multiple lines of evidence that underlie widespread agreement or support a theory.

In the case of climate change, multiple lines of evidence underpin the prevailing view that the climate system is showing decade-on-decade warming over the past 50 years.

In particular, this warming bears temporal and spatial patterns, or “fingerprints”, that point to human causes.

For example, the stratosphere (that part of the atmosphere higher than about 11 km) has been cooling as the lower atmosphere and the ocean warm. This is the pattern we expect from the addition of greenhouse gases and not from, say, changes in the sun’s output.

But in public and especially political discourse, “consensus” tends to imply majority opinion or concurrence. As consensus in this public context is often arrived at by negotiation, saying there’s a scientific “consensus” may imply to the community that prevailing scientific views represent a negotiated outcome. This is the antithesis of science.

Consensus of the non-scientific kind does have a role to play in the climate debate. This includes negotiating whether warming is a “good” or “bad” thing and what, if anything, we should do about it.

These are not scientific questions. These are issues of values, politics, ethics and economics. As a nation and as a global society we need to reach consensus to resolve those questions and to make and implement appropriate public policy.

I’ve nothing to add to Will’s excellent essay, so I recommend that you go directly to The Conversation to read the original and the comments. Some effort is required to weed through the growing number of comments, so I will highlight a segment of the conversation that focuses on the important question of effective science communication:

John Cook
Climate Communication Research Fellow at University of Queensland

This is an interesting article with many important points. I would be the first person to stress the importance of communicating the many “fingerprints” being observed in our climate (and in fact have created a human fingerprints infographic which I frequently use in public talks http://www.skepticalscience.com/graphics.php?g=32).

However, the article is missing a crucial element to this discussion – what does the evidence tell us about the efficacy of consensus messaging? A number of studies have found that one of the strongest predictors of public support for climate mitigation policies is perception of consensus (i.e., the level of agreement among climate scientists about human-caused global warming). Also, consensus messaging significantly increases acceptance of climate change. A randomised experiment by Stephan Lewandowsky found that informing Australians of the 97% consensus increased their acceptance of human-caused global warming and, intriguingly, the increase was greatest amongst conservatives. In this case, consensus neutralised ideology to some degree.

When people think there is still an ongoing debate about human-caused global warming amongst climate scientists, they’re less likely to accept climate change and support climate action. And given that the Australian public on average think there is 58% agreement among climate scientists, rather than 97%, this misconception has serious societal implications. Science communicators need to take into account that people use expert scientific opinion as a heuristic to inform their views on complex scientific issues.

To underscore this fact, I’ve actually tested the human fingerprints message (linked to above) and the consensus message in a randomised experiment. Consensus messaging significantly outperformed the fingerprints message. The lesson here is that we need to understand how laypeople think about complex scientific issues like climate change.

However, I don’t think there need be that much conflict between what social science is telling us and the views of the OP. A recent paper by Ed Maibach tested various forms of consensus messaging and found that the most effective was a message that emphasised both consensus and the evidence-based nature of the scientific method:

“Based on the evidence, 97% of climate scientists have concluded that human-caused climate change is happening”
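As an aside on the randomised experiments John describes: at their core they compare acceptance levels between message conditions. Here is a minimal sketch of that comparison using a two-proportion z-test; the group sizes and counts are invented for illustration, not data from Cook’s or Lewandowsky’s studies.

```python
# Minimal sketch: compare acceptance between two message conditions in a
# randomised experiment (hypothetical counts, not data from the studies cited).
from math import sqrt, erfc

def two_proportion_z(accept_a, n_a, accept_b, n_b):
    """Two-sided two-proportion z-test for a difference between conditions."""
    p_a, p_b = accept_a / n_a, accept_b / n_b
    p_pool = (accept_a + accept_b) / (n_a + n_b)            # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                        # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical: 130/200 participants accept after a consensus message,
# 105/200 after a fingerprints message.
p_a, p_b, z, p = two_proportion_z(130, 200, 105, 200)
print(f"consensus: {p_a:.2f}  fingerprints: {p_b:.2f}  z = {z:.2f}  p = {p:.4f}")
```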

John Cook
Climate Communication Research Fellow at University of Queensland
In reply to Anna Young

Anna, the problem you raise is exactly why communications like the John Oliver YouTube video embedded in the OP are so powerful. Not only does Oliver communicate the 97% consensus, he also does something equally important – he communicates how people cast doubt on the consensus (in this case, by perpetuating false balance in the media). What Oliver is doing is equipping people with the knowledge and the critical thinking skills so that when they see the mainstream media show a debate between a climate scientist and a tinfoil guy, they can see it for what it is. It’s not only a funny video, it’s brilliant communication. The fact that it’s been viewed millions of times means millions of people have now been “inoculated” against the misinformation of false debate in the mainstream media.

So kudos to Will Howard for embedding the video.

Will Howard
Research scientist at University of Melbourne
In reply to John Cook


Thanks John, for contributing that perspective. The points you raise, I would suggest, may be applicable to many areas of “contested” science, in health, resources (e.g. coal seam gas) and others. 

Whatever is said about the consensus, I do think we need to do a better job of communicating what underpins it. As your co-author Peter Jacobs notes:

“to those suggesting that the consensus message is an appeal to authority that ignores evidence – the consensus exists *because of* the overwhelming physical evidence, which is detailed at length in the scientific literature.”

But I wonder about this: both the consensus and the consilience of evidence (my preferred term) seem to be strengthening, yet public support for policies aimed at mitigating climate change seems not to be.

I note polls suggesting climate change and environmental issues have moved down people’s priorities. Here in Australia, our current government was elected with a major plank in its platform being the removal of the carbon tax. (Whether we agree or disagree with their policy, they ran on that issue and were elected.)

Is this because people are skeptical of the science? Is it just that other issues take on more urgency: jobs, the economy, international conflicts, etc.?

John Cook
Climate Communication Research Fellow at University of Queensland
In reply to Will Howard

I like the term “consilience of evidence” also but when I test-drive it in public talks, it tends to inspire blank looks from the audience. It’s a term that scientists love. Laypeople, not so much. Which is why, again, it’s important that we understand our audience when we do science communication.

Why is public support not changing that much? Public concern about climate change does correlate with economic performance, hence the drop in climate concern after the GFC. Another predictor of public concern about climate change is cues from our political leaders, so you can see why Australia has a problem in that department at the moment. There are certainly a number of factors that influence people’s attitudes to climate.

But as I said above, several recent studies have found perception of scientific agreement on climate change is one of the biggest factors. And given public perception of consensus is very low (I measured it at 58% on a representative Australian sample), this misconception is definitely a significant problem. It’s not the only factor delaying public support for climate action but it’s a big one.

Also, communicating the 97% consensus is a lot easier to understand than explanations of why greenhouse gases in the upper atmosphere are more efficient at radiating longwave radiation to space, hence contributing to the cooling stratosphere. From a communication point of view, consensus is a low lying fruit. This is why consensus messaging outperformed fingerprint messaging in my data.

So communicating the 97% consensus can help with removing one roadblock delaying climate action. It won’t fix everything – it’s not a magic bullet. But ignoring the “consensus gap” only serves to give extra life to that stumbling block.
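An aside on the stratospheric-cooling “fingerprint” that both Will and John mention: the underlying detection logic is simply that trends at different levels of the atmosphere have opposite signs under a greenhouse cause but not under a solar one. The sketch below uses synthetic temperature anomalies, not real observations, purely to show the kind of trend comparison involved.

```python
# Toy illustration of the "fingerprint" logic: added greenhouse gases warm the
# troposphere while cooling the stratosphere; a brighter sun would warm both.
# Synthetic data only -- not observational records.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1965, 2015)

# Synthetic anomalies (deg C): modest tropospheric warming, stratospheric cooling.
tropo = 0.015 * (years - years[0]) + rng.normal(0, 0.10, years.size)
strato = -0.030 * (years - years[0]) + rng.normal(0, 0.20, years.size)

def trend_per_decade(t, series):
    slope = np.polyfit(t, series, 1)[0]   # least-squares slope, deg C per year
    return 10 * slope                     # deg C per decade

print(f"tropospheric trend:  {trend_per_decade(years, tropo):+.2f} C/decade")
print(f"stratospheric trend: {trend_per_decade(years, strato):+.2f} C/decade")
# Opposite signs are the pattern expected from greenhouse gases, not the sun.
```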

I wrote a post a while back, How to break the climate change gridlock, including a conversation with Andrew Dessler, Professor of Atmospheric Sciences at Texas A&M, about how we might more explicitly get each party’s values and economic interests on the negotiating table.

Will Howard has received funding from the Australian Research Council, the Australian Government Department of Climate Change, the Cooperative Research Centres Program, and the Australian Antarctic Science Program.

This article was originally published on The Conversation. Read the original article.

Good advice for science advisors – from the book “Future Directions for Scientific Advice in Whitehall”

I was surprised to find a well-informed essay on science policy – in, of all places, the Guardian. At the end of the article I found out why this is such a good essay:

Geoff Mulgan is chief executive of Nesta, the UK’s innovation foundation. He is on Twitter @geoffmulgan. This article is from the book Future Directions for Scientific Advice in Whitehall (edited by Robert Doubleday and James Wilsdon) which is free to download here from 18 April 2013.

The essay is longish for sound reasons – here are some excerpts from the concluding paragraphs to motivate you to read the complete essay:

(…) Formal scientific knowledge sits alongside these other types of knowledge, but does not automatically trump the others. Indeed, a politician, or civil servant, who acted as if there was a hierarchy of knowledge with science sitting unambiguously at the top, would not last long. The consequence is that a scientist who can mobilise other types of knowledge on his or her side is likely to be more effective than one that cannot; for example, by highlighting the economic cost of future floods and their potential effect on political legitimacy, as well as their probability.

These points help to explain why the role of a chief scientific adviser can be frustrating. Simply putting an eminent scientist into a department may have little effect, if they don’t also know how to work the system, or how to mobilise a large network of contacts. Not surprisingly, many who aren’t well prepared for their roles as brokers, feel that they rattle around without much impact.

For similar reasons, some of the other solutions that have been used to raise the visibility and status of scientific advice have tended to disappoint. Occasional seminars for ministers or permanent secretaries to acclimatise them to new thinking in nanotechnology or genomics are useful but hardly sufficient, when most of the real work of government is done at a far more junior level. This is why some advocate other, more systematic, approaches to complement what could be characterised as the “clever chap” theory of scientific advice.

First, these focus on depth and breadth: acclimatising officials and politicians at multiple levels, and from early on, to understanding science, data and evidence through training courses, secondments and simulations; influencing the media environment as much as insider decision making (since in practice this will often be decisive in determining whether advice is heeded); embedding scientists at more junior levels in policy teams; linking scientific champions in mutually supportive networks; and opening up more broadly the world of evidence and data so that it becomes as much part of the lifeblood of decision making as manifestos.

Here the crucial point is that the target should not just be the very top of institutions: the middle and lower layers will often be more important. A common optical mistake of eminent people in London is to overestimate the importance of the formal relative to the informal, the codified versus the craft.

Second, it’s vital to recognise that the key role of a scientific adviser is to act as an intermediary and broker rather than an adviser, and that consequently their skills need to be ones of translation, aggregation and synthesis as much as deep expertise. So if asked to assess the potential commercial implications of a new discovery such as graphene; the potential impact of a pandemic; or the potential harms associated with a new illegal drug, they need to mobilise diverse forms of expertise.

Their greatest influence may come if – dare I say it – they are good at empathising with ministers who never have enough time to understand or analyse before making decisions. Advisers who think that they are very clever while all around them are a bit thick, and that all the problems of the world would be solved if the thick listened to the clever, are liable to be disappointed.

(…) In optimistic moments, I hope that we are moving towards a period of more overtly experimentalist governance, where governments are willing to test their ideas out – to run RCTs and embed continuous learning and feedback into everything they do. Experimental government would certainly be better than government by instinct, government by intuition and government solely guided by ideology.

In such a context, the old model of a clever man given a desk in Whitehall, sitting in a corner writing memos may be even more anachronistic. We certainly need highly intelligent eminent experts to guide decisions. We need to pay more comprehensive and sophisticated attention not only to the supply of useful knowledge, but also to how that knowledge is used. By doing this, governments and advisers can make more informed decisions, fewer mistakes and respond better to the complex problems they face. But let’s be as serious in making use of the evidence about evidence, as we are about the evidence itself.

Highly recommended!

Nature: Time to confront academic fraud

One percent of published papers being fraudulent is much higher than I thought. Yes, these are just a couple of studies, but this certainly raises the question: what are appropriate countermeasures?

Considerable hard data have emerged on the scale of misconduct. A metastudy (D. Fanelli PLoS ONE 4, e5738; 2009) and a detailed screening of all images in papers accepted by The Journal of Cell Biology (M. Rossner The Scientist 20 (3), 24; 2006) each suggest that roughly 1% of published papers are fraudulent. That would be about 20,000 papers worldwide each year.

At the time of the Baltimore case, it was widely argued that research misconduct was insignificantly rare — and irrelevant to the progress of science, which would self-correct. Few senior scientists now believe that. They know that misconduct exists and that, unchecked, it can undermine public regard for science and scientists.
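The “about 20,000” figure is just the 1% estimate applied to the rough size of the annual literature; a quick back-of-envelope check (the two-million-papers-per-year figure is my round approximation, not a number from the Nature piece):

```python
# Back-of-envelope check of the figure quoted above.
papers_per_year = 2_000_000   # rough global publication rate (my assumption)
fraud_rate = 0.01             # ~1% of papers, per the studies cited
print(f"~{papers_per_year * fraud_rate:,.0f} potentially fraudulent papers per year")
# -> ~20,000, consistent with the Nature editorial's estimate
```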

 

False Positive Science

This science policy post by Roger Pielke Jr. is a gem. You’ll want to keep these principles in mind whenever you read new research press releases (much of the science reporting you read in the media is regurgitated press releases). Here’s an excerpt:

(…) The problem of “false positive science” is of course not limited to the discipline of psychology or even the social sciences. Simmons et al. provide several excellent empirical examples of how ambiguity in the research process leads to false positives and offer some advice for how the research community might begin to deal with the problem.

Writing at The Chronicle of Higher Education, Geoffrey Pullum says that a gullible and compliant media makes things worse:

Compounding this problem with psychological science is the pathetic state of science reporting: the problem of how unacceptably easy it is to publish total fictions about science, and falsely claim relevance to real everyday life.

Pullum provides a nice example of the dynamics discussed here in the recent case of the so-called “QWERTY effect”, which is also dissected here. On this blog I’ve occasionally pointed to silly science and silly reporting, as well as good science and good reporting — which on any given topic is all mixed up together.

When prominent members of the media take on an activist bent, the challenge is further compounded. Of course, members of the media are not alone in their activism through science. The combination of ambiguity, researcher interest in a significant result and research as a tool of activism makes sorting through the thicket of knowledge a challenge in the best of cases, and sometimes just impossible.

The practical conclusion to draw from Simmons et al. is that much of what we think we know based on conventional statistical studies published in the academic literature stands a good chance of just not being so — certainly more than the 5% used as the threshold for significance. Absent solid research, we simply can’t distinguish empirically between false and true positives, meaning that we apply other criteria, like political expediency. Knowing what to know turns out to be quite a challenge.
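Simmons et al.’s central point, that “researcher degrees of freedom” push the real false-positive rate well above the nominal 5%, is easy to demonstrate by simulation. Here is a minimal sketch of my own (not their code): test several outcome measures on data with no true effect and count how often at least one comes out “significant”.

```python
# Simulate how testing multiple outcomes on null data inflates false positives
# (my illustration of the Simmons et al. argument, not their code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_experiments, n_per_group, n_outcomes = 5000, 20, 4
false_positives = 0

for _ in range(n_experiments):
    # Two groups, several outcome measures, and no real effect anywhere.
    a = rng.normal(size=(n_per_group, n_outcomes))
    b = rng.normal(size=(n_per_group, n_outcomes))
    p_values = [stats.ttest_ind(a[:, k], b[:, k]).pvalue for k in range(n_outcomes)]
    if min(p_values) < 0.05:      # report whichever outcome happened to "work"
        false_positives += 1

print(f"false-positive rate: {false_positives / n_experiments:.1%} (nominal 5%)")
# With four independent outcomes the rate lands near 1 - 0.95**4, about 19%.
```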

Roger Pielke Jr.: US science and politics

Roger irritates advocates of all political leanings — because he keeps insisting on the facts. Such as this essay, which highlights just a few of the Bush/Obama administration cases of “political interference”:

Why have a number of areas of US science become so politicized?

One answer to this question is that those concerned about science in politics have ceded discussion of issues of science policy to the most overtly partisan, many of whom see science as nothing more than a convenient tool to extract political advantage. This dynamic manifests itself in the overwhelming selectivity of attention among those who purport to be concerned about science in politics.

Consider a few examples:

Remember when James Hansen was told that his access to the media would be limited and controlled by minders at NASA? Of course you do. It has been a talking point for years.

But what about when the Obama Administration recently muzzled scientists and other officials at the Department of Health and Human Services? If you frequent the science corner of the blogosphere you might have missed it (though if you visit the conservative-o-sphere you may have seen it). Here is what one long-time journalist said about the policy:

The new formal HHS Guidelines on the Provision of Information to the News Media represent, to this 36-year veteran of reporting FDA news, a Soviet-style power-grab. By requiring all HHS employees to arrange their information-sharing with news media through their agency press office, HHS has formalized a creeping information-control mechanism that informally began during the Clinton Administration and was accelerated by the Bush and Obama administrations.

AAAS? Chris Mooney? Crickets. Remember when the Bush Administration was accused of couching its ideological preferences in the name of science in order to prohibit research on stem cells? Well, of course you do.

But what about the Obama Administration’s hiding its decision to close Yucca Mountain behind science? As President Obama’s spokesman explained:

“I think what has taken Yucca Mountain off the table in terms of a long-term solution for a repository for our nuclear waste is the science. The science ought to make these decisions.”

Read the whole thing »

The Hartwell Paper: Oblique strategies

…in The Economist 11 May 2010 there’s a discussion of the Hartwell Paper:

(…) Where the Hartwell paper becomes controversial is in its approach to decarbonisation. The authors argue that the large emerging economies are clearly fuelling themselves with renewables and nuclear as well as, rather than instead of, fossil fuels, for various reasons, and that this will not change soon. Nor, they imply, should it. They argue that there is something wrong with a world in which carbon-dioxide levels are kept to 450 parts per million (a trajectory widely deemed compatible with a 2 degree cap on warming) but at the same time more than a billion of the poorest people are left without electricity, as in one much discussed scenario from the International Energy Agency.

Their oblique approach is to aim instead for a world with accessible, secure low cost energy for all. The hope, intuition or strategy at play here is that since fossil fuels cannot deliver such a world, its achievement will, in itself, bring about decarbonisation on a massive scale. Following a path stressing clean energy as a development issue provides a more pleasant journey to the same objective.

(…) The Hartwellites do not disagree with the science in general and certainly don’t think there is no reason to act. They simply doubt that action along this one axis (carbon-dioxide reduction) can ever be made politically compelling. Instead, their oblique strategies (…) are to concentrate on easy opportunities and efficiency, energy and dignity.

In the comments I found the following observation from one of our favorite energy policy analyst/observers, the pseudonymous “harrywr2“:

One of the problems in the ‘energy debate’ is that various institutions use the ‘average’ price of coal to decide which actions may or may not make ‘economic’ sense.

The world’s greatest pile of coal sits in Gillette, Wyoming, where one can show up with a pickup truck and get a ton of coal for $12. There aren’t any ‘alternative’ energy options available that will ever compete against $12/ton coal.

In the ‘real’ world, coal has to be shipped to a market. That $12/ton coal in Wyoming ends up costing $100/ton by the time it is put on a train, hauled over the rocky mountains, put on a boat and floated across the pacific to China.

The Copenhagen folks I suppose could point to the level of investment the Chinese are making in hydro, nuclear and wind and congratulate themselves on finally convincing the Chinese of the need to be ‘environmentally friendly’.

Or one could take another view and conclude that the Chinese calculated the cost of importing coal from Wyoming and decided that ‘alternative energy’ was cheaper and as a bonus they would be congratulated by the Copenhagen folks for finally becoming ‘environmentally conscious’.

If one believes the latter, then the ‘Hartwell’ focus makes more sense.

Global treaties to reduce CO2 emissions are only going to happen if they coincide with the goal of ‘cheap plentiful electricity for all’.

As Harry outlines, my shorthand of “cheaper than coal” can be misleading unless regionally nuanced. I think that hurdle is valid for most Chinese utility investment decisions – but obviously does not incentivize a Wyoming region utility to choose a low-carbon option.
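To make Harry’s point concrete, the gap between mine-mouth and delivered coal translates directly into generation cost. A rough sketch of the arithmetic follows; the heat content and plant efficiency are my round-number assumptions, not figures from the comment.

```python
# Rough fuel-cost arithmetic for coal-fired generation (round-number assumptions).
def fuel_cost_per_mwh(coal_price_per_ton, heat_content_gj_per_ton=20.0,
                      plant_efficiency=0.35):
    """Fuel cost in $/MWh of electricity for a given delivered coal price."""
    mwh_thermal_per_ton = heat_content_gj_per_ton / 3.6        # 1 MWh = 3.6 GJ
    mwh_electric_per_ton = mwh_thermal_per_ton * plant_efficiency
    return coal_price_per_ton / mwh_electric_per_ton

for price in (12, 100):   # mine-mouth Wyoming vs. delivered-to-Asia, per the comment
    print(f"${price}/ton coal -> roughly ${fuel_cost_per_mwh(price):.0f}/MWh in fuel cost")
```

At $12/ton the fuel cost is a few dollars per MWh, which nothing else touches; at $100/ton delivered, low-carbon alternatives start to look competitive, which is the regional nuance Harry is pointing at.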

The Top Ten Things Environmentalists Need to Learn

The first source I look at every day is my RSS feed for Steve Packard’s DepletedCranium.com – The Bad Science Blog. RSS is efficient, especially for a full-content RSS feed like Steve’s. The downside is that most RSS feeds expose only the most recent entries, in this case ten. So if a reader stumbles onto a prolific feed like DepletedCranium, it is easy to get so involved in the latest content that they forget to explore the blog back to time(0).
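For readers unfamiliar with the mechanics: a feed reader only sees whatever window of recent entries the publisher chooses to expose, so older posts never show up in it. A minimal sketch with the feedparser library; the feed URL is illustrative and assumes the standard WordPress feed path.

```python
# Minimal sketch: a feed exposes only its most recent entries.
# The URL is illustrative (standard WordPress feed path assumed).
import feedparser

feed = feedparser.parse("http://depletedcranium.com/feed/")
print(f"entries exposed by the feed: {len(feed.entries)}")   # often only ~10
for entry in feed.entries[:3]:
    print(entry.title, "->", entry.link)
```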

That is my lame excuse for missing Steve’s 2008 tutorial for environmentalists. I was alerted by the always reliable DV82XL, who mentioned that he often refers people to this post. So get on over there and read the whole thing. You will probably also want to link the post as a resource for your readers. Ensure your age-12+ children read and understand it as well. The post begins with this intro:

This came out a lot longer than I expected. However, this is also what is becoming an increasingly large portion of this website. Maintaining the environment is a critical issue, especially as evidence of accelerated global warming mounts and as energy becomes more of an issue than it has been in the recent past. Unfortunately, many of those who claim to be working for environmental improvements lack an understanding of a few basic concepts which are absolutely critical to accomplishing anything.

I often find myself in arguments over economics versus environmentalism. This becomes a very difficult situation because the immediate accusation is that I care only about money and need to realize that sacrifices must be made for the good of the planet. I am also told that wind or solar is the answer and the costs and reduction of energy output is acceptable. These ideas that it is okay or honorable to make such sacrifices are overly simplistic and lack a true understanding of the forces at work. To use a phrase I have come to like, they are “Not even wrong.”

Thus, the top ten list

I mention the age 12+ children because I can almost guarantee they are not learning any of this if they are attending a state school. As I write, there are 525 (!!!) comments on this post, many of them first-class. I’ve a LOT more reading to do.

Steve has continued to update this post with links to his newer related posts. I’m tempted to quote them, but it is best if you go straight to the source so you get the complete, up-to-date list.

Added (2/5/08):
Having gotten a lot of attention on this article I’ve added a couple of follow-up posts which relate to this and which I might suggest checking out. You may also want to check other parts of this blog filed under “environment”. Examples are:

Sources of Greenhouse Gas and a Quick Math Lesson

Greenpeace On Nuclear Science

What is Spent Fuel? – I’m most proud of this one as it addresses an issue most people know very little about. The issue of nuclear “waste” and methods for dealing with it.

The latter article on “spent” nuclear fuel (SNF) is well written and easy for the lay person to follow. The article has a one-paragraph summary of fast neutron reactors, but does not dwell on advanced reactor options like the IFR or LFTR. This is a nice, short primer to prep the reader to have a go at the latest MIT report on the nuclear fuel cycle. And please review the comments on this article. There I learned, thanks to DV82XL, that the Canadian CANDU reactors can make a valuable contribution to extracting value from the SNF of conventional PWRs:

Actually, even current CANDUs have a high enough neutron economy to burn fuel discharged from light water reactors. The DUPIC cycle is a research project presently being carried out co-operatively by Canada and Korea. It provides an alternative to chemical reprocessing. DUPIC stands for Direct Use of PWR Fuel in CANDU. In DUPIC, “spent” PWR fuel is first mechanically decladded and then treated by a dry oxidation-reduction process to remove the volatile fission products. The process yields a powder, which can then be pressed into pellets again. The process does not involve chemical separation of the uranium and plutonium, and so silences the proliferation concerns. This DUPIC fuel will typically have a total fissile content of about 1.5%, so cannot be used in PWRs.

However, the fissile content is certainly sufficient for use in CANDU, where in fact DUPIC fuel would yield about twice as much energy again as was produced in the original cycle in the PWR! The ideal synergism between CANDU and PWR: fuel is first burned in PWR, and then, instead of being thrown away, yields another two times as much energy in CANDU. Again, the total amount of spent fuel per unit of electricity is much reduced.

Of course then it could be reprocessed and the cycle run again.
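To put the 1.5% figure in context: CANDU reactors run on natural uranium at roughly 0.7% fissile, while light-water reactors need fresh fuel enriched to roughly 3–5%, which is why DUPIC fuel works in the former but not the latter. A small sketch of that comparison; the thresholds are typical textbook values I am assuming, not numbers from the comment.

```python
# Compare fissile fractions with rough reactor fuel requirements
# (threshold values are my textbook-level assumptions).
fissile_fraction = {
    "natural uranium (CANDU feed)": 0.007,    # ~0.7% U-235
    "DUPIC fuel (per comment above)": 0.015,  # ~1.5% total fissile
    "fresh PWR fuel": 0.040,                  # ~3-5% enriched U-235
}
thresholds = {"CANDU": 0.007, "PWR": 0.030}   # rough minimum usable fissile content

for fuel, f in fissile_fraction.items():
    usable = [reactor for reactor, minimum in thresholds.items() if f >= minimum]
    print(f"{fuel}: {f:.1%} fissile -> usable in {', '.join(usable) or 'neither'}")
```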

Much further along in the comments, DV82XL offered a very clear, concise explanation of why the meme that “terrorists can easily build a nuclear weapon from reactor-grade plutonium” is so silly. For my future reference, here is the reality of what a terrorist organization has to organize and achieve:

The technical problems confronting a terrorist organization considering the use of reactor-grade plutonium are not different in kind from those involved in using weapons-grade plutonium, only to a greater degree.

• Technical Personnel.

Competence and thorough understanding will be required in a wide range of technical specialties. These include: shock hydrodynamics, critical assemblies, chemistry, metallurgy, machining, electrical circuits, explosives, health physics, and others. At least several people who can work as a team will be needed. These will have to be carefully selected to ensure that all necessary skills are covered.

• Costs.

In addition to support for the personnel over a period adequate for planning, preparation and execution, a considerable variety of specialized equipment and instrumentation will be required, all or most of which need be obtained through controlled sources.

• Hazards.

Radiation, criticality, and materials handling all present potential hazards that will have to be foreseen and provided against.

• Detection.

Assuming the operation is contrary to the wishes of the local national authorities the organization must exercise all necessary precautions to avoid detection of their activities. They would no doubt be faced by a massive search operation employing the most sensitive detection equipment available once it should be known that someone had acquired a supply of material suitable for use as a weapon.

• Acquisition.

Very early in the planning and equipment procurement phase the organization will need information concerning the physical form and chemical state of the fissile material it will have to work with. This will be necessary before they can decide just what equipment they will need. The actual isotopic content of the material may be undetermined until it is acquired, making preplanning difficult. The actual acquisition would entail dealing with the problems and hazards that would be set by the safeguards and security authorities.

The point here being that this is a project that is unlikely to be within the grasp of a paranational organization at the best of times, and, given the poor performance of the one device the U.S. tested using this isotope, there is a very low likelihood of the device assembling properly when fired.

Ultimately, despite the fears of the West that such an attack may occur, the probability of one is vanishingly small – not when a semi or two filled with fertilizer and heating oil will yield a much greater explosion, more reliably and at a fraction of the cost.

A Guest Post: No Fluid Dynamicist Kings in Flight-Test

Captain Joshua Stults, an aeronautical engineer with the US Air Force, contributed a fascinating guest post to Roger Pielke Jr’s blog — offering some very useful clarification of the Honest Broker concepts.

(…) The value we brought (as we saw it), was that we were separate from the direct program office chain of command (so we weren’t advocates for their position), but we understood the technical details of the particular system, and we also understood the differing values of the folks in the Pentagon (which the folks in the program office loved to refuse to acknowledge as legitimate, sound familiar?). That position turns out to be a tough sell (program managers get offended if you seem to imply they are dishonest), so I can empathize with the virulent reaction Dr Pielke gets on applying the Honest Broker concepts to climate policy decision support. People love to take offense over their honor. That’s a difficult snare to avoid while you try to make clear that, while there’s nothing dishonest about advocacy, there remains significant value in honest brokering. Maybe Honest Broker wouldn’t be the best title to assume though. The first reaction out of a tight-fisted program manager would likely be “I’m honest, why do I need you?”

Enjoy.

Google: "Operation Aurora" attack

From the McAfee Security Insights Blog

(…) As I have written before, I believe this is the largest and most sophisticated cyberattack we have seen in years targeted at specific corporations. While the malware was sophisticated, we see lots of attacks that use complex malware combined with zero day exploits. What really makes this a watershed moment in cybersecurity is the targeted and coordinated nature of the attack, with the main goal appearing to be to steal core intellectual property.

The list of organizations reported to have been hit by the cyberattack continues to grow. As a result, many companies and governments are asking us how they can determine if they were targeted in the same sophisticated cyberattack that hit Google. The high profile cyberattack, linked to China by Google, targeted valuable intellectual property.

We’re also getting a lot of questions about the yet-to-be-patched vulnerability in Internet Explorer that was exploited in the cyberattack. That’s an important question as well, because Internet Explorer users currently face a real and present danger due to the public disclosure of the vulnerability and release of attack code, increasing the possibility of widespread attacks.

(…)

From the McAfee special page on Aurora

On January 14, 2010 McAfee Labs identified a zero-day vulnerability in Microsoft Internet Explorer that was used as an entry point for Operation Aurora to exploit Google and at least 20 other companies. Microsoft has since issued a security bulletin and patch.

Operation Aurora was a coordinated attack which included a piece of computer code that exploits the Microsoft Internet Explorer vulnerability to gain access to computer systems. This exploit is then extended to download and activate malware within the systems. The attack, which was initiated surreptitiously when targeted users accessed a malicious web page (likely because they believed it to be reputable), ultimately connected those computer systems to a remote server. That connection was used to steal company intellectual property and, according to Google, additionally gain access to user accounts. Learn more.
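The attack chain described above (a drive-by on a malicious page, an exploit, a malware download, then a callback to a remote server) is also what a defender would hunt for in web-proxy logs. Below is a generic sketch of that hunt; the log format, column names and indicator lists are hypothetical, not published Aurora indicators.

```python
# Generic log-hunting sketch for a drive-by-then-callback pattern.
# Log format, domains and indicator lists are hypothetical, not Aurora IOCs.
import csv
from collections import defaultdict

SUSPECT_PAGES = {"malicious-landing.example"}              # hypothetical drive-by hosts
KNOWN_GOOD = {"intranet.example", "update.vendor.example"} # hypothetical allow-list

def flag_hosts(proxy_log_path):
    """Return internal hosts that visited a suspect page and also contacted a
    destination not on the known-good list (a possible malware callback)."""
    visited, callbacks = set(), defaultdict(set)
    with open(proxy_log_path, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns: src_host, dest_host
            src, dest = row["src_host"], row["dest_host"]
            if dest in SUSPECT_PAGES:
                visited.add(src)
            elif dest not in KNOWN_GOOD:
                callbacks[src].add(dest)
    return {host: sorted(callbacks[host]) for host in visited if callbacks[host]}

# Usage (hypothetical file): flag_hosts("proxy.csv")
# -> {"workstation-17": ["203.0.113.9"], ...}
```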