Category Archives: Science Policy

Good advice for science advisors – from the book “Future Directions for Scientific Advice in Whitehall”

I was surprised to find a well-informed essay on science policy – in, of all places, the Guardian. At the end of the article I found out why this is such a good essay:

Geoff Mulgan is chief executive of Nesta, the UK’s innovation foundation. He is on Twitter @geoffmulgan. This article is from the book Future Directions for Scientific Advice in Whitehall (edited by Robert Doubleday and James Wilsdon) which is free to download here from 18 April 2013.

The essay is longish for sound reasons – here are some excerpts from the concluding paragraphs to motivate you to read the complete essay:

(…) Formal scientific knowledge sits alongside these other types of knowledge, but does not automatically trump the others. Indeed, a politician, or civil servant, who acted as if there was a hierarchy of knowledge with science sitting unambiguously at the top, would not last long. The consequence is that a scientist who can mobilise other types of knowledge on his or her side is likely to be more effective than one that cannot; for example, by highlighting the economic cost of future floods and their potential effect on political legitimacy, as well as their probability.

These points help to explain why the role of a chief scientific adviser can be frustrating. Simply putting an eminent scientist into a department may have little effect, if they don’t also know how to work the system, or how to mobilise a large network of contacts. Not surprisingly, many who aren’t well prepared for their roles as brokers, feel that they rattle around without much impact.

For similar reasons, some of the other solutions that have been used to raise the visibility and status of scientific advice have tended to disappoint. Occasional seminars for ministers or permanent secretaries to acclimatise them to new thinking in nanotechnology or genomics are useful but hardly sufficient, when most of the real work of government is done at a far more junior level. This is why some advocate other, more systematic, approaches to complement what could be characterised as the “clever chap” theory of scientific advice.

First, these focus on depth and breadth: acclimatising officials and politicians at multiple levels, and from early on, to understanding science, data and evidence through training courses, secondments and simulations; influencing the media environment as much as insider decision making (since in practice this will often be decisive in determining whether advice is heeded); embedding scientists at more junior levels in policy teams; linking scientific champions in mutually supportive networks; and opening up more broadly the world of evidence and data so that it becomes as much part of the lifeblood of decision making as manifestos.

Here the crucial point is that the target should not just be the very top of institutions: the middle and lower layers will often be more important. A common optical mistake of eminent people in London is to overestimate the importance of the formal relative to the informal, the codified versus the craft.

Second, it’s vital to recognise that the key role of a scientific adviser is to act as an intermediary and broker rather than an adviser, and that consequently their skills need to be ones of translation, aggregation and synthesis as much as deep expertise. So if asked to assess the potential commercial implications of a new discovery such as graphene; the potential impact of a pandemic; or the potential harms associated with a new illegal drug, they need to mobilise diverse forms of expertise.

Their greatest influence may come if – dare I say it – they are good at empathising with ministers who never have enough time to understand or analyse before making decisions. Advisers who think that they are very clever while all around them are a bit thick, and that all the problems of the world would be solved if the thick listened to the clever, are liable to be disappointed.

(…) In optimistic moments, I hope that we are moving towards a period of more overtly experimentalist governance, where governments are willing to test their ideas out – to run RCTs and embed continuous learning and feedback into everything they do. Experimental government would certainly be better than government by instinct, government by intuition and government solely guided by ideology.

In such a context, the old model of a clever man given a desk in Whitehall, sitting in a corner writing memos may be even more anachronistic. We certainly need highly intelligent eminent experts to guide decisions. We need to pay more comprehensive and sophisticated attention not only to the supply of useful knowledge, but also to how that knowledge is used. By doing this, governments and advisers can make more informed decisions, fewer mistakes and respond better to the complex problems they face. But let’s be as serious in making use of the evidence about evidence, as we are about the evidence itself.

Highly recommended!
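Mulgan’s call for “experimentalist governance” through RCTs is concrete enough to sketch. Below is a minimal, hypothetical illustration of the arithmetic behind a randomized policy pilot: a difference in means with a rough 95% interval. The scenario, the numbers, and the names are all mine, not from the essay.

```python
import random
import statistics

random.seed(42)

# Hypothetical pilot: 200 job-seekers randomized to a new training
# programme or the status quo; outcome is weeks until re-employment.
control = [random.gauss(20.0, 6.0) for _ in range(100)]
treated = [random.gauss(18.0, 6.0) for _ in range(100)]  # simulated true effect: -2 weeks

effect = statistics.mean(treated) - statistics.mean(control)

# Standard error of the difference in means, then a rough 95% interval.
se = (statistics.variance(treated) / len(treated)
      + statistics.variance(control) / len(control)) ** 0.5
low, high = effect - 1.96 * se, effect + 1.96 * se

print(f"estimated effect: {effect:+.2f} weeks (95% CI {low:+.2f} to {high:+.2f})")
```

A real trial analysis would add pre-registration, covariate adjustment and multiple-comparison handling; the point is only that the core evaluation step is small and mechanical, well within the reach of a mid-level policy team.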

Nature: Time to confront academic fraud

A one percent rate of fraudulent papers is much higher than I thought. Yes, it is just one study, but it certainly raises the question: what are the appropriate countermeasures?

Considerable hard data have emerged on the scale of misconduct. A metastudy (D. Fanelli PLoS ONE 4, e5738; 2009) and a detailed screening of all images in papers accepted by The Journal of Cell Biology (M. Rossner The Scientist 20 (3), 24; 2006) each suggest that roughly 1% of published papers are fraudulent. That would be about 20,000 papers worldwide each year. At the time of the Baltimore case, it was widely argued that research misconduct was insignificantly rare — and irrelevant to the progress of science, which would self-correct. Few senior scientists now believe that. They know that misconduct exists and that, unchecked, it can undermine public regard for science and scientists.
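The 20,000 figure is simple arithmetic on the quoted 1% rate; here is a back-of-envelope check, assuming roughly 2 million papers published worldwide per year (the denominator implied by the quote, not stated in it):

```python
# Implied arithmetic behind the quote: ~1% of ~2 million papers per year.
papers_per_year = 2_000_000   # assumed global publication rate
fraud_rate = 0.01             # ~1% from the Fanelli and Rossner studies

fraudulent = papers_per_year * fraud_rate
print(f"{fraudulent:,.0f} fraudulent papers per year")  # prints: 20,000 fraudulent papers per year
```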

 

False Positive Science

This science policy post by Roger Pielke Jr. is a gem. You’ll want to keep these principles in mind whenever you read new research press releases (much of the science reporting you read in the media is regurgitated press releases). Here’s an excerpt:

(…) The problem of “false positive science” is of course not limited to the discipline of psychology or even the social sciences. Simmons et al. provide several excellent empirical examples of how ambiguity in the research process leads to false positives and offer some advice for how the research community might begin to deal with the problem.

Writing at The Chronicle of Higher Education, Geoffrey Pullum says that a gullible and compliant media makes things worse:

Compounding this problem with psychological science is the pathetic state of science reporting: the problem of how unacceptably easy it is to publish total fictions about science, and falsely claim relevance to real everyday life.

Pullum provides a nice example of the dynamics discussed here in the recent case of the so-called “QWERTY effect” which is also dissected here. On this blog I’ve occasionally pointed to silly science and silly reporting, as well as good science and good reporting — which on any given topic is all mixed up together.

When prominent members of the media take on an activist bent, the challenge is further compounded. Of course, members of the media are not alone in their activism through science. The combination of ambiguity, researcher interest in a significant result and research as a tool of activism makes sorting through the thicket of knowledge a challenge in the best of cases, and sometimes just impossible.

The practical conclusion to draw from Simmons et al. is that much of what we think we know based on conventional statistical studies published in the academic literature stands a good chance of just not being so — certainly more than the 5% threshold used as a threshold for significance. Absent solid research, we simply can’t distinguish empirically between false and true positives, meaning that we apply other criteria, like political expediency. Knowing what to know turns out to be quite a challenge.
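A minimal simulation of one of the “researcher degrees of freedom” Simmons et al. describe: measuring two outcomes and reporting whichever comes out significant. The null hypothesis is true throughout, yet the reported false-positive rate climbs well above the nominal 5%. All numbers here are illustrative, not drawn from their paper.

```python
import random
import statistics

random.seed(0)

def significant(a, b):
    """Two-sample test: |z| > 1.96 on the difference in means."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return abs(statistics.mean(a) - statistics.mean(b)) / se > 1.96

def simulate(n_studies=2000, n=30):
    one_outcome = flexible = 0
    for _ in range(n_studies):
        # Null is true: both groups, on both outcome measures, drawn from N(0, 1).
        a1 = [random.gauss(0, 1) for _ in range(n)]
        b1 = [random.gauss(0, 1) for _ in range(n)]
        a2 = [random.gauss(0, 1) for _ in range(n)]
        b2 = [random.gauss(0, 1) for _ in range(n)]
        s1 = significant(a1, b1)
        s2 = significant(a2, b2)
        one_outcome += s1          # honest: one pre-registered outcome
        flexible += (s1 or s2)     # flexible: report either outcome if significant
    return one_outcome / n_studies, flexible / n_studies

honest, flexible = simulate()
print(f"single outcome: {honest:.3f}, either of two outcomes: {flexible:.3f}")
```

With just one extra outcome measure the false-positive rate roughly doubles; Simmons et al. show that stacking a few such choices (optional stopping, covariates, dropping conditions) pushes it far higher.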

Roger Pielke Jr.: US science and politics

Roger irritates advocates of all political leanings — because he keeps insisting on the facts. Such as this essay, which highlights just a few of the Bush/Obama administration cases of “political interference”:

Why have a number of areas of US science become so politicized?

One answer to this question is that those concerned about science in politics have ceded discussion of issues of science policy to the most overtly partisan, many of whom see science as nothing more than a convenient tool to extract political advantage. This dynamic manifests itself in the overwhelming selectivity of attention among those who purport to be concerned about science in politics.

Consider a few examples:

Remember when James Hansen was told that his access to the media would be limited and controlled by minders at NASA? Of course you do. It has been a talking point for years.

But what about when the Obama Administration recently muzzled scientists and other officials at the Department of Health and Human Services? If you frequent the science corner of the blogosphere you might have missed it (though if you visit the conservative-o-sphere you may have seen it). Here is what one long-time journalist said about the policy:

The new formal HHS Guidelines on the Provision of Information to the News Media represent, to this 36-year veteran of reporting FDA news, a Soviet-style power-grab. By requiring all HHS employees to arrange their information-sharing with news media through their agency press office, HHS has formalized a creeping information-control mechanism that informally began during the Clinton Administration and was accelerated by the Bush and Obama administrations.

AAAS? Chris Mooney? Crickets. Remember when the Bush Administration was accused of couching its ideological preferences in the name of science in order to prohibit research on stem cells? Well, of course you do.

But what about the Obama Administration’s hiding its decision to close Yucca Mountain behind science? As President Obama’s spokesman explained:

“I think what has taken Yucca Mountain off the table in terms of a long-term solution for a repository for our nuclear waste is the science. The science ought to make these decisions.”

Read the whole thing »

The Top Ten Things Environmentalists Need to Learn

The first source I look at every day is my RSS feed for Steve Packard’s DepletedCranium.com – The Bad Science Blog. RSS is efficient, especially for a full-content feed like Steve’s. The downside is that most RSS feeds carry only the most recent entries, in this case ten. So a reader who stumbles onto a prolific feed like DepletedCranium can easily get so absorbed in the latest content that he forgets to explore the blog back to time(0).

That is my lame excuse for missing Steve’s 2008 tutorial for environmentalists. I was alerted by the always reliable DV82XL, who mentioned that he often refers people to this post. So get on over there and read the whole thing. You will probably also want to link the post as a resource for your readers. Make sure your age 12+ children read and understand it. It begins with this intro:

This came out a lot longer than I expected. However, this is also what is becoming an increasingly large portion of this website. Maintaining the environment is a critical issue especially as evidence of accelerated global warming mounts and as energy becomes more of an issue than it has in recent past. Unfortunately, many of those who claim to be working for environmental improvements lack an understanding of a few basic concepts which are absolutely critical to accomplishing anything.

I often find myself in arguments over economics versus environmentalism. This becomes a very difficult situation because the immediate accusation is that I care only about money and need to realize that sacrifices must be made for the good of the planet. I am also told that wind or solar is the answer and the costs and reduction of energy output is acceptable. These ideas that it is okay or honorable to make such sacrifices are overly simplistic and lack a true understanding of the forces at work. To use a phrase I have come to like, they are “Not even wrong.”

Thus, the top ten list

I mention the age 12+ children because I can almost guarantee they are not learning any of this if they are attending a state school. As I write, there are 525 (!!!) comments on this post, many of them first-class. I’ve a LOT more reading to do.

Steve has continued to update this post with links to his newer related posts. I’m tempted to quote them, but it is best if you go straight to the source so you get the complete, up-to-date list.

Added (2/5/08):
Having gotten a lot of attention on this article I’ve added a couple of follow-up posts which related to this and which I might suggest checking out. You may also want to check other parts of this blog filed under “environment”. Examples are:

Sources of Greenhouse Gas and a Quick Math Lesson

Greenpeace On Nuclear Science

What is Spent Fuel? – I’m most proud of this one as it addresses an issue most people know very little about. The issue of nuclear “waste” and methods for dealing with it.

The latter article on “spent” nuclear fuel (SNF) is well written and easy for the lay person to follow. It has a one-paragraph summary of fast neutron reactors, but does not dwell on advanced reactor options like the IFR or LFTR. This is a nice, short primer to prep the reader to have a go at the latest MIT report on the nuclear fuel cycle. And please review the comments to this article. There I learned, thanks to DV82XL, that the Canadian CANDU reactors can make a valuable contribution to value-extraction from the SNF of conventional PWRs:

Actually, even current CANDUs have a high enough neutron economy to burn fuel discharged from light water reactors. The DUPIC cycle is a research project presently being carried out co-operatively by Canada and Korea. It provides an alternative to chemical reprocessing. DUPIC stands for Direct Use of PWR Fuel in CANDU. In DUPIC, “spent” PWR fuel is first mechanically decladded and then treated by a dry oxidation-reduction process to remove the volatile fission products. The process yields a powder, which can then be pressed into pellets again. The process does not involve chemical separation of the uranium and plutonium, and so silences the proliferation concerns. This DUPIC fuel will typically have a total fissile content of about 1.5%, so cannot be used in PWRs.

However, the fissile content is certainly sufficient for use in CANDU, where in fact DUPIC fuel would yield about twice as much energy again as was produced in the original cycle in the PWR! The ideal synergism between CANDU and PWR: fuel is first burned in PWR, and then, instead of being thrown away, yields another two times as much energy in CANDU. Again, the total amount of spent fuel per unit of electricity is much reduced.

Of course then it could be reprocessed and the cycle run again.
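DV82XL’s “per unit of electricity” claim follows from simple bookkeeping. Here is a sketch that takes the quoted “twice as much energy again” at face value; the normalization and the resulting one-third figure are my arithmetic, not from the comment:

```python
# Energy bookkeeping implied by the DUPIC comment: the same fuel mass
# yields E in the PWR cycle, then roughly 2E more when reused in a CANDU.
pwr_energy = 1.0                     # energy from the original PWR cycle (normalized)
candu_energy = 2.0 * pwr_energy      # "twice as much energy again" per the comment

total_energy = pwr_energy + candu_energy
spent_fuel_per_unit = 1.0 / total_energy  # same fuel mass, three times the energy

print(f"spent fuel per unit of electricity falls to {spent_fuel_per_unit:.2f} "
      f"of the PWR-only case")  # prints: ... falls to 0.33 of the PWR-only case
```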

Much further along in the comments, DV82XL offered a very clear, concise explanation of why the meme that “terrorists can easily build a nuclear weapon from reactor-grade plutonium” is so silly. For my future reference, here is the reality of what a terrorist organization would have to organize and achieve:

The technical problems confronting a terrorist organization considering the use of reactor-grade plutonium are not different in kind from those involved in using weapons-grade plutonium, only to a greater degree.

• Technical Personnel.

Competence and thorough understanding will be required in a wide range of technical specialties. These include: shock hydrodynamics, critical assemblies, chemistry, metallurgy, machining, electrical circuits, explosives, health physics, and others. At least several people who can work as a team will be needed. These will have to be carefully selected to ensure that all necessary skills are covered.

• Costs.

In addition to support for the personnel over a period adequate for planning, preparation and execution, a considerable variety of specialized equipment and instrumentation will be required, all or most of which need be obtained through controlled sources.

• Hazards.

Dealing with radiation, criticality, and the handling of explosives all present potential hazards that will have to be foreseen and provided against.

• Detection.

Assuming the operation is contrary to the wishes of the local national authorities the organization must exercise all necessary precautions to avoid detection of their activities. They would no doubt be faced by a massive search operation employing the most sensitive detection equipment available once it should be known that someone had acquired a supply of material suitable for use as a weapon.

• Acquisition.

Very early in the planning and equipment procurement phase the organization will need information concerning the physical form and chemical state of the fissile material it will have to work with. This will be necessary before they can decide just what equipment they will need. The actual isotopic content of the material may be undetermined until it is acquired, making preplanning difficult. The actual acquisition would entail dealing with the problems and hazards that would be set by the safeguards and security authorities.

The point here being that this is a project that is unlikely to be within the grasp of a paranational organization, at the best of times, and given the poor performance of the one device that was tested by the U.S. using this isotope, there is a very low likelihood of the device assembling properly when fired.

Ultimately, despite the fears of the West that such an attack may occur, the probability of one is vanishingly small – not when a semi or two filled with fertilizer and heating oil will yield a much greater explosion, more reliably and at a fraction of the cost.

A Guest Post: No Fluid Dynamicist Kings in Flight-Test

Captain Joshua Stults, an aeronautical engineer with the US Air Force, contributed a fascinating guest post to Roger Pielke Jr’s blog — offering some very useful clarification of the Honest Broker concepts.

(…) The value we brought (as we saw it), was that we were separate from the direct program office chain of command (so we weren’t advocates for their position), but we understood the technical details of the particular system, and we also understood the differing values of the folks in the Pentagon (which the folks in the program office loved to refuse to acknowledge as legitimate, sound familiar?). That position turns out to be a tough sell (program managers get offended if you seem to imply they are dishonest), so I can empathize with the virulent reaction Dr Pielke gets on applying the Honest Broker concepts to climate policy decision support. People love to take offense over their honor. That’s a difficult snare to avoid while you try to make clear that, while there’s nothing dishonest about advocacy, there remains significant value in honest brokering. Maybe Honest Broker wouldn’t be the best title to assume though. The first reaction out of a tight-fisted program manager would likely be “I’m honest, why do I need you?”

Enjoy.

Google: "Operation Aurora" attack

From the McAfee Security Insights Blog

(…) As I have written before, I believe this is the largest and most sophisticated cyberattack we have seen in years targeted at specific corporations. While the malware was sophisticated, we see lots of attacks that use complex malware combined with zero day exploits. What really makes this a watershed moment in cybersecurity is the targeted and coordinated nature of the attack, with the main goal appearing to be to steal core intellectual property.

The list of organizations reported to have been hit by the cyberattack continues to grow. As a result, many companies and governments are asking us how they can determine if they were targeted in the same sophisticated cyberattack that hit Google. The high profile cyberattack, linked to China by Google, targeted valuable intellectual property.

We’re also getting a lot of questions about the yet-to-be-patched vulnerability in Internet Explorer that was exploited in the cyberattack. That’s an important question as well, because Internet Explorer users currently face a real and present danger due to the public disclosure of the vulnerability and release of attack code, increasing the possibility of widespread attacks.

(…)

From the McAfee special page on Aurora

On January 14, 2010 McAfee Labs identified a zero-day vulnerability in Microsoft Internet Explorer that was used as an entry point for Operation Aurora to exploit Google and at least 20 other companies. Microsoft has since issued a security bulletin and patch.

Operation Aurora was a coordinated attack which included a piece of computer code that exploits the Microsoft Internet Explorer vulnerability to gain access to computer systems. This exploit is then extended to download and activate malware within the systems. The attack, which was initiated surreptitiously when targeted users accessed a malicious web page (likely because they believed it to be reputable), ultimately connected those computer systems to a remote server. That connection was used to steal company intellectual property and, according to Google, additionally gain access to user accounts.

Stewart Brand's Four Camps

This is a great post by Roger Pielke Jr. because it reminds us all of Pielke’s 2005 taxonomy of the climate debate.

In the NYT today Stewart Brand explains that the climate debate really has four — not two — different poles. He confuses me and my father as an example of a “skeptic” (he refers to my father, a climate scientist, but then cites my research on IPCC scenarios). While it is nice to see a little nuance creep into the debate, the fatal flaw in Brand’s taxonomy is that it defines its ordering with respect to views on science. The climate debate has much more nuance among people who share the same views on the science, so I find Brand’s taxonomy a bit simplistic.

In 2005, I blogged my own taxonomy of the debate.

Here you’ll want to read the original taxonomy…

Climate change e-mail scandal underscores myth of pure science

Arizona State professor Daniel Sarewitz has long been a reliable Seekerblog source on issues of science and technology policy. Here is Sarewitz, with Samuel Thernstrom, in an LA Times op-ed:

The East Anglia controversy serves as a reminder that when the politics are divisive and the science is sufficiently complex, the boundary between the two may become indiscernible.

(…) We do not believe the East Anglia e-mails expose a conspiracy that invalidates the larger body of evidence demonstrating anthropogenic warming; nevertheless, the damage to public confidence in climate science, particularly among Republicans and independents, may be enormous. The terrible danger — one that has been brewing for years — is that the invaluable role science should play in informing policy and politics will be irrevocably undermined, as citizens come to see science as nothing more than a tool for partisans of all stripes.

(…) Moreover, problems such as climate change are much more scientifically complex than determining the charge on an electron or even the structure of DNA. The research deals not with building blocks of nature but with dynamic systems that are inherently uncertain, unpredictable and complex. Such science is often not subject to replicable experiments or verification; rather, knowledge and insight emerge from the weight of theory, data and evidence, usually freighted with considerable uncertainty, disagreement and internal contradiction.

Thus, we write neither to attack nor to defend the East Anglia scientists, but to make clear that the ideal of pure science as a source of truth that can cut through politics is false. The authority of pure science is a two-edged sword, and it cuts deeply in both directions in the climate debate: For those who favor action, the myth of scientific purity confers unique legitimacy upon the evidence they bring to political debates. And for those who oppose action, the myth provides a powerful foundation for counterattack whenever deviations from the unattainable ideal come to light.

(…) The real scandal illustrated by the e-mails is not that scientists tried to undermine peer review, fudge and conceal data, and torpedo competitors, but that scientists and advocates on both sides of the climate debate continue to claim political authority derived from a false ideal of pure science. This charade is a disservice to both science and democracy. To science, because the reality cannot live up to the myth; to democracy, because the difficult political choices created by the genuine but also uncertain threat of climate change are concealed by the scientific debate.

What is the solution? Let politics do its job; indeed, demand it.

We do not believe that climate change is merely a Trojan horse for a Democratic dream of destroying global capitalism. Nor do we believe that Republicans are so bent on maximizing the profits of the fossil fuel industry that they are choosing to consign their grandchildren to a ruined world. Yet these are only slight caricatures of the fantasies that each side cherishes about the other because the true complexity of the climate debate has been camouflaged by the myth of pure, disinterested science.

That myth has allowed politicians to shirk their responsibility to be clear about the values, interests and beliefs that underpin their preferences and choices about science and policy. (…)

Please continue reading…