CGEP Discussion on Nuclear Technology and Policy

On April 10, 2015 the Columbia University Center on Global Energy Policy hosted a “Discussion on Nuclear Technology and Policy.” The CGEP panel:

Tom Blees, President, The Science Council for Global Initiatives;
Travis Bradford, Associate Professor of Practice in International and Public Affairs; Director, Energy and Environment Concentration, Columbia SIPA;
Eric Loewen, Chief Consulting Engineer, GE Hitachi Nuclear Energy; and,
Robert Stone, Director, Pandora’s Promise.

There is a lot of well-informed discussion – I recommend the 90-minute video. Around 1:04 Robert Stone was asked to comment on current public attitudes towards nuclear power. He replied that at screenings where he was present, “the response was overwhelming support, over 90% in favor of what I’m saying in the film.” At 1:06 Robert goes into the exceptions to this positive outlook. Following is a loose partial transcript:

Surprisingly, audiences in Europe are still infused with this idea that Chernobyl killed hundreds of thousands of people. There are continual documentaries on television about that.

(…snip…) Probably the most controversial and shocking aspect of the film was what the World Health Organization has reported after years and years of study. WHO has published that substantially less than 100 people have had their lives shortened by the Chernobyl accident.

The mayor of the town where 50,000 people were relocated from Chernobyl asked me to bring the film. They were so grateful for the film because there is this perception that we all have two-headed babies, that we are all dying of cancer. They said no documentary filmmaker has ever talked to them or visited them.

Europe: there have been so many EU TV documentaries claiming great damage/death caused by Chernobyl – and more that talked about Fukushima in the same way. No European broadcaster has shown Pandora’s Promise. 

They said we can’t show your film because it contradicts all the films that we have produced. They can’t both be true. It will undermine our credibility with our audience.

DWD Driving While Distracted: Google skips auto-pilot, goes for fully self-driving vehicles

At any given daylight moment across America, approximately 660,000 drivers are using cell phones or manipulating electronic devices while driving, a number that has held steady since 2010. — Key Facts and Statistics

“I think it’s wonderful that Tesla has gone out there with this technology, but they might have hyped Autopilot a little bit too much. It doesn’t work in all circumstances. Drivers don’t necessarily know when the car goes from tracking fine to a gray area when the car is confused, and then to a situation when the car doesn’t know where it’s going. These things aren’t well-defined.” —  Alain Kornhauser, director of the transportation program at Princeton University

I’ve been puzzling over the question of how Tesla-type auto-pilot systems are really going to work in the real world. That’s the world where drivers are likely to turn over way too much responsibility to the auto-pilot. Drivers are already frequently distracted while they are theoretically in control. Google’s test data makes it very clear that drivers will over-trust the auto-pilot brain, and therefore fail to take control fast enough, and with enough context awareness, to become the pilot. I’ve found the monthly reports from the Google Self-Driving Car Project to be a superb research source: what is working, what isn’t working, what probably won’t ever work. Read their latest October 2015 report: “Why we’re aiming for fully self-driving vehicles”.

As we see more cars with semi-autonomous features on the roads, we’re often asked why we’re aiming for fully autonomous vehicles. To be honest, we didn’t always have this as our plan.

In the fall of 2012, our software had gotten good enough that we wanted to have people who weren’t on our team test it, so we could learn how they felt about it and if it’d be useful for them. We found volunteers, all Google employees, to use our Lexus vehicles on the freeway portion of their commute. They’d have to drive the Lexus to the freeway and merge on their own, and then they could settle into a single lane and turn on the self-driving feature. We told them this was early stage technology and that they should pay attention 100% of the time — they needed to be ready to take over driving at any moment. They signed forms promising to do this, and they knew they’d be on camera.

We were surprised by what happened over the ensuing weeks. On the upside, everyone told us that our technology made their commute less stressful and tiring. One woman told us she suddenly had the energy to exercise and cook dinner for her family, because she wasn’t exhausted from fighting traffic. One guy originally scoffed at us because he loved driving his sports car — but he also enjoyed handing the commute tedium to the car.

But we saw some worrying things too. People didn’t pay attention like they should have. We saw some silly behavior, including someone who turned around and searched the back seat for his laptop to charge his phone — while travelling 65mph down the freeway! We saw human nature at work: people trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax.

We did spend some time thinking about ways we could build features to address what is often referred to as “The Handoff Problem”– keeping drivers engaged enough that they can take control of driving as needed. The industry knows this is a big challenge, and they’re spending lots of time and effort trying to solve this. One study by the Virginia Tech Transportation Institute found that drivers required somewhere between five and eight seconds to safely regain control of a semi-autonomous system. In a NHTSA study published in August 2015, some participants took up to 17 seconds to respond to alerts and take control of the vehicle — in that time they’d have covered more than a quarter of a mile at highway speeds. There’s also the challenge of context — once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?

In the end, our tests led us to our decision to develop vehicles that could drive themselves from point A to B, with no human intervention. (We were also persuaded by the opportunity to help everyone get around, not just people who can drive.) Everyone thinks getting a car to drive itself is hard. It is. But we suspect it’s probably just as hard to get people to pay attention when they’re bored or tired and the technology is saying “don’t worry, I’ve got this…for now.”
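The takeover numbers in the report are worth checking for yourself. A quick back-of-envelope sketch (mine, not Google’s) shows how far a car travels during that 17-second worst case:

```python
# How far does a car travel at highway speed while a distracted driver
# takes up to 17 seconds to respond? (My sanity check of the figures
# quoted from the NHTSA study, not part of the Google report.)

def distance_traveled_miles(speed_mph: float, seconds: float) -> float:
    """Miles covered while a driver takes `seconds` to regain control."""
    return speed_mph * seconds / 3600.0  # mph times hours elapsed

# NHTSA worst case: 17 seconds to respond, at a 65 mph highway speed
miles = distance_traveled_miles(65, 17)
print(f"{miles:.2f} miles")  # just over 0.3 miles -- "more than a quarter of a mile"
```

That is roughly five football fields covered blind, which is the whole argument for skipping the handoff entirely.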

Regulators: please let us have our robocars fast. They won’t be flawless, but they will be much safer than the meatware currently piloting lethal steel weapons around our cities. 

3 Weak Arguments Against Self-Driving Cars Totally Miss the Point. Let’s not get distracted from the main goal here. In the hour you spend debating the “moral dilemma” at a dinner party, almost 300 people will be killed somewhere by driver error [WHO Road Traffic Deaths].

Homeopathy: Dead Sharks and Hufflepuff Meds

Happiness is waking up to a fresh Ecomodernist Mom post. However, today’s topic is definitely not a happy one. Amy’s topic is quackery – homeopathy in particular.

As far as I’m concerned, homeopathy falls into the same category as all unproven supplements and herbal “medicine” and everything else under the alternative umbrella. I think it’s all bullshit but I’m not trying to ban it. If people want to buy magic water and pretend it does something, then by all means, they should knock themselves out.  But why not label it correctly?

Amy knows a great deal about the suffering that quacks can inflict on patients like her mom.

Mary Mangan on the NEJM Fail — the Benbrook & Landrigan Non-Disclosures


Dear Reader, if you are a subscriber to The New England Journal of Medicine then I hope you will take the time to carefully read the preceding guest post by Dr. Mary Mangan. I anticipate that you will come to the same conclusion I did – that NEJM must, at the very least, publish a correction to the Benbrook & Landrigan article. And on the NEJM website the article should be prepended with a full disclosure of the blatant conflicts of interest that were not disclosed by the authors. I also hope that subscribers to NEJM will take the trouble to explain their strong feelings to the editors.

In the media there is clearly a “double standard” applied to discussions on genetic engineering – particularly in agriculture. The tacit assumption of the reporting journalist or editor is that the person taking the anti-GMO position is the innocent “good guy environmentalist”, who is just trying to inform the public of the dangers of this technology. Conversely, the person who discusses GMO risk v. benefits is presumed to be beholden to corporate interests. The code word “Monsanto” is usually considered sufficient to terminate rational discussion.

This specific case of the NEJM publication is an example of an article carefully crafted to appear to be a scientific reference for contrary-to-fact claims that herbicide-resistant crops are increasing the negative impact of pesticides. The trick the authors use is to tabulate only the time series of the total mass of the herbicide glyphosate applied. Not discussed are the important benefits of glyphosate: that in recommended application regimes it is one of the least toxic herbicides, that it is replacing significantly more toxic chemicals, and that it has enabled a much-increased conversion of acreage to environmentally beneficial “no-till” farming.

So the conclusions of the Benbrook & Landrigan article are exactly backwards from the perspective of a scientist objectively evaluating the before-and-after of the introduction of the glyphosate-resistance trait.

What would motivate Benbrook & Landrigan to publish such nonsense? Well, particularly in the case of Benbrook, that is his job: to provide the “scientific references” to counter the conclusions of all the world’s major scientific institutions that GMO crops are as safe as non-GMO alternatives, good for the environment, and good for farmers.

I appreciate the significant effort Dr. Mangan invested to carefully document what Benbrook & Landrigan should have disclosed to NEJM. My reading of the evidence is that, had they disclosed it, NEJM would have declined to publish this piece.

For an overview of the evidence on pesticides before/after GMO introduction, please read Dr. Steve Savage’s When Increased Pesticide Use Is A Good Thing, which concludes with this:

To reiterate, pesticide use or its increase are not automatically undesirable things. It depends on what is the alternative and what is the nature of the particular pesticide in question. Plant biotechnology is just one important tool in the bigger tool box of agriculture. Sometimes it allows farmers to use a more attractive pesticide option (Bt Sweet Corn would be the best example of this). Sometimes it helps them with the adoption of sustainable practices that depend on relatively low risk herbicides. For farmers, biotechnology and pesticides are not an either/or. They are often partners.

Dr. Savage has published many articles on the complex subject of agricultural pests, IPM (Integrated Pest Management) and pesticides. Another useful perspective is The Muddled Debate About Pesticide Use And GM Crops:

Bottom line, a biotechnology trait may decrease or increase the need for a pesticide. There will also be many cases where the biotech trait has nothing to do with pesticide use. There is no necessary good or bad linkage between these two categories of agricultural technology – both can serve to make crop production better. Both are options that should be available to those who farm.

New England Journal of Medicine ignored Chuck Benbrook’s failure to disclose organic conflicts of interests

This is a guest post by Mary Mangan.

(This post first appeared on Genetic Literacy Project | October 20, 2015)

The New England Journal of Medicine, frequently referenced as NEJM, has a long history and important place in the sphere of American medical practice.

During the recent 200th anniversary celebration, a special issue highlighted the changes in the challenges that patients and medical practitioners have faced over the centuries. Some threats, like smallpox, had been completely vanquished during this time thanks to vaccines. Other issues have changed in proportion, as so many of us live longer and our world has changed in remarkable ways. This piece, The Burden of Disease and the Changing Task of Medicine, offers a fascinating interactive graphic of the causes of death today versus those in the past.

Access to new technology has certainly played a part in reducing the threats that used to plague us. Nobody calls for us to return to a time when diabetes killed children. Production of insulin by genetically modified bacteria has changed the course of this disease completely. And there is also hope that new technology and strategies can continue to prevent and reduce the impact of this condition going forward.

A journal with such prestige and position also has a great responsibility to influence the practice and policies of the medical community and government regulators. As noted in the anniversary collection, the roles of medical journals remain crucial:

Journals don’t simply disseminate new knowledge about medical theory and practice. They also define the scope of medical concerns and articulate norms for physicians’ professional and social roles. Simultaneously, they work to preserve their reputation, financial stability, and editorial independence in a constantly changing publishing environment, amid an avalanche of medical information.

In an age of changing media especially, a medical journal must maintain basic principles of adherence to quality scientific information, and must navigate the minefields of potential conflicts of interest that practitioners of science or influencers of policy might have, in order to remain a trustworthy source of information and policy guidance. For example, if a new treatment for Ebola were developed, it would be important to disclose whether funding for the studies was provided by the treatment developer. The science should still be evaluated on its merits, but journal editors require this information to accompany the published work in their journals. The International Committee of Medical Journal Editors (ICMJE) provides guidance and forms for this purpose for all authors in this sphere. Responsibly, NEJM discussed and formulated policies for dealing with conflicts of interest many years ago. Presciently, it stated in 1984:

We will therefore suggest to our authors that they routinely acknowledge in a footnote all funding sources supporting their submitted work. Likewise, any relevant direct business associations should also be acknowledged, such as employment by a corporation that has a financial interest in the work being reported.

Despite having clear policy guidelines in place, a recent opinion piece published in the NEJM by a physician and an economist focusing on the controversial issue of GMOs and pesticides escaped the proper scrutiny. The perspective piece, published on August 20, 2015 attempted to guide public policy but it failed to disclose the possible influencers or conflicts of interests of the authors. Attempts to address this with the journal’s editor, and inquiries to the ICMJE, were unsuccessful. The refusal to examine the evidence and to rectify this undermines the credibility of the journal and would be a perplexing precedent.

No conflicts acknowledged

“GMOs, Herbicides, and Public Health” by Philip J. Landrigan, M.D., and Charles Benbrook, Ph.D., asked the FDA to alter its policy on genetically modified food labeling because of what they say are two significant new developments.

First, there have been sharp increases in the amounts and numbers of chemical herbicides applied to GM crops, and still further increases—the largest in a generation—are scheduled to occur in the next few years. Second, the International Agency for Research on Cancer (IARC) has classified glyphosate, the herbicide most widely used on GM crops, as a “probable human carcinogen” and classified a second herbicide, 2,4-dichlorophenoxyacetic acid (2,4-D), as a “possible human carcinogen.”

Numerous researchers with experience in this arena have challenged the factual contentions in the article. That’s not the focus of this article; everyone has a right to express their opinion, and the authors are well known in their fields. My concern is the disclosure of potential conflicts of interest, so the reader is fully informed about potential biases and can therefore better evaluate both the substance and context of the arguments.

Original disclosure documents filed by the authors on July 1, 2015 were provided. Benbrook responded to a series of questions:

Did you or your institution at any time receive payment or services from a third party (government, commercial, private foundation, etc.) for any aspect of the submitted work (including but not limited to grants, data monitoring board, study design, manuscript preparation, statistical analysis, etc.)? 

Answer NO                

Place a check in the appropriate boxes in the table to indicate whether you have financial relationships (regardless of amount of compensation) with entities as described in the instructions. Use one line for each entity; add as many lines as you need by clicking the “Add +” box. You should report relationships that were present during the 36 months prior to publication. 

Answer NO

Just to make sure there was no confusion about the COI request, ICMJE has a summary question:

Are there other relationships or activities that readers could perceive to have influenced, or that give the appearance of potentially influencing, what you wrote in the submitted work? 

Benbrook’s original filing is reproduced below:

[Screenshot: Benbrook’s original ICMJE disclosure form]

But his answers did not appear to be candid. From 2012 until May 2015 — before the publication date of this opinion piece — Benbrook had been an adjunct “research” professor at the Center for Sustaining Agriculture and Natural Resources (CSANR) at Washington State University (WSU). He served as the leader of the CSANR program called Measure to Manage (M2M): Farm and Food Diagnostics for Sustainability and Health. The stated goal of M2M was to “develop, refine, validate, and apply analytical systems quantifying the impacts of farming systems, technology, and policy on food nutritional quality, food safety, agricultural productivity, economic performance along food value chains, and on natural resources and the environment.”

According to this document from the CSANR website — since removed by the university [but available here] — Benbrook’s entire salary and his M2M research program were funded by the organic industry, with no funding support from any independent or university sources.

Benbrook apparently had concluded that these relationships did not merit a mention to the question about his “relationships or activities that readers could perceive to have influenced, or that give the appearance of potentially influencing, what you wrote in the submitted work.”

But that was not the entirety of Benbrook’s conflicts of interest. Soon after the New York Times published additional evidence of undisclosed relationships of both authors. Reporter Eric Lipton used the Washington state Freedom of Information Act (FOIA) to obtain emails from Benbrook, which further demonstrate the close ties to the organic industry for both writers. Other emails obtained by a separate FOIA request reveal an even more elaborate web of influence and conflicts.

FOIA’d documents show that Benbrook received more than $128,000 in 2013 from Washington State University, with all the funding coming from industry sources.

There are a number of other curious features of Benbrook’s claim to have “nothing to disclose”. First, emails suggest that no later than May 28, 2015, Benbrook had separated from his position at Washington State University (NYTimes emails, page 49). (Further revelations indicate Benbrook had been severed from his position on May 15.)

So it is not clear if Benbrook was still affiliated with WSU, as the NEJM article represents, when the opinion piece was initially submitted; but he was well aware that he was no longer at WSU during the evaluation process and before publication. There is no indication he took any action to correct what ended up being an erroneous claim of employment.

There were also more details disclosed about the funding sources for his M2M program. All the funders who provided Benbrook’s salary are key players in the organic industry. The “Now Task” email sent on Sept 4, 2014 includes an attachment with funding details of this program (Funder_2012-2013_Update.docx). Funding sources include a number of organic industry players, including Stonyfield, Whole Foods, Organic Valley, Clif Bar, The Organic Center (TOC), Organic Trade Association (OTA), and Chipotle, among others. Much of this falls within the time period specified in the disclosure forms: “You should report relationships that were present during the 36 months prior to publication”. [emphasis ICMJE]

Further, this document describes some other relationships relevant to this arena but not disclosed. Benbrook had run Benbrook Consulting Services for more than 20 years, serving on a board certifying “ecolabels” and another for non-GMO products, and acting as an advisor to Whole Foods Market — an outspoken critic of the use of glyphosate (Whole Foods includes glyphosate on its Prohibited and Restricted pesticide list). Benbrook also did not disclose another business relationship that he maintained with the Pesticide Data Center, selling data, advice, and services associated with this topic.

If these boards and consultancies were pharmaceutical relationships linked to a researcher, certainly NEJM would find them valid conflicts.

There are other relationships that were not disclosed by either Benbrook or Landrigan. Here we see that Gary Hirshberg of Stonyfield Farms, a producer of organic foods and an active funder of mandatory GMO labeling campaigns, coordinated travel and content with both Landrigan and Benbrook on May 26, 2015 for a July 8 Washington meeting with Wal-Mart, coinciding with debate over a House vote on labeling. The agenda: a discussion titled “Conclusions, Proposals, and Next Steps — Why a Mandatory GMO Labeling Policy can contribute to positive change and increased consumer confidence.” [NYT collection, pages 7-9]

Shown here is a sample of the email documents:

The DC event appeared to be an attempt, in part orchestrated by Benbrook, to influence Wal-Mart’s future purchasing policies in favor of organic producers. In addition to the Wal-Mart event, Agri-Pulse reported on a Washington DC breakfast the same day [NYTimes collection, page 14]:

This evidence of direct and coincident involvement with this industry and its lobbying was not disclosed by either Benbrook or Landrigan, as evidenced by the forms they submitted. And yet it would seem to violate the policy guidelines of the ICMJE, which include:

When authors submit a manuscript of any type or format they are responsible for disclosing all financial and personal relationships that might bias or be seen to bias their work.

After the Times emails were posted, and after NEJM had been approached by numerous people, Benbrook did submit a revised answer to one question – whether he had any relationships or activities that readers could perceive as a conflict.

The only additional disclosures: he was a member of a USDA crop biotech advisory committee and a Principal in Benbrook Consulting. He did not disclose the two clients most relevant to his opinion: Whole Foods, which has an anti-glyphosate policy, and the Pesticide Data Center. In fact there is no acknowledgement of any of his extensive connections to the organic industry. As Benbrook’s CV, on display at the Pesticide Data Center, shows, he is also a consultant to The Organic Center, a research and lobbying arm of the Organic Trade Association — a direct and unmentioned conflict.

When the NEJM editorial team was approached with this evidence, Steven Morrissey PhD, managing editor of NEJM, declined to evaluate this public evidence. ICMJE responded that they “do not, however, have any ability to investigate or authority over its use by or the practices of other journals”, so paths for how to address this with normal channels remain unclear.

What if researchers and physicians associated with a statin drug were taking trips to Washington with a lobbying group, attempting to influence major pharmacy chains’ purchasing policies and FDA treatment recommendations? Would we expect this to be disclosed by authors in the NEJM? Of course. The science should still be evaluated on its merits, but disclosure of the relationship would be required. This situation is no different. Here Landrigan and Benbrook failed to disclose their relationships. And NEJM should insist on corrections to their disclosure forms, so that physicians, journalists, and general-public readers understand this influence.

If the editors of the prestigious New England Journal of Medicine intend to be a credible voice amid the noise of nonsense on many medical and public policy issues, they need to adhere to the principles of scientific evidence and disclosure on all fronts. It’s difficult enough to find credible information in these days of dubious scientific publishers and free-range social media. Undermining confidence in reputable scientific publications could only make things worse.

Mary Mangan, Ph.D., received her education in microbiology, immunology, plant cell biology, and mammalian cell, developmental, and molecular biology. She co-founded OpenHelix, a company providing training on open source software associated with the burgeoning genomics arena, over a decade ago. All comments here are her own, and do not represent her company or any other company. You can contact Mary via twitter: @mem_somerville

China Shows How to Build Nuclear Reactors Fast and Cheap — Plus Serious Advanced Reactor R&D on FHR & MSR


Map credit: Forbes

China’s 13th Five-Year Plan (2016-2020) is still in the early planning stage, but @JimConca has just posted an outline of its ambitious nuclear plans at Forbes. Jim sees 350 GW and “over a trillion dollars in nuclear investment” by 2050. Near term, China plans to build seven reactors per year, achieving 150 GW of total nuclear generating capacity by 2030. Jim concludes that China seems to be commissioning new nuclear plants for roughly 1/3 of US costs.

It seems as though 5 years and about $2 billion per reactor has become routine for China. If that can be maintained, then China will be well-positioned as the world’s nuclear energy leader about the time their middle class swells to over one billion.
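A quick implied-cost sketch makes that “1/3 of US costs” claim concrete. The $2 billion figure is from the article; the ~1.1 GW unit size is my assumption for a typical large PWR:

```python
# Implied overnight cost per kilowatt for China's recent builds.
# "$2 billion per reactor" is from the article; the ~1.1 GW unit
# size is my assumption (typical large PWR), not the article's.

cost_per_reactor_usd = 2e9
assumed_capacity_kw = 1.1e6  # ~1.1 GW per reactor (assumption)

cost_per_kw = cost_per_reactor_usd / assumed_capacity_kw
print(f"~${cost_per_kw:,.0f} per kW")

# At "roughly 1/3 of US costs", the implied US figure:
print(f"~${3 * cost_per_kw:,.0f} per kW for a comparable US build")
```

Under those assumptions China lands near $1,800/kW, which is indeed about a third of the figures typically quoted for recent Western projects.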

That’s the PWR deployment story. Globally, some of the most serious advanced reactor development is being undertaken by the Chinese Academy of Sciences (CAS) in collaboration with US national labs and universities — working on the solid-fueled, salt-cooled FHR (PB-AHTR), plus ORNL for their experience with the MSR. Here’s a summary of the collaboration from my post Nuclear City: it’s happening in Shanghai and Berkeley. The Chinese program is seriously ambitious, as you can see from their aggressive schedule and USD $400 million funding:

From Mark’s reports I learned that one of the presentations was by a key figure, Xu Hongjie of the Chinese Academy of Sciences (CAS) in Shanghai. Xu is the director of what China dubs the “Thorium Molten Salt Reactor” (TMSR) project. One of his slides is shown above, presenting an overview of the TMSR priorities (left side) and the timelines. Happily, the Chinese are also focused on the process-heat applications of the PB-AHTR (hydrogen to methanol, etc.) and the huge benefits to a water-impoverished region like China. The Chinese are demonstrating systems-thinking at scale.

There are two Chinese MSR programs:

  • TMSR-SF, or solid fuel, which looks to me very similar to Per Peterson’s PB-AHTR program at UC Berkeley;
  • TMSR-LF, or liquid fuel, which I gather is similar to the popular LFTR concept.

Both designs are derivatives of the Weinberg-driven Oak Ridge (ORNL) molten salt reactor program (cancelled by politicians in the early 1970s). I understand the PB-AHTR to be the most ready for early deployment, which will lay critical foundations for the liquid-fuel TMSR-LF (LFTR) implementation a decade or so later. UC Berkeley’s Catalyst magazine has a very accessible summary of the PB-AHTR program.

Mark Halper reported from the Geneva Thorium Energy Conference.

I proposed a few days ago a China – OECD cooperation to fast-track deployment of nuclear instead of coal. Fortunately, the Chinese and several of the US labs and universities seem to have figured this out without my help :-) This is probably all detailed somewhere online, but I’ve not been able to find it so far. These are the parties to the China – US cooperation:

  • Chinese Academy of Sciences (CAS) in Shanghai
  • Oak Ridge National Laboratory (ORNL)
  • University of California Berkeley
  • University of Washington

The United States could be leading global nuclear deployment. But so long as the Big Greens are running the show, that won’t happen. The good news is that once the love affair with solar/wind/gas collides with reality, the US can get in line for low-cost, advanced Chinese nuclear technology.

Do wild animals avoid GMO corn? Join the experiment!

This is exciting – a group of serious scientists have launched a crowd-sourced experiment to test the hypothesis that wild animals such as squirrels and deer prefer non-GMO corn, and avoid GMO corn. It's exciting because you can participate – it's easy.

Anecdotal reports suggest that animals avoid eating genetically engineered or GMO corn when given a choice, while others suggest that animals have no preference. With the right materials, this is an easy experiment to do, but there are no peer-reviewed, published scientific studies to answer this question – yet.

In this experiment, we will send ears of GMO and non-GMO corn to volunteers. Adults and children, individuals and classrooms can be part of the first Citizen Science experiment to test claims about GMOs. Everyone’s results will be combined in a peer-reviewed scientific journal article.

I just listened to the Talking Biotech podcast #20 on the corn experiment. Kevin Folta and Karl Haro von Mogel do a deep dive into the design of the experiment. If you donate $25 or more you will receive your own kit. And you can put your school on the waiting list for a free experiment.

Donate, contribute a bit of your time and you can be part of a real science project. You will learn how an experiment like this has to be designed so that the results will survive peer-review. And if the hypothesis is supported by the data I think the resulting peer-reviewed paper might make the cover of Science! All the supporters will be listed as contributors in the paper.
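To give a flavor of the statistics involved, here is one way crowd-sourced choice data like this might be analyzed (a sketch with invented numbers and a simplified design of my own; the project’s actual protocol and analysis will differ). Treat each volunteer site as a paired choice and apply an exact sign test against a 50/50 null of no preference:

```python
# Hypothetical analysis sketch: suppose each volunteer site reports
# which ear (GMO or non-GMO) lost more kernels to animals. Under the
# null hypothesis of no preference each site is a fair coin flip, so
# a two-sided exact binomial (sign) test applies. All numbers invented.
from math import comb

def sign_test_p(n_sites: int, n_prefer_non_gmo: int) -> float:
    """Two-sided exact binomial p-value against a 50/50 null."""
    k = n_prefer_non_gmo
    extreme = min(k, n_sites - k)  # the less-likely tail boundary
    tail = sum(comb(n_sites, i) for i in range(extreme + 1))
    other_tail = sum(comb(n_sites, i) for i in range(n_sites - extreme, n_sites + 1))
    return min(1.0, (tail + other_tail) / 2 ** n_sites)

# e.g. 100 sites split 50/50: no evidence of preference (p = 1)
print(sign_test_p(100, 50))
# e.g. 8 of 10 sites preferring non-GMO is NOT significant (p ≈ 0.11),
# which is why the project needs many sites to detect a real effect
print(sign_test_p(10, 8))
```

This is also why the experiment’s design matters so much: without enough independent sites, even a genuine preference would be indistinguishable from animal-by-animal noise.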


California’s Energy Future: 2013 Travers Conference UC Berkeley

Recently I was searching for the most up-to-date presentation of the ongoing research study “California’s Energy Future – The View to 2050”. This study was funded by the California Council on Science and Technology (CCST) and staffed by about forty energy experts. The original report was published in May 2011 (Summary Report [PDF]). This CCST study is one of the few examinations of regional decarbonization that “adds up” in the David MacKay sense. For an introduction to this systematic study I recommend chairperson Jane Long’s 2013 keynote [YouTube], presented at the Travers Conference at UC Berkeley. Her talk is about 40 minutes – a clear presentation of the reality that we know how to do only about half of what’s required to achieve California’s S-3-05 goal of an 80% reduction of CO2 below 1990 levels by 2050. Jane’s slide deck is itself a valuable resource for explaining energy realities to others. The announcement of the 2013 Travers Conference includes the following hint that California isn’t going to get where it says it is going.

The state of California has embraced an ambitious goal of meeting its future energy needs while increasing its use of renewable energy. But a recent Little Hoover Commission report finds that the state has failed to develop a comprehensive energy strategy that confronts the difficult tradeoffs it faces. The 16th Annual Travers Conference on Ethics & Accountability in Government will investigate the tradeoffs represented by reliance on different energy sources, including oil, natural gas, nuclear energy, biofuels, and wind and solar power.

The fact that nuclear physicist, former director of SLAC and Nobel laureate Burton Richter was selected as one of the six lead authors indicates to me that CCST assembled a team of serious people. You can assess for yourself in Dr. Richter’s July 2011 summary presented at the release event “CCST Report on Nuclear Power in California’s 2050 Energy Mix”. The presentation begins with this:

Report Highlights

The report assumes 67% of California’s electricity will come from nuclear while the rest is renewables as called for in AB-32. This would require 44 Gigawatts of nuclear capacity or about 30 large reactors. While reactor technology is certain to evolve over the period of interest, we assumed that they will be similar to the new generation of large, advanced, light-water reactors (LWR), known as GEN III+ that are now under review by the U.S. Nuclear Regulatory Commission. This allows us to say something about costs since these are under construction in Asia and Europe, and a larger number of similar systems have been built in Asia recently. Our main conclusions on technical issues are as follows:

  • While there are no technical barriers to large-scale deployment of nuclear power in California, there are legislative and public acceptance barriers that have to be overcome to deploy new nuclear reactors.
  • The cost of electricity from new nuclear power plants is uncertain in the United States because no new ones have been built in decades. Our conclusion is that six to eight cents per kWh is the best estimate today.
  • Loan guarantees for nuclear power will be required until the financial sector is convinced that the days of large delays and construction cost overruns are over. Continuation of the Price-Anderson act is assumed.
  • Nuclear electricity costs will be much lower than solar for some time. There is insufficient information on wind costs yet to allow a comparison, particularly when costs to back up wind power are included.
  • Cooling water availability in California is not a problem. Reactors can be cooled with reclaimed water or with forced air, though air cooling is less efficient and would increase nuclear electricity prices by 5% to 10%.
  • There should be no problem with uranium availability for the foreseeable future and even large increases in uranium costs have only a small effect on nuclear power costs.
  • While there are manufacturing bottlenecks now, these should disappear over the next 10 to 15 years if nuclear power facilities world-wide grow as expected.
  • There are benefits to the localities where nuclear plants are sited. Property taxes would amount to $50 million per year per gigawatt of electrical capacity (GWe) in addition to about 500 permanent jobs.
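The report’s headline numbers are easy to sanity-check. A minimal sketch, using the figures quoted above; the ~1.5 GWe size for a “large” Gen III+ reactor is my assumption, not a number from the report:

```python
# Sanity-check the CCST report's headline numbers.
# Assumption: a "large" Gen III+ reactor is roughly 1.5 GWe.

nuclear_capacity_gw = 44            # capacity the report says is required
reactor_size_gwe = 1.5              # assumed size of one large reactor
reactors_needed = nuclear_capacity_gw / reactor_size_gwe
print(f"Reactors needed: {reactors_needed:.0f}")   # ~29, i.e. "about 30 large reactors"

# Local benefits quoted in the report: $50M/year property tax and ~500 jobs per GWe
property_tax_total = 50e6 * nuclear_capacity_gw    # $/year statewide
permanent_jobs = 500 * nuclear_capacity_gw
print(f"Statewide property taxes: ${property_tax_total / 1e9:.1f}B per year")
print(f"Permanent jobs: {permanent_jobs:,.0f}")
```

The arithmetic checks out: 44 GW at ~1.5 GWe per plant is indeed “about 30 large reactors,” and the quoted local benefits would scale to roughly $2.2 billion per year in property taxes and about 22,000 permanent jobs statewide.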

The full report discusses all these issues in more detail, including weapons proliferation in a world with many more nuclear plants, spent fuel issues, and future options (including fusion).

Dr. Richter ends with this:

In Summary: There are no barriers to nuclear expansion in California except legislative and public acceptance ones. The lessons of Fukushima are still being learned and will result in some new regulations. The repository problem is entirely political rather than technical.


Avoiding carbon lock-in

The Stockholm Environment Institute recently published their research on the dynamics of “carbon lock-in” (thanks to Prof. David MacKay, @DavidJCMacKay, for the reference):

the tendency for certain carbon-intensive technological systems to persist over time, “locking out” lower-carbon alternatives, due to a combination of linked technical, economic, and institutional factors.

There is a lot of information packed into the SEI graphic above — where is the “low hanging fruit”?

The concept of “lock-in” is typically discussed in the context of long-lived capital assets. E.g., the owners of Germany’s Moorburg coal power plant, which came online in March 2015, will want to operate the plant through its planned financial lifetime. Building new coal plants is an incredibly bad choice when there are economic alternatives, and Germany did something even worse than build one bad plant. Germany’s ideology aside, the implications of lock-in are more complicated.

Q: Does carbon lock-in affect the prospects for carbon pricing?

SK: Yes, the fundamental concern is that carbon lock-in is self-reinforcing. The more we invest in long-lived high-carbon assets, the more powerful the political interests that benefit from them, and the greater the resistance to a low-carbon transition. The flip side is also true: the more we adopt measures that encourage investment in renewables, the more momentum will build toward a transition. It will create constituencies (such as employees and investors), expand networks (e.g. denser supply chains), and affect the market (e.g. building consumer familiarity). This is why we’ve looked at the institutional dimension of lock-in.

So, every new coal plant strengthens the political power that will protect the whole infrastructure of coal-fired generation. The same principle applies to every new wind farm.

When MIT convenes the “Future of Energy Conference” in 2100, I believe there will be broad agreement that the rich countries made a huge mistake by overinvesting in the currently fashionable variable renewable energy (VRE). The trillions of dollars invested in VRE were not available for building efficient nuclear fission plants. Based on experience so far, very little coal generation is displaced by VRE. If a large enough carbon price is implemented then coal will be substituted, but then the main benefit of the VRE investment is reduced fuel costs for the gas plants required as backup. Moreover, all that VRE investment created politically powerful new interest groups that benefit from:

  • building and maintaining more and more solar and wind;
  • building vast new transmission networks to move electricity from remote areas to the cities; and
  • decommissioning and replacing these short-lived generators.

It will be interesting to see how many times the public will support replacing the wind farms and fields of solar panels as the machines built by those huge investments wear out every 25 to 30 years.
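The replacement-cycle arithmetic behind that question is simple. A sketch, with assumed illustrative lifetimes: ~25 years for wind/solar as in the text, and ~60 years for an LWR with license extension (my assumption, not from the source):

```python
# How many times must each generating fleet be completely rebuilt before 2100?
# Asset lifetimes are illustrative assumptions.

def rebuilds_needed(start_year, horizon_year, asset_life_years):
    """Complete replacement cycles between start and horizon."""
    return (horizon_year - start_year) // asset_life_years

vre_rebuilds = rebuilds_needed(2015, 2100, 25)      # wind/solar at ~25-year life
nuclear_rebuilds = rebuilds_needed(2015, 2100, 60)  # LWR with license extension
print(f"VRE fleet rebuilds by 2100: {vre_rebuilds}")        # 3
print(f"Nuclear fleet rebuilds by 2100: {nuclear_rebuilds}")  # 1
```

Under these assumptions, a VRE fleet built today must be paid for roughly three more times before that hypothetical 2100 conference, versus once for a nuclear fleet.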

Nuclear load following

Nuclear generation is sometimes misunderstood as “only baseload capable” and therefore incompatible with wind and solar because of their erratic generation profiles. This is not true. It is true that where there is a large baseload demand, the economics favor nuclear plants optimized to run 24/7/365. As with any productive asset carrying a high capital cost, the owner prefers high utilization to earn the highest return on the investment. This is one of the essential reasons that wind and solar will always be expensive: every hour they are not generating at rated capacity, their high capital investment is not earning a return.
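A minimal sketch of why utilization drives cost: spread an annualized capital charge over the kWh actually generated. The overnight cost, fixed charge rate, and capacity factors below are illustrative assumptions, not figures from any report cited here:

```python
# Illustrative only: how capacity factor drives the capital component of
# electricity cost. All numbers are assumptions for illustration.

def capital_cost_per_kwh(overnight_cost_per_kw, fixed_charge_rate, capacity_factor):
    """Annualized capital cost spread over the kWh actually generated."""
    annual_cost = overnight_cost_per_kw * fixed_charge_rate  # $/kW-year
    annual_kwh_per_kw = 8760 * capacity_factor               # kWh per kW of capacity
    return annual_cost / annual_kwh_per_kw                   # $/kWh

# The same hypothetical $5,000/kW plant at two utilization levels:
baseload = capital_cost_per_kwh(5000, 0.08, 0.90)   # running nearly flat-out
curtailed = capital_cost_per_kwh(5000, 0.08, 0.30)  # idle two-thirds of the time
print(f"90% CF: {baseload * 100:.1f} c/kWh; 30% CF: {curtailed * 100:.1f} c/kWh")
```

The capital component scales inversely with capacity factor, so the plant running at 30% pays exactly three times as much per kWh for the same steel and concrete — which is why low-utilization generators are expensive regardless of fuel cost.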

The engineering design of nuclear plants covers a range of load-response capabilities, from very fast response (think nuclear submarines and warships) to pure baseload. The electric power market has mostly been characterized by baseload customers, so traditional plant designs have been optimized for those economics. That said, even the 1970s designs of the French and German fleets are operated in load-following mode. Here’s the power output time series of Golfech 2, one of the load-following French nuclear plants.

The French electrical grid is sometimes 90%+ nuclear, so obviously nuclear generation has to maneuver to match real-world demand. There is no magical “demand management” that makes the intermittency of wind/solar go away; this is the real world of near-zero-carbon electricity in 2015. More references on nuclear load following:

IAEA Technical Meeting – Load Following Sept 4-6 2013, Paris (source of the Golfech 2 chart, considerable details on how EDF plants are operated for load following)

Load-following capabilities of NPPs

So far we’ve only discussed 1970s technology, designed and built when the primary market was for pure baseload generation. Tomorrow’s generation market will need to incorporate “renewables” which generate as the sun and weather dictate. For the zero-carbon future we can balance the intermittent renewables with storage or nuclear. If everyone were as wealthy as Bill Gates we could use storage. Otherwise we need dispatchable nuclear plants that can respond with high ramp rates to variable renewable energy (VRE). Many of the advanced Gen IV reactors have economical load-following capability inherent in their designs.

The first load-following SMR to be deployed is likely to be NuScale’s design, a creative way to achieve variable output with tried-and-true LWR technology:

10. Can NuScale’s SMR technology be complementary to Renewables?

Yes. NuScale’s SMR technology includes unique capabilities for following electric load requirements as they vary with customer demand and rapid changes experienced with renewable generation sources.
There are three means to change power output from a NuScale facility:
  • Dispatchable modules – taking one or more reactors offline over a period of days
  • Power maneuverability – adjusting reactor power over a period of minutes/hours
  • Turbine bypass – bypassing turbine steam to the condenser over a period of seconds/minutes/hours

NuScale Power is working with industry leaders and potential customers to ensure that these capabilities provide the flexibility required by the evolving electric grid. This capability, called NuFollow™, is unique to NuScale and holds the promise of expanding the deployment of renewables without backup from fossil-fired generating sources, such as natural gas-fired combined-cycle gas turbines (CCGTs).
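The three mechanisms in NuScale’s answer differ mainly in timescale. A hypothetical dispatch sketch — the mechanism names come from the FAQ above, but the timescale thresholds are my illustrative assumptions:

```python
# Pick which NuScale flexibility mechanism fits a required response time.
# Mechanism names are from NuScale's FAQ; thresholds are illustrative assumptions.

def flexibility_mechanism(response_seconds):
    """Map a required response time to the plausible flexibility mechanism."""
    if response_seconds < 60:
        return "turbine bypass"        # seconds: send steam to the condenser
    elif response_seconds < 24 * 3600:
        return "power maneuvering"     # minutes/hours: adjust reactor power
    else:
        return "dispatchable modules"  # days: take whole modules offline

print(flexibility_mechanism(10))         # turbine bypass
print(flexibility_mechanism(2 * 3600))   # power maneuvering
print(flexibility_mechanism(3 * 86400))  # dispatchable modules
```

The design choice to layer three mechanisms at different timescales is what lets a multi-module plant track both slow seasonal demand and the fast swings of wind and solar output.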