Martin Rees: Will technology’s dark side eclipse humanity’s bright future?

Martin Rees 10th anniversary.

In November 2015, Martin Rees gave the Oxford Martin School 10th Anniversary Lecture [here’s the video, here’s the transcript]. The theme of the lecture is that the 21st century is special — let’s make sure we get to the other side intact. The technologies we humans have under development make me think of Stewart Brand’s famous line from the Whole Earth Catalog: “We are as gods and might as well get good at it.” Today Stewart says

“What I’m saying now is we are as gods and have to get good at it.”

We have to get good at our job because our technologies, from fossil fuels to biotech to AI, give us the opportunity to screw it up. So we need to pay very close attention to making our way successfully through the next 100 years. Lord Rees:

Why is the 21st century special? Our planet has existed for 45 million centuries, but this is the first when one species – ours – can determine the biosphere’s fate. New technologies are transforming our lives and society – they’re even changing human beings themselves. And they’re inducing novel vulnerabilities. Ecosystems are being stressed because there are more of us (world population is higher) and we’re all more demanding of resources. We’re deep into what some call the Anthropocene.

And we’ve had one lucky escape already. At any time in the Cold War era, the superpowers could have stumbled towards nuclear Armageddon through muddle and miscalculation. Robert McNamara, US defence secretary at the time of the Cuba crisis, said after he retired that “[w]e came within a hairbreadth of nuclear war without realizing it. It’s no credit to us that we escaped – Khrushchev and Kennedy were lucky as well as wise.”

This is a terrific lecture, applying science-informed optimism to the benefits and risks of some of our most powerful technologies.

For the rest of this talk I’ll focus on a different topic – the promise, and the dark side, of novel technologies that change society and empower individuals – and I’ll venture some speculations about the far future.

We live in a world increasingly dependent on elaborate networks: electric-power grids, air traffic control, international finance, just-in-time delivery, globally-dispersed manufacturing, and so forth. Unless these networks are highly resilient, their benefits could be outweighed by catastrophic (albeit rare) breakdowns — real-world analogues of what happened in 2008 to the financial system. Our cities would be paralysed without electricity. Supermarket shelves would be empty within days if supply chains were disrupted. Air travel can spread a pandemic worldwide within days. And social media can spread panic and rumour literally at the speed of light.

It’s imperative to guard against the downsides of such an interconnected world. Plainly this requires international collaboration. (For instance, whether or not a pandemic gets global grip may hinge on how quickly a Vietnamese poultry farmer can report any strange sickness.)

On pandemics, Oxford Martin colleague Larry Brilliant has taught us how critical it is to invest in “early detection, early response”. Early detection is enabled by the growing power of our networks. Early response is enabled by human and physical infrastructure, and by investing in molecular biology so that we can rapidly analyze detected pathogens, then formulate and manufacture vaccines or antiviral compounds.

One of Martin Rees’s concerns is malign biotech, especially since CRISPR.

Malign or foolhardy individuals have far more leverage than in the past. It is hard to make a clandestine H-bomb. In contrast, biotech involves small-scale dual use equipment. Millions will one day have the capability to misuse it, just as they can misuse cybertech today. Indeed, biohacking is burgeoning even as a hobby and competitive game.

So what do we do about this risk? Regulation is useless for controlling the behavior of the “malign or foolhardy”. In fact we should be very mindful that we do not entangle our best researchers in a net of over-regulation, because our best defense is exactly the rapid detection-and-response capability we build to minimize the impact of natural pandemics.

What about the benefits and risks of advanced AI, specifically artificial general intelligence (AGI)?

The timescale for human-level AI may be decades, or it may be centuries. Be that as it may, it’s but an instant compared to the future horizons, and indeed far shorter than timescales of the Darwinian selection that led to humanity’s emergence.

I think it’s likely that the machines will gain dominance on Earth. This is because there are chemical and metabolic limits to the size and processing power of ‘wet’ organic brains. Maybe we’re close to these already. But no such limits constrain silicon based computers (still less, perhaps, quantum computers): for these, the potential for further development over the next billion years could be as dramatic as the evolution from pre-Cambrian organisms to humans. So, by any definition of ‘thinking’, the amount and intensity that’s done by organic human-type brains will be utterly swamped by the future cerebrations of AI.

Moreover, the Earth’s biosphere isn’t the optimal environment for advanced AI – interplanetary and interstellar space may be the preferred arena where robotic fabricators will have the grandest scope for construction, and where non-biological ‘brains’ may develop powers that humans can’t even imagine.

But we humans shouldn’t feel too humbled. We could be of special cosmic significance for jump-starting the transition to silicon-based (and potentially immortal) entities, spreading their influence far beyond the Earth, and far transcending our limitations.

So, even in this ‘concertinaed’ timeline — extending billions of years into the future, as well as into the past — this century may be a defining moment where humans could jeopardise life’s immense potential. That’s why the avoidance of complete extinction has special resonance for an astronomer.

That’s the rationale for the Future of Humanity Institute, the element of the Martin School that addresses ‘existential’ risks on the science fiction fringe.

Watch or read, and please tell your friends. We really, really need to focus much more energy on long-term thinking.

I almost forgot to mention that Martin Rees is a cofounder of another prestigious risk research institution, the Centre for the Study of Existential Risk at Cambridge.

More on the Oxford Martin School. Lastly, good news: our home star is good for another six billion years. Just imagine what we can accomplish before we are forced to move!

Oxford Martin School


Dr. James Martin founded the School in 2005 with Oxford’s largest ever benefaction. The mission of the Oxford Martin School is to develop practical solutions to the really hard problems.

Martin’s vision was that the Oxford Martin School should be a unique, interdisciplinary research community designed to address the most pressing global challenges and opportunities of the 21st century, using rigorous research to find solutions. This is vital because the problems facing humanity are becoming ever more severe, but so too are its new opportunities. A new methodology was needed for interdisciplinary research and problem-solving, and it came to pervade the Oxford Martin School.

The School now has over 30 institutes and projects concerned with different aspects of the future, from the governance of climate change to the possibilities of quantum physics; from the future of food to the implications of an ageing population; and from new economic thinking to nanotechnology in medicine. Each institute can only function by integrating multiple disciplines, and now the separate institutes are becoming connected. Together, the different issues of the School connect to form an understanding of our future. The School has over 300 postdoctoral scholars and professors, working across the University of Oxford.

The Advisory Council of the School is populated by some of the most thoughtful and influential people I know of. In addition to Martin Rees, members include Nicholas Stern, Larry Brilliant and J. Craig Venter. The faculty is similarly first-rate, including Steve Rayner — one of the principals of the Hartwell Paper [see Kyoto Wrong Trousers: Radically Rethinking Climate Policy]. Steve has also been an important contributor to the Breakthrough Institute, the birthplace of Ecomodernism. See Climate Pragmatism, a revised and updated version of the Hartwell Paper.

The School is also home to the Future of Humanity Institute (FHI), led by Founding Director Prof. Nick Bostrom. Nick is the author of Superintelligence: Paths, Dangers, Strategies, a superb introduction to the challenges of ensuring future AIs are friendly. Nick also directs the Oxford Martin Programme on the Impacts of Future Technology.

Michael Douglas narrates James Martin’s one hour documentary The Meaning Of The 21st Century based on the book of the same title. The film synopsis page says “THIS FILM IS ABOUT THE MOST VITALLY IMPORTANT SUBJECT OF OUR TIME”.

Lord Martin Rees in conversation at The Wellcome Trust


It’s dependably fun and illuminating to see Martin Rees unconstrained by the political “don’t go theres”. So, have some fun with this Lord Martin Rees in conversation at The Wellcome Trust [13 June 2014, 93 minutes]. Skip the first 10 minutes of formalities. Then, as the interview begins with ‘where it all started’, Martin explains the basic principles he took from his grad school experience in the 1960s:

If you go into an established field you can only do something new by doing something that the old guys got stuck on.

Whereas if you go to a field where new things are happening, then the experience of the old guys is at a heavy discount.

Max Planck’s longer quote can be paraphrased as “Science advances one funeral at a time.” I had to endure only part of that experience, as my advisor Woody Bledsoe would try anything promising. But my mathematics chair was a classical guy who insisted that thesis exams concentrate on partial differential equations. Very relevant to planning the Juno rendezvous with Jupiter, not so helpful in computer science. Here’s the challenge: how do we develop young scientists without trapping them inside the boundaries of the department hierarchy?

Enjoy!

David MacKay “Perhaps my last post – we’ll see”

David MacKay

Prof. David MacKay has done more than any other human to improve our understanding of practical energy policy. His famous book Sustainable Energy Without the Hot Air is on the bookshelf of everyone who is seriously interested in making the future better. 

Yesterday David wrote:

I noticed that the posts of a friend who died of cancer trickled away to a non-conclusion, and this seems an inevitable difficulty, that the final post won’t ever get writ.

I’d like my posts to have an ending, so I’m going to make this my final one – maybe.

Ever the scientist, he has been documenting his experience as a cancer patient. For example Bye-bye Chemotherapy, Hello TP53! explains how he and his oncologist discuss prospects and options. I hope that David recovers so well that he can write a new book – a scientist’s perspective on how he became a former cancer patient.

What do the aliens think about AGW politics?

Alien and Gort: You know Fission but are still building windmills?

Two years ago I was puzzling over the other-worldliness of US politics on AGW, in a post that includes references to some better-informed researchers. It would be instructive to hear what the more-advanced aliens think about the earthly goings-on. I envision the alien walking down the ramp of his starship, looking out over California’s once beautiful landscape, now covered with the litter of tens of thousands of diffuse-energy contraptions:

You’ve mastered the science of fission, and you are still building windmills?

Public Views on a Carbon Tax Depend on the Proposed Use of Revenue


This is excerpted from a 2014 University of Michigan report from the National Surveys on Energy and Environment.

Conventional wisdom holds that a carbon tax is a political non-starter. However, results from the latest version of the National Surveys on Energy and Environment (NSEE) provide evidence of substantial public support for a tax on the carbon content of different fossil fuels when specific uses of tax revenue are attached. A majority of respondents support a revenue-neutral carbon tax, and an even larger majority support a carbon tax with revenues used to fund research and development for renewable energy programs. The carbon tax coupled with renewable energy research earns majority support across all political categories, including a narrow majority of Republicans. These findings generally confirm previous NSEE results when revenue use options are linked to carbon taxation. These are among the latest findings from the Spring 2014 NSEE from the Center for Local, State, and Urban Policy at the University of Michigan and the Muhlenberg College Institute of Public Opinion.

Key Findings

  1. Most Americans oppose a carbon tax when the use of tax revenue is left unspecified. Overall support for such a tax is 34% in the latest NSEE survey. Attaching a specific cost to the carbon tax reduces overall support to 29%.
  2. A revenue-neutral carbon tax, in which all tax revenue would be returned to the public as a rebate check, receives 56% support. The largest gains in support come from Republicans.
  3. A carbon tax with revenues used to fund research and development for renewable energy programs receives 60% support, the highest among tax options that we presented. Majorities of Democrats, Republicans, and Independents each express support for this tax.
  4. Most respondents oppose a carbon tax with revenues used to reduce the federal budget deficit. Overall support for such a tax is 38% with a majority of Democrats, Republicans, and Independents each expressing opposition to this tax.
  5. When asked which use of revenue they prefer if a carbon tax were enacted, pluralities of Democrats, Republicans, and Independents each prefer renewable energy over tax rebate checks or deficit reduction.

My read of this and similar polling is that the US could pass a revenue-neutral carbon tax if it is well-crafted. What will get conservatives into the boat is ring-fencing the revenues so the money doesn’t get gobbled up in the general fund. Just for discussion, say 80% of the revenue is earmarked for rebates, like Dr. James Hansen’s Fee and Dividend. The balance of 20% funds a public-private partnership for innovation in technology-neutral clean energy and climate adaptation. I would like to include geo-engineering research in the mission, but I suspect that’s too emotionally explosive.
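To make the arithmetic concrete, here’s a minimal sketch of that split. Every number in it (the tax rate, the covered emissions, the household count) is a made-up discussion figure, not a proposal:

```python
# Hypothetical fee-and-dividend split -- illustration only, not a policy proposal.
CARBON_TAX_PER_TON = 40.0         # USD per ton CO2 (assumed for discussion)
COVERED_EMISSIONS = 5.0e9         # tons CO2/year covered by the tax (assumed)
DIVIDEND_SHARE = 0.80             # share rebated to households
RND_SHARE = 1.0 - DIVIDEND_SHARE  # share ring-fenced for clean-energy R&D
US_HOUSEHOLDS = 1.25e8            # approximate number of US households

revenue = CARBON_TAX_PER_TON * COVERED_EMISSIONS
dividend_pool = DIVIDEND_SHARE * revenue
rnd_pool = RND_SHARE * revenue

print(f"Total revenue:    ${revenue / 1e9:,.0f}B/year")
print(f"Dividend pool:    ${dividend_pool / 1e9:,.0f}B/year "
      f"(~${dividend_pool / US_HOUSEHOLDS:,.0f} per household)")
print(f"R&D/adaptation:   ${rnd_pool / 1e9:,.0f}B/year")
```

Even with these placeholder inputs, the point stands: the 20% slice is tens of billions per year, a very large innovation fund by federal R&D standards.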

Designing a mechanism for allocating the R&D fund is also a real challenge. To recommend the budgeting and allocation design, how about a reprise of the President’s Blue Ribbon Commission? I even have a candidate dream team for you, with Dr. Jane C. S. Long to chair. Jane led the California Council on Science and Technology team that produced California’s Energy Future — The View to 2050. Even better would be to make a partnership of Dr. Long’s CCST team and the Energy Research Partnership team.

A US unilateral carbon tax is a big step forward, but alone it won’t get the job done. We need buy-in from enough of the developing nations to attenuate the principal source of future emissions growth. I am convinced there is a solution to that challenge: structure this carbon tax as the US component of a Common Commitment, as proposed by the team of David J. C. MacKay, Peter Cramton, Axel Ockenfels and Steven Stoft in How to negotiate a climate agreement that will actually work. I don’t know what the numbers would be, but the carbon tax revenue splits would have to be adjusted to support the US share of Green Fund payments. What would be the politics of American support for a Common Commitment deal?

Brad Plumer on Nuclear Learnings from France & South Korea

Brad’s excellent essay on Vox interprets the recent Energy Policy paper by Jessica Lovering, Arthur Yip, and Ted Nordhaus. I have really just one quibble with Brad’s “Four broad lessons” summary where he wrote:

1) Stable regulations are essential for nuclear power to thrive. More than, say, solar or wind, nuclear will always need strict safety and environmental regulations. No way around that. The risks are inherently higher.

That’s an example of the “see, I’m not pro-nuclear” positioning that we often notice even in informed commentary on how nuclear power fits into the menu of low-carbon options. The relative risks of nuclear power are not inherently higher! Chapter 24, Nuclear?, of David MacKay’s marvelous Sustainable Energy Without the Hot Air examines nuclear power. From that chapter I extracted the following graphic: David’s computation of deaths per GWy (gigawatt-year), which he drew from two of the studies we’ve previously referenced, the EU ExternE research and the Paul Scherrer Institute.

David MacKay relative risks of energy options

  • Our goal is to substitute low-carbon for fossil, especially coal, and especially in the developing fast-growing nations.
  • Amongst the low-carbon options, nuclear has proven to be the safest and really the only scaleable option that can displace coal and natural gas.
  • Nobody is proposing to build more of the unsafe Chernobyl-style RBMK reactors. Yet Chernobyl deaths dominate the tiny comparative death-print statistics of our generation options. Take away Chernobyl and commercial nuclear’s death-print is effectively zero.
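For readers who want to see how a death-print is computed, the normalization MacKay uses is simply fatalities divided by energy delivered, in deaths per gigawatt-year. The inputs in this sketch are order-of-magnitude placeholders I made up to show the arithmetic, not the actual ExternE or Paul Scherrer Institute values:

```python
# Death-print = fatalities per gigawatt-year (GWy) of energy delivered.
# The inputs below are illustrative placeholders, NOT the exact
# ExternE / Paul Scherrer Institute values in MacKay's chart.
def deaths_per_gwy(deaths: float, twh_delivered: float) -> float:
    gwy = twh_delivered / 8.766   # 1 GWy = 8766 GWh = 8.766 TWh
    return deaths / gwy

# e.g. a hypothetical coal fleet: 25,000 deaths over 10,000 TWh delivered
print(f"{deaths_per_gwy(25_000, 10_000):.1f} deaths per GWy")  # ~21.9
```

The same normalization applied to commercial nuclear’s tiny fatality count over its enormous cumulative output is what drives its death-print toward zero.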

When I first studied the relative risks of our energy options I quickly realized that my fears of nuclear catastrophe were based entirely on media mythology. The media don’t report on the thousands of people killed by fossil fuels every year. Even major accidents like the San Bruno gas pipeline explosion are not widely reported or investigated (this 2010 accident was in a suburb of San Francisco: 8 fatalities, 52 injured). Fossil energy causes real people to die every year – real deaths versus theoretical nuclear deaths.

San Bruno gas pipeline explosion

We have a civilizational choice to make: whether we organize political support to scale up construction of advanced nuclear plants that are both economical and orders of magnitude safer than the existing, already safe Gen III plants. If we fail to do that we are going to squander our wealth on the renewables dream, only to find ourselves blocked by the economics when we are only halfway to our goal of zero-emissions energy.

SpaceX demonstrates a reusable first stage

My guess is this will prove to be a historic milestone – from the perspective of future historians studying how humans evolved to be a multi-planetary species. I’ll stick my neck out to speculate that in 10 years major component reuse will be the norm.

I love good engineering! Yes, I know that landing the first stage on 1 out of 3 tries isn’t good enough. But isn’t it remarkable that SpaceX succeeded in only three attempts? How many more launches before they are achieving 80% success? That’s most of the cost-savings right there. Sure, 100% will be nice someday.
 
Think what it would cost to fly to the South Pacific if Air New Zealand threw the Dreamliner away at the end of each flight:-)
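A toy expected-cost model shows why 80% recovery captures most of the savings. All the dollar figures and cost shares here are assumptions I picked for illustration; SpaceX does not publish its actual costs:

```python
# Toy model of expendable vs. reusable first-stage economics.
# Every number here is an assumption for illustration; real figures aren't public.
STAGE_COST = 35e6    # cost to build a new first stage (USD, assumed)
REFURB_COST = 5e6    # cost to refurbish a recovered stage (USD, assumed)
OTHER_COST = 20e6    # upper stage, fairing, ops per flight (USD, assumed)

def cost_per_flight(p_recover: float) -> float:
    """Expected cost per flight if a fraction p_recover of first stages land,
    and each recovered stage replaces one new build."""
    expected_stage = (1 - p_recover) * STAGE_COST + p_recover * REFURB_COST
    return expected_stage + OTHER_COST

for p in (0.0, 0.33, 0.80, 1.00):
    print(f"recovery {p:>4.0%}: ${cost_per_flight(p) / 1e6:.1f}M per flight")
```

In this toy model, going from 0% to 80% recovery delivers about four-fifths of the maximum possible savings, which is the point: perfection isn’t required.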

WHO’s first global report on antibiotic resistance reveals serious, worldwide threat to public health

The nightmare bugs are multiplying because our antimicrobial team has no real leadership and has shockingly inadequate funding. We don’t have much data on what is really happening, but my guess is the deaths-from-resistant-microbes curve is increasing at an increasing rate. Every year more patients discover that the post-antibiotic world has already arrived for them.

A new report by WHO – its first to look at antimicrobial resistance, including antibiotic resistance, globally – reveals that this serious threat is no longer a prediction for the future; it is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country. Antibiotic resistance – when bacteria change so antibiotics no longer work in people who need them to treat infections – is now a major threat to public health.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” says Dr Keiji Fukuda, WHO’s Assistant Director-General for Health Security. “Effective antibiotics have been one of the pillars allowing us to live longer, live healthier, and benefit from modern medicine. Unless we take significant actions to improve efforts to prevent infections and also change how we produce, prescribe and use antibiotics, the world will lose more and more of these global public health goods and the implications will be devastating.” 

Globally we are falling further behind: every year more resistant bacteria are discovered, and more people die. The most recent data I have shows 2 million U.S. cases of antibiotic-resistant infection annually, resulting in 23,000 deaths. We know the actuals are higher because there is no requirement for hospitals to report cases or even outbreaks of resistance. Here is an example from the transcript of the PBS special “Hunting the Nightmare Bacteria”:

Nationally, most hospitals aren’t required to report outbreaks to the government, and most won’t talk publicly about them. (…snip…)

Dr. BRAD SPELLBERG: It’s not that the government agencies are not aware of the problem and are not— and are not doing anything. It’s that we have not had a comprehensive plan for how to deal with antibiotic resistance. We don’t have reporting mechanisms, like they do in Europe, to know where resistance is occurring, who’s using the antibiotics, are we overusing them?

DAVID E. HOFFMAN: Wait. You’re telling me we don’t know the answers to the extent of the problem?

Dr. BRAD SPELLBERG: That’s correct.

DAVID E. HOFFMAN: We don’t have the data?

Dr. BRAD SPELLBERG: That is correct. I do not know how many resistant infections are occurring right now. I don’t know what the frequency of resistance in different bacteria are. We do not have those data.

NARRATOR: FRONTLINE requested an interview with the secretary of Health and Human Services, Kathleen Sebelius. We wanted to ask about the lack of data and about the priority the department is giving to the new superbug crisis. But she declined to be interviewed.

The “nightmare bacteria” have caught governments and public health authorities napping. They didn’t seem to notice that over the past twenty years the development of new antibiotics has collapsed. From the 2013 report by the CDC Antibiotic Resistance Threats this graphic illustrates that there is now almost no new antibiotic development.

Antibiotic development collapse

The fundamental reason for the collapse in new antibiotics is that the pharma marketplace doesn’t reward developers enough to pay for the R&D and the drug approval process (USD $600 million to $1 billion for a new drug). Those numbers inhibit every kind of drug, but let through those that sell to the chronic-patient markets (cholesterol, hypertension, …). A successful new antibiotic may be sold to a patient for 10 days, not 30 years like a hypertension drug. And sadly there is nearly no high-level focus on this antimicrobial market failure.

Resistance is an everyday process – microbes begin exhibiting resistance as soon as a new compound is deployed. There was already penicillin resistance when the drug was first commercially introduced. This issue didn’t start making headlines until the beginning of the 21st century because there were still a lot of drugs in the cabinet that could be tried when a new resistant bug surfaced. Today, for an increasing number of infectious diseases, the antibiotic cabinet has fewer effective drugs every year. From the CDC Antibiotic Resistance Threats report, this graphic illustrates key resistance events:

Timeline of antibiotic resistance

The PBS Frontline special is a useful introduction to this subject — with video, audio, transcript and a number of useful resource links. The CDC report Antibiotic Resistance Threats is an excellent, well-researched overview as of 2013. CDC has an Antibiotic/Antimicrobial Resistance website that can be your home base for researching and tracking progress on this issue. CDC is asking Congress for $160M [Antibiotic Resistance Solutions Initiative — $160M: A Comprehensive Response].

So what can you do? Most important is to make it clear to your representatives that you expect them to support a major government focus. In the U.S. there should be at least an NIH Assistant Secretary devoted to antimicrobial resistance, whose mission should be new antimicrobial drug research and development, high-efficiency testing to fast-track diagnosis of new cases, case tracking and reporting, and, obviously, a radical cut in the agricultural misapplication of antibiotics at sub-therapeutic doses (about 80% by mass of US antibiotic sales).

To give you an idea of how inadequate the US response is, read Can a New White House Plan Catch Up to the “Superbug” Threat?

Although that initiative represents the government’s first-ever attempt to broadly address the issue of antibiotic resistance, the plan has been quickly dismissed by some scientists and lawmakers for not going far enough. In an interview with Politico, Rep. Louise Slaughter (D-N.Y.), the only microbiologist in Congress, said that goals set for 2020 are too far off to make up for lost ground.

“I’ve said to people, ‘Right now your government is not going to protect you,’” said Slaughter. “They’re about 10 years behind.”

PS – if you have some elective surgery on your horizon, say a knee or hip replacement, you might want to think about getting that done while there are still a few antibiotics that could help save your life (or your leg).

Decarbonization: Is California Exceptional?


Image: Mother Jones

If you ask a random citizen of the state “Is California decarbonizing faster than the rest of the U.S.?” they are likely to reply “Of course, California is the leader!” This accepted wisdom was reflected in the recent Mother Jones article by Gabriel Kahn [@gabekahn], Did California Figure Out How to Fix Global Warming?

All of these advances have undercut a fundamental tenet of economics: that more growth equals more emissions. Between 2003 and 2013 (the most recent data), the Golden State decreased its greenhouse gas emissions by 5.5 percent while increasing its gross domestic product by 17 percent—and it did so under the thumb of the nation’s most stringent energy regulations.
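A quick sanity check on those two numbers: emissions down 5.5% while GDP grew 17% implies that California’s emissions intensity (emissions per dollar of GDP) fell about 19% over the decade, roughly 2.1% per year compounded:

```python
# Implied change in emissions intensity from the Mother Jones figures, 2003-2013.
emissions_ratio = 1 - 0.055   # emissions fell 5.5%
gdp_ratio = 1 + 0.17          # GDP grew 17%

intensity_ratio = emissions_ratio / gdp_ratio   # ~0.808
annual_rate = intensity_ratio ** (1 / 10) - 1   # compounded over 10 years

print(f"intensity change: {intensity_ratio - 1:.1%}")  # ~-19.2%
print(f"per year:         {annual_rate:.2%}")          # ~-2.11%
```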

As the chart above shows, California is decarbonizing, but how exceptional is California’s performance since 2000? In reality, California’s results are not exceptional, but representative of the nation. For roughly the same 2000 to 2013 period, here is a graphic showing California and U.S. greenhouse gas emissions decreases as a function of GDP [thanks to John Fleck for graphing the IEA, BEA and Census Bureau data].


Image: John Fleck [@jfleck]
And here’s John Fleck’s graphic showing how California’s per capita emissions compare to the US as a whole.


Image: John Fleck [@jfleck]

From the time series graphs it’s hard to judge whether California’s results are better or worse than the US average. Mike Shellenberger [@shellenbergerMD] aggregated the EIA 2000-2013 decarbonization data to demonstrate that the nation reduced emissions more than California. This doesn’t account for emissions that California exported to other states (e.g., for power generation) or other nations (e.g., China for the embedded energy in imported goods).


Image: Michael Shellenberger [@shellenbergerMD]

Mike has also been analyzing the favorable tailwind provided by crashing natural gas prices. Low gas prices have hidden from consumers the true cost of subsidizing (non-large-hydro) renewables. The US has been enjoying historically low natural gas prices. The lows are an anomaly caused by the local oversupply of U.S. shale gas. The oversupply is local to the US because it is difficult to export natural gas: there are long lead times and large capital investments required to expand gas export infrastructure. Once the excess supply can be freely exported, the US natural gas market will clear much like crude oil, and US consumers will be paying a lot more for natural gas.


Image: Michael Shellenberger [@shellenbergerMD]

In support of his hypothesis that California has outperformed on decarbonization, Gabe Kahn mainly credits politicians backing mandates and subsidies for wind and solar. Missing is any discussion of the other factors that contributed to the reduced emissions intensity. Nationally the two biggest contributors have been the Great Recession (falling demand) and fuel-switching from coal to gas. The Breakthrough Institute published a 2014 report, Natural Gas Overwhelmingly Replaces Coal: New Analysis of US Regional Power Generation Between 2007 and 2013, showing that US emissions intensity fell largely due to fuel-switching:

Changes in generation shares at the regional level, however, strongly support the conclusion that fuel-switching from coal to gas, along with falling electricity demand in the wake of the Great Recession, account for the vast majority of the decline in emissions. Moreover, the shift from coal to gas accounts for a significant majority of the decline in the carbon intensity of the US electrical grid since 2007.

Comparison of CA low-carbon sources
Image: James Conca [@JimConca]

Study the above 2014 chart. The biggest elephant in the room (not discussed by Gabe Kahn) is the serious negative impact of activist attacks on the state’s nuclear plants. The premature closing of SONGS cost the state almost all the gains from twenty years of building subsidized wind and solar — California’s decarbonization rate took a big step backwards. James Conca explains:

In one fell swoop, the unnecessary closing of San Onofre Nuclear Generating Station in San Diego wiped out the low-carbon energy equivalent of almost all the wind and solar installed in California, reversing the state’s 20-year progress in low-carbon energy. Wind and solar are the only low-carbon energy sources growing in California. Geothermal, biomass and hydro have been flat for 10 years.

Going backwards: the San Onofre Nuclear Generating Station (SONGS) was prematurely closed in January 2012. Now the same activists are trying to shut down California’s only remaining nuclear power station, Diablo Canyon — which quietly produces, every day, about 1.6 times the output of all of California’s solar power. It will be impossible for California to achieve zero-carbon by closing rather than building nuclear plants. In California Gets Coal for Christmas: SONGS Closure Produces Extra 18M Tons of Carbon Dioxide James Conca reviews the reality:

A state-funded study by the California Council on Science and Technology found that only significant nuclear, or obtaining as-yet-undeveloped carbon capture technologies, can solve California’s energy demands and emission goals in this century (CCST Summary; CCST Report to 2050). We geologists know how unlikely carbon capture and storage is, and we should keep trying, but we can’t bet the house on unknown technologies.

The California’s Energy Future report that Jim references is a very good piece of work. See my report on the 2013 Travers Conference at UC Berkeley for updates on the study. You can help by supporting Save Diablo Canyon.

So how is California doing relative to our two-degrees target? Poorly – and most people don’t appreciate how incredibly challenging the target is. Two years ago PricewaterhouseCoopers (PwC) estimated that a global compounded decarbonization rate of 6.2% per year would just get us to zero-carbon by 2100. California’s 7.5% over thirteen years is way short of 6.2% compounded — it is not much better than the dotted line in this PwC chart, and not nearly good enough. When was the last time we saw nations decarbonize rapidly? It can be done.

Image: PricewaterhouseCoopers (PwC)
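To put numbers on that comparison, here is the compounding arithmetic, reading the 7.5% as California’s total decarbonization over the thirteen years:

```python
# California's 7.5% decarbonization over 13 years vs PwC's required 6.2%/year.
ca_annual = 1 - (1 - 0.075) ** (1 / 13)   # ~0.60% per year, annualized
required = 0.062                          # PwC's required global rate

print(f"California annualized: {ca_annual:.2%} per year")
print(f"Required (PwC):        {required:.2%} per year")
print(f"Shortfall factor:      {required / ca_annual:.0f}x")
```

Annualized, California’s rate is about 0.6% per year, roughly a tenth of what PwC says the world needs. That is the gap most people don’t appreciate.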