Renewables are making no progress against coal

No doubt you’ve heard that Friends of the Earth recently announced that their primary objection to nuclear power is now that it is too slow to build and too costly.

I would like to introduce FOE to the data embodied in Roger Pielke Jr’s graphic. I’ve modified Roger’s chart to illustrate the only energy policy that has succeeded in rapidly displacing fossil fuels at utility scale. My crude green slope indicator highlights the period when France, Sweden, Belgium, Canada, the United States, Germany, Japan, Switzerland and others built their nuclear power fleets. The absence of further progress since 1995 shows the stark reality of how little has been achieved by the billions of dollars of taxpayer wealth spent on renewable subsidies since Kyoto. The following chart contrasts the speed and scale of the nuclear build with the slow build of the whole suite of “renewables”.

Roger’s short Breakthrough essay is the source of the original chart:

The data shows that for several decades the world has seen a halt in progress towards less carbon-intensive energy consumption, at about 13 percent of the total global supply. This stagnation provides further evidence that the policies that have been employed to accelerate rates of decarbonization of the global economy have been largely ineffective. The world was moving faster towards decarbonizing its energy mix long before climate policy became fashionable. Why this was so and what the future might hold will be the subject of future posts in this continuing discussion.

If you are keen to learn what makes for effective decarbonization policies, then you are likely to also enjoy Roger’s The Climate Fix. For an Executive Summary of the concepts see A Primer on How to Avoid Magical Solutions in Climate Policy.

Per Peterson answers the “nuclear waste” question

Amongst the Reddit AMA questions, I appreciated the direct way Peterson put this classic opposition complaint into proper context: 

Q: I don’t think we should build any reactors until we have a repository for the waste ready to go…

A: I understand this position (don’t make waste until you have the ability to dispose of it properly).
But the major problem we face is that we are using our atmosphere as our primary waste repository for the products of fossil fuel combustion. We have a strong scientific and technical consensus that deep geologic disposal can provide acceptable long-term isolation of nuclear wastes, and we have two countries now that have successfully developed and are building repositories for commercial spent fuel (France and Sweden).

We also have no plausible approaches to remove CO2 waste from the atmosphere once it is put there, except for some scary geoengineering ideas (such as fertilizing the oceans). Future generations are likely to be much more angry about the CO2 we’re generating now, than the nuclear waste.

Prospects for U.S. Nuclear Power After Fukushima


The chairman of one of the largest U.S. nuclear companies recently said that his company would not break ground on a new nuclear plant in the United States until the price of natural gas was more than double today’s level and carbon emissions cost $25 per ton. This seems to summarize the current prospects for U.S. nuclear power pretty well.

This paper by Lucas W. Davis (Haas School of Business, UC Berkeley) is an excellent summary of the US situation as of 2011, and a good source of references for your research on nuclear construction costs. Davis is not attempting to predict the future; he is drawing inferences from the historical data. That is a depressing picture: the 2011 evidence indicates that US nuclear suppliers have not learned even the French lessons.

Many within the nuclear industry claim that the industry is headed more toward the French model. A chairman of a major nuclear power company recently reported that new reactors would be standardized down to “the carpeting and wallpaper”. However, this claim does not appear to be supported by the license applications that have been received to date. Among the 17 applications that have been received by the NRC, there is a mix of both pressurized water reactors and boiling water reactors, manufactured by five different reactor manufacturers (Areva, Westinghouse, Mitsubishi, GE-Hitachi, and GE). Thus, it may well be the case that the industry will soon coalesce around a very small number of designs, but this is not immediately obvious based on these initial applications. At a minimum it seems clear that the French approach of supporting a single reactor design is not going to be adopted here.

Will China lead the world out of this pit by creating a mass manufacturing supply chain for two or three standard designs?

Reddit AMA grills the UC Berkeley Department of Nuclear Engineering


Members of the UC Berkeley Department of Nuclear Engineering participated in the Science AMA Series, responding to a large number of largely hostile questions, with lots of variations of “Can I still eat fish from the contaminated Pacific?” As is typical of these AMA sessions, the signal-to-noise ratio is low, due to uninformed questions and irrelevant branched threads of discussion by people more interested in politics. I “mined” the 1,447 comments for what I thought were fragments worth archiving.

I guess I’ll start things off. What type of reactors should we be building? I know a big deal a few years ago was made about liquid fluoride thorium reactors. Is that the way of the future, or are there superior alternatives?

Prof. Per Peterson replies (emphasis mine):

I do not think that we have the basis to determine or select the best coolant or fuel type to use in future reactors. But there are some attributes which we do need to make sure are used in future reactors.

The first is to use passive safety systems, which do not require electrical power or external cooling sources to function to remove decay heat after reactors shut down, as is the case with the AP-1000 and ESBWR designs, and with all of the light water reactor SMRs now being developed in the U.S.

The benefits of passive safety go well beyond the significant reduction in the number of systems and components needed in reactors and the reduced maintenance requirements. Passive safety systems also greatly simplify the physical protection of reactors, because passive equipment does not require routine inspections the way pumps and motors do, and thus can be placed in locations that are difficult to gain access to rapidly.

The second is to further increase the use of modular fabrication and construction methods in nuclear plants, in particular to use steel-plate/concrete composite construction methods that are quite similar to those developed for modern ship construction. The AP-1000 is the most advanced design in the use of this type of modularization, and the ability to use computer aided manufacturing in the fabrication of these modules makes the manufacturing infrastructure much more flexible. In the longer term, one should be able to design a new reactor building, transfer the design to a module factory over the internet, and have the modules show up at a construction site, so the buildings are, in essence, 3-D printed.

The final attribute that will be important for new reactors will be to make them smaller, and to develop a regulatory framework and business models that work for multi-module power plants. While there will likely always be a market for large reactors, we need to create an ecosystem that includes customers for smaller reactors (inland locations served only by rail, installations needing reliable power even if fuel supplies are interrupted, and mature electricity markets that need to add new capacity in small increments).

On thorium, a question:

Hello! What do you think is the most important advantage that thorium has over uranium as a “fuel?”

Prof. Per Peterson’s reply

The thorium fuel cycle has clearly attractive features, if it can be developed successfully. I think that most of the skepticism about thorium emerges from questions about the path to develop the necessary reactor and fuel cycle technology, versus open fuel cycles (uranium from seawater) and closed, fast-spectrum uranium cycles.

The most attractive element of the thorium fuel cycle is the ability to operate sustainably using thermal-spectrum neutrons. This allows the design of reactor core structures that use high-temperature ceramic materials like graphite, which have substantial thermal inertia and cannot melt. Because these ceramic materials also provide significant moderation, it is difficult to use them in fast-spectrum reactors and thus the most plausible fast-spectrum reactor designs need to use metallic structural materials in their cores.

So thorium reactors are compatible with higher intrinsic safety (cores which do not suffer structural damage even if greatly overheated) and that can deliver heat at higher temperature, which enables more efficient and flexible power conversion.

Molten fluoride salts are compatible with these high-temperature structural materials, and given their very high boiling temperatures make excellent, low pressure heat transfer fluids. In the near term, the largest benefits in using fluoride salts come from the low pressure and high temperature heat they can produce. This can be achieved with solid fuel, which is simpler to work with and to obtain regulatory approvals.

But molten salt technologies also have significant challenges. One of the most important is managing the much larger amounts of tritium that these reactors produce, compared to light water cooled reactors (the quantities are closer to what heavy-water reactors, such as the CANDU, produce, but the methods to control and recover tritium are much different for molten salts than for heavy water, and key elements remain to be demonstrated).

To repeat a critical point “…largest benefits in using fluoride salts come from the low pressure and high temperature heat they can produce. This can be achieved with solid fuel…”. This summarizes why Prof. Peterson’s lab is focused upon developing the PB-AHTR design, which will also prove out many materials and technologies required subsequently to implement the more challenging Liquid Fuel molten salt reactor concept (such as LFTR).

Regarding waste: Prof. Peterson was a member of Obama’s Blue Ribbon Commission on America’s Nuclear Future. I consider him one of the best-informed sources regarding Spent Nuclear Fuel (SNF), which the anti-nuclear lobby calls Nuclear Waste. It is not “waste”; it is an extremely valuable source of carbon-free energy.

Q: One of the elephants in the room for nuclear power is the waste….

A: …Finland and Sweden have successfully sited and are building deep geologic repositories in granite, and France is very far along in developing its geologic repository in clay. The U.S. nuclear waste program is currently stopped and is in a state of disarray…

There is a wide range of opinions, but recycling plutonium into new fuel for light water reactors (LWRs) is substantially more expensive than making new fuel from uranium, even if the plutonium is free. This is primarily because the plutonium must be handled as an oxide powder to make LWR fuel, and oxide powder is the most hazardous and difficult form to handle plutonium in. All of the Generation IV reactor technologies can use fuel forms that do not involve handling plutonium and minor actinides in the form of powders and that are much easier to fabricate using recycled material (e.g., metal, molten salt, sol-gel particles in either coated particle or vibropacked fuel forms).

In my personal opinion, the most sensible thing to do in the near term is to prioritize U.S. defense wastes for geologic disposal, and to use a combination of consolidated and on-site interim storage for most or all commercial spent fuel. Implementation of the Blue Ribbon Commission’s major recommendations, which include development of consolidated interim storage that would initially be prioritized to store fuel from shut down reactors, would put the U.S. on this path.

By using geologic disposal primarily for defense wastes first, and using primarily dry cask interim storage for commercial spent fuel, this will give a couple of decades for nuclear reactor technology to evolve further, and by then we will be in a better position to determine whether commercial spent fuel is a waste or a resource.

Nuclear innovation: Prof. Peterson replies

There are a number of factors which make innovation difficult in improving nuclear reactor technology, in particular the long operating life of nuclear power plants and their very large capital costs, which dissuade innovation. The trend toward designing larger and larger water-cooled reactors has increased these disincentives.

Given their lower capital cost and shorter construction times, innovation is much easier in small reactors. There will remain a role for large reactors, just as dinosaurs existed for millions of years alongside the new mammal species, but currently some of the most important policy issues for nuclear power involve creating an ecosystem where small reactors find customers. Smaller reactors, produced in larger numbers with most of the fabrication occurring in factories, would also use specialized manufacturing and skilled labor more efficiently. Imagine factories as being similar to airplanes, and the ability to keep more seats filled being really important to having low per-seat prices…

FHR (Fluoride Salt Cooled High Temperature Reactor), Where to take technical risk?

I will answer this question first indirectly, and then more directly.

A key question for innovation in developing new nuclear energy technology is where to take technical risk. SpaceX provides a good example of a highly successful risk management strategy. They focused on developing a highly reliable, relatively small rocket engine, which they tested in the Falcon 1, and which uses an ancient rather than innovative fuel combination: kerosene and liquid oxygen. On the other hand, they chose to use aluminum-lithium alloy with friction stir welding for their fuel tanks, which is at the cutting edge of current technology. They have then used the approach of ganging together large numbers of these engines to create the Falcon 9, which is now successfully delivering cargo to the International Space Station.

Currently the most important barrier to deploying nuclear power is not the cost of the fuel, but instead is the capital cost of the plants, the need to assure that they can run with high reliability (which for current large reactor designs creates strong disincentives to innovate), and the relatively low electricity revenues one receives for producing base load power, particularly today in the U.S.

The primary reason that UCB, MIT, and UW, and the Chinese Academy of Sciences, are working on solid fuel, salt cooled reactor technology is because we have the ability to fabricate these fuels, and the technical difficulty of using molten salts is significantly lower when they do not have the very high activity levels associated with fluid fuels. The experience gained with component design, operation, and maintenance with clean salts makes it much easier to consider the subsequent use of liquid fuels, while gaining several key advantages from the ability to operate reactors at low pressure and deliver heat at higher temperature.

Q: Can I also ask what you think the safest way to transport the waste is?

A: Per Peterson: There is a long record of safe transportation of nuclear waste, including spent fuel, world wide. The containers used to transport nuclear wastes are substantially more robust than those used to transport hazardous chemicals and fuels, which is why transportation accidents with chemicals generate significantly more risk.

This said, the transportation of nuclear wastes requires effective regulation, controls, and emergency response capabilities to be in place. The transportation system for the Waste Isolation Pilot Plant in New Mexico has logged over 12 million miles of safe transport, with none of the accidents involving the transportation trucks causing any release of radioactive materials.

One reason it is important to restore WIPP to service (it had an accident involving the release of radioactive material underground in late February, which had minimal surface consequence because the engineered safety systems to filter exhaust air were activated) is because the WIPP transportation system has developed a large base of practical experience and skilled personnel at the state and local levels who are familiar with how to manage nuclear waste transport. This provides a strong foundation for establishing a broader transportation system for commercial spent fuel and defense high level wastes in the future.

A commenter replied to Per’s hecklers, referring to WIPP:

Actually I work for this program and this is an understatement. Not only have there never been any accidents that caused a release of nuclear material, there have never been any accidents with a truck loaded with waste containers, ever. They’ve happened while empty, but never otherwise.

Per Peterson discussed the unpriced carbon emissions externality, which I would say is effectively a tax on nuclear: nuclear produces nearly zero-carbon energy in competition with coal and gas, which do not pay their carbon externality costs. Per raised a very important issue: how NRC gatekeeping sets up a strong incentive to free-ride on NRC rulings.

But there is another important market failure that affects nuclear energy and is not widely recognized, which is the fact that industry cannot get patents for decisions that the U.S. Nuclear Regulatory Commission makes. For example, there are major regulatory questions that will affect the cost and commercial competitiveness of multi-module SMR plants, such as how many staff will be required in their control rooms. Once the first SMR vendor invests and takes the risk to perform licensing, all other vendors can free-ride on the resulting USNRC decision. This is the principal reason that government subsidies to encourage first movers, such as cost sharing or agreements to purchase power or other services (e.g., irradiation) make societal sense.

Is this being discussed in the US government? I’ve never seen a word about it. This is another example of the sub-optimal results we get from wasting billions on energy-farming production subsidies while rationing a few millions for nuclear R&D. Even America has very limited funds, and needs to spend them very carefully.

CERA: Construction costs for new nuclear plants up over 230% since 2000

UPDATE: I have republished this 2008 post and comments to bring it “up front” with our ongoing discussion of new nuclear construction costs. At the end I’ve incorporated the 2008 comments. 

UPDATE: Per Peterson, Professor and a former chair of the Department of Nuclear Engineering at the University of California, Berkeley, was kind enough to comment on yesterday’s post on the CBO study. Dr. Peterson noted that only about 1% of new nuclear plant construction cost is construction materials, so the theme attributing the rapid cost rises to commodity prices has no basis. Contrariwise, wind turbine construction/installation requires at least 10x the materials input per kilowatt, and so has higher sensitivity to the price and availability of steel, concrete, copper, etc. I cannot accurately summarize in fewer words, so I recommend you read his comments carefully.

Dan Yergin’s Cambridge Energy Research Associates (CERA) maintains the Power Capital Costs Index (PCCI), depicted in the graphic at left as of May 2008. In brief, the PCCI shows that a power plant that cost $1 billion in 2000 would, on average, cost $2.31 billion in May [in constant 2000 dollars].

You can infer that the cost of new nuclear plant construction has increased by more than that 230%. As you can see in the PCCI chart, the non-nuclear costs are up 180%. The PCCI is assembled from data on a basket of 30 power generation facilities in North America. I don’t know what percentage of the capital base is nuclear, so I’ll speculate that it’s similar to the current 22% that nuclear contributes to US generation. That implies nuclear construction costs are up about 400% since 2000.
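That back-of-envelope inference can be written out explicitly. A minimal sketch, treating the "up X%" figures as index levels relative to 2000 and using the 22% nuclear share speculated above (an assumption, not a CERA figure):

```python
# Solve for the implied nuclear cost index from the basket-weighted average:
#   share * nuclear + (1 - share) * non_nuclear = total
nuclear_share = 0.22   # assumed: nuclear's share of the basket's capital base
pcci_total = 2.31      # all 30 plants, May 2008 relative to 2000 = 1.0
non_nuclear = 1.80     # non-nuclear portion of the basket

nuclear_index = (pcci_total - (1 - nuclear_share) * non_nuclear) / nuclear_share
print(f"implied nuclear cost index: {nuclear_index:.2f}x 2000 costs")
# ≈ 4.12x, i.e. roughly the "about 400%" figure above
```

The result is quite sensitive to the assumed 22% share, which is why this is only a speculation and not an estimate.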

I may be able to get more background from the CERA Capital Cost Analysis Forum – Power. But I discovered that viewing the replay of the June 6 webconference call required IE6, so I’ll need to fire up a windows PC to access it.

On factors driving the PCCI increases since 2000, CERA writes:

…Demand for new power generation facilities remains high worldwide, leading to continued tightness in equipment markets. Cost increases, supply issues and longer delivery times are exacerbated as manufacturers struggle to keep up with demand. The weakening U.S. dollar also increases the costs of global procurement for equipment and materials.

The number of engineers in the workforce is also declining as older workers retire and are not being replaced. The resulting shortages in plant design teams add additional delays to construction schedules. The current increase in construction for nuclear power generation and the dearth of experienced nuclear engineers in North America has been a key driver behind cost escalation.

Recent cancellations of proposed coal plants in the United States due to uncertainty over environmental regulations have provided some slowing in cost increases in the U.S. coal industry. However, international competition for coal boilers, particularly in Southeast Asia, is keeping the equipment order books very active.

Concerns over a looming U.S. recession and subsequent cutbacks in residential construction have offered little relaxation to power construction. The residential slump does not free up the skilled workers required in the power industry, and there is no overlap of the specialist metals and equipment required.

Upstream Capital Cost Index (UCCI) Courtesy IHS

I wonder if we are looking at market reactions to an impulse in demand. In the short run [say 5 years] the supply of new nuclear plants is inelastic. Demand has increased considerably beyond expectations, so equilibrium is only achieved by higher prices. We are seeing similar supply/demand responses in several energy sectors. The headlines hammer on oil prices. Note that the UCCI is only 10% less than the PCCI.

The UCCI is based upon a portfolio of 28 upstream oil and gas projects, so it represents the overnight cost of investment in both oil & gas field development and transportation. It may include finding costs, but I’m not sure. I do know that the cost per barrel-equivalent of finding + development costs has been increasing about as fast as oil companies have been able to ramp up their investments. The net result so far is no increase in reserve-additions, which are still lagging depletion.

2 thoughts on “CERA: Construction costs for new nuclear plants up over 230% since 2000”

  1. Paul on December 4, 2008 at 1:44 pm said:

“only about 1% of new nuclear plant construction cost is construction materials” – sorry, I don’t think so. More like 30% at least.

  1. Steve Darden on December 4, 2008 at 7:02 pm said:

More like 30% at least.

Paul, thanks heaps for your comments. Here’s the relevant part of Dr. Peterson’s comment on commodity inputs [he gives the references as well]:

While it is widely understood that nuclear energy costs have quite low sensitivity to the cost of uranium, it is not widely appreciated that the same applies to construction materials. If one takes the total quantity of steel, concrete, copper, and other materials required to build a light water reactor similar to those operating today [1], and then multiplies these quantities by the respective current commodity prices, the total contribution of commodity inputs is $36 per kilowatt of generation capacity [2], out of total construction prices that are estimated to range from $3000 to $5000 per kilowatt today. The dominant cost of nuclear construction is instead in the added value that comes from converting these commodities into an operational nuclear power plant. Conversely, wind turbines require approximately a factor of 10 times as much steel and concrete to construct, without considering storage capacity [3], and thus have construction costs that are sensitive to commodity costs and to potential future resource scarcity.

So he gave a range of 36/5000 to 36/3000, or about 0.7% to 1.2%.

Can you educate us on the construction cost buildup – also on why quotes have gone up so much since 2000?

CBO Study: Nuclear Power’s Role in Generating Electricity

UPDATE: I have republished this 2008 post and comments to bring it “up front” with our ongoing discussion of new nuclear construction costs. At the end I’ve incorporated the 2008 comments.

I’ve been re-reading the CBO study from May 2008. This is probably the most current objective analysis of base load electrical generation options. Given the CBO levelized costing assumptions it appears that electric utilities will choose natural gas over 3rd generation nuclear unless they anticipate more than $45/ton CO2 carbon tax or equivalent:

Adding a carbon dioxide charge of about $45 per metric ton to the levelized cost estimates in the reference scenario would make nuclear power the least expensive source of additional base-load capacity (see the left panel of Figure 3-2). Up to that threshold, at all but the lowest level of charges, conventional natural gas technology would probably be the least costly option. Because coal is more carbon-intense than natural gas, the cost advantage of new capacity based on natural gas technology would grow in relation to coal technology as carbon dioxide charges increased; but the advantage that natural gas technology enjoyed over nuclear technology would shrink and eventually disappear as emission charges reached about $45 per metric ton. Thereafter, the levelized cost advantage of nuclear technology over conventional gas technology would grow. Although carbon dioxide charges would not change the cost of nuclear power plants at all, they would increase the cost of innovative fossil-fuel alternatives; as a result, the cost advantage that nuclear technology held over those technologies would increase with carbon dioxide charges but at a slower rate than that observed with conventional fossil-fuel technologies.
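The crossover logic in the CBO quote above is easy to reproduce. The levelized costs below are illustrative assumptions, not the CBO's figures; what matters is the structure: a carbon charge raises gas's levelized cost in proportion to its emission intensity while leaving nuclear's unchanged, so the two lines cross at one charge level.

```python
# Illustrative crossover: at what CO2 charge does nuclear beat gas?
# LCOE values are assumptions chosen so the crossover lands near the
# CBO's ~$45/t threshold; gas_intensity is typical for combined cycle.
lcoe_nuclear = 74.0   # $/MWh, assumed; unaffected by a carbon charge
lcoe_gas = 56.0       # $/MWh, assumed, at reference fuel prices
gas_intensity = 0.40  # t CO2 per MWh

crossover = (lcoe_nuclear - lcoe_gas) / gas_intensity
print(f"nuclear beats gas above ~${crossover:.0f} per t CO2")
```

With these assumed inputs the crossover is $45/t; with the CBO's actual cost estimates the arithmetic is the same.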

We know that construction costs for all types of generation have been going up rapidly with the increasing costs of steel, concrete, etc. Nuclear is the most sensitive to construction costs, simply because nuclear fuel costs are negligible [conversely, nuclear is insensitive to future fuel cost rises, while natural gas is extremely sensitive]. Here are the relative sensitivities to lower or higher construction costs, again in levelized 2006 dollars per megawatt hour:

The CBO study of course has to stick with already-built or on-order nuclear technology. But this may lead to drawing the wrong conclusions. Remember how much autos cost when each one was custom built? And the lousy quality?

That’s our experience of nuclear construction — custom design, custom built, custom approvals. But, given certainty of future CO2 charges, I believe that a competitive market will transform nuclear generation into a mass produced, modular product — and that costs will come down dramatically compared to alternatives.

We don’t know what future innovations will emerge, but as of today, the modular pebble-bed reactor [PBMR] technology looks very promising. Key advantages are safety by design (even chimps as operators can’t cause a nuclear accident), no proliferation worries, and perhaps most important – the design is MODULAR. That means industrial-scale mass production is possible, with all the attendant benefits. One of the most important benefits is the slashing of the financial risk of regulatory delays before a new plant is allowed to start up.

For more background on the Modular Pebble-bed design, see MIT’s study “The Future of Nuclear Power” [1], MIT prof. Andrew C. Kadak’s presentation “What Will it Take to Revive Nuclear Energy?” [PDF] [2], and his Pebble-bed presentation [PDF] [2a]. China is placing big bets here, see Wired’s “Let a Thousand Reactors Bloom” [3].

10 thoughts on “CBO Study: Nuclear Power’s Role in Generating Electricity”

  1. Rod Adams on August 26, 2008 at 8:06 pm said:


It is always important to check the assumptions. The CBO study that you pointed to, though completed in 2008, apparently used a fuel price table that stopped with 2005 fuel prices. It thus assumed a base case of natural gas costing about $5.00 per million BTU.

Since the cost of fuel is about 93% of the levelized cost of electricity from a natural gas fired power plant, underestimating the cost of gas would tend to sway the computed decision in the wrong direction compared to less fuel intensive alternatives like nuclear power.

Nuclear can compete without a carbon tax against gas at current market prices – which are about $8.50 per million BTU and have been as high as $13 in the recent past and may get there again with a cold winter.

Luckily for gas buyers, it has been a fairly mild summer.
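Rod's sensitivity point can be quantified with a quick sketch. The 93% fuel share and the $5.00 vs $8.50 per million BTU gas prices come from his comment; the sketch assumes the non-fuel component of levelized cost stays fixed:

```python
# If fuel is ~93% of a gas plant's levelized cost, LCOE scales almost
# one-for-one with the gas price.
fuel_share = 0.93
base_gas, market_gas = 5.00, 8.50  # $/MMBtu, from the comment above

lcoe_ratio = (1 - fuel_share) + fuel_share * (market_gas / base_gas)
print(f"LCOE at $8.50 gas is {lcoe_ratio:.2f}x the $5.00 base case")
# ≈ 1.65x: a 70% fuel-price rise lifts gas LCOE by roughly 65%
```

This is why underestimating the gas price in the CBO's base case tilts the levelized comparison against nuclear.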

  1. Steve Darden on August 26, 2008 at 9:10 pm said:

Rod – thanks for the data update. Does the increase in construction costs since the timestamp on the report data offset the underestimated natural gas prices? I.e., gas operating costs up, nuclear plant construction costs up.

I added PBMR to this post – since folks search for this acronym.

  1. Per Peterson on August 27, 2008 at 10:48 am said:

While it is widely understood that nuclear energy costs have quite low sensitivity to the cost of uranium, it is not widely appreciated that the same applies to construction materials. If one takes the total quantity of steel, concrete, copper, and other materials required to build a light water reactor similar to those operating today [1], and then multiplies these quantities by the respective current commodity prices, the total contribution of commodity inputs is $36 per kilowatt of generation capacity [2], out of total construction prices that are estimated to range from $3000 to $5000 per kilowatt today. The dominant cost of nuclear construction is instead in the added value that comes from converting these commodities into an operational nuclear power plant. Conversely, wind turbines require approximately a factor of 10 times as much steel and concrete to construct, without considering storage capacity [3], and thus have construction costs that are sensitive to commodity costs and to potential future resource scarcity.

Right now demand for new reactors is clearly outstripping supply. While this current supply chain inelasticity will ease in 5 to 10 years, inelasticity in supply always results in higher prices. Thus we can expect nuclear construction prices to drop over the coming decade, but the main question is by how much. While it will never get down to the $36/kW cost of the commodity inputs, there is still potential that prices could drop greatly from the current values if modular construction and factory-based computer aided manufacturing are applied more broadly in the construction.


  1. From R.H. Bryan and I.T. Dudley, Oak Ridge National Laboratory, TM-4515, June 1974, current pressurized water reactors use 32.7 t of steel, 75 m3 of concrete, 0.69 t of copper, and smaller amounts of other materials per megawatt of capacity
  2. On March 25, 2008, the commodity prices of steel, concrete, and copper (which constitute 90% of the total commodities costs for a nuclear plant) were $601/t, $98/m3, and $7,634/t respectively.
  3. Wind requires 115 MT of steel and 218 m3 of concrete per megawatt, but has higher commodity input per unit of electricity generated due to a lower capacity factor (~25%) compared to nuclear (~90%), S. Pacca and A. Horvath, Environ. Sci. Technol., 36, 3194-3200 (2002).
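The footnotes contain everything needed to check Dr. Peterson's $36/kW figure and the wind comparison. A quick sketch using the footnoted quantities and March 2008 prices (the 90% factor scales the big three commodities up to the full commodity bill):

```python
# Reproduce the $36/kW commodity-input figure from footnotes 1 and 2.
quantities = {"steel_t": 32.7, "concrete_m3": 75.0, "copper_t": 0.69}  # per MW
prices = {"steel_t": 601.0, "concrete_m3": 98.0, "copper_t": 7634.0}   # March 2008

big_three_per_MW = sum(quantities[k] * prices[k] for k in quantities)
commodity_per_kW = big_three_per_MW / 0.90 / 1000  # other materials ~10%; $/MW -> $/kW
print(f"commodity inputs: ~${commodity_per_kW:.0f}/kW")            # ~$36/kW
print(f"share of a $3000/kW plant: {commodity_per_kW / 3000:.1%}")  # ~1.2%
print(f"share of a $5000/kW plant: {commodity_per_kW / 5000:.1%}")  # ~0.7%

# Footnote 3's wind comparison, adjusted for capacity factor:
steel_wind = 115 / 0.25     # t of steel per average (delivered) MW
steel_nuclear = 32.7 / 0.90
print(f"wind/nuclear steel per MWh: {steel_wind / steel_nuclear:.1f}x")
```

The capacity-factor adjustment is why the per-MWh steel ratio comes out even larger than the roughly 10x per-MW figure quoted in the comment.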

  1. Rod Adams on August 27, 2008 at 2:26 pm said:

The interesting thing about the numbers that are being bandied about with regard to nuclear construction costs is that they also include rather substantial allowances for risk premiums, interest costs, and inflation uncertainties.

Those costs can represent half of the final delivered price computation.

  1. Steve Darden on August 27, 2008 at 5:10 pm said:

Dr. Peterson,

Thanks for taking the time to set us straight on the material inputs. At roughly 1% of construction cost, nuclear plant costs are highly insensitive to that component. The CBO study bypassed the contributors to cost increases in its sensitivity analysis – simply assuming -50%, +100%.

Today I wrote a related post on the CERA index of power plant construction. Back of the envelope, assuming 22% of CERA’s basket of 30 plants are nuclear, I drew the inference that nuclear plant construction costs have increased around 400% since 2000, versus the PCCI average of 230% across all modes of generation.

Similar to your comments, CERA attributes the increases to the surge in demand and the “dearth of experienced nuclear engineers in North America.”

CERA is tracking similar (210%) increases in the cost of upstream oil & gas projects – the UCCI having a similar 2005 takeoff. Much more depth on energy demand over-running supply can be found in the really excellent CIEP study “Oil turbulence in the next decade – An Essay on High Oil Prices in a Supply-constrained World”, Jan-Hein Jesse and Coby van der Linde, Clingendael International Energy Programme. They conclude that the next decade or so will see high volatility in oil markets – oscillating between marginal cost and user value.

Please advise if you have any references to recommend on the potential for nuclear costs to drop in an industry transformation to “mass production”, relatively speaking, of modular reactor components. Presumably, such standardized components would be pre-certified, so that on site certification would be reduced to a process more like inspections of other industrial facilities?

  1. Steve Darden on August 27, 2008 at 10:31 pm said:


Well, it’s interesting that the CERA index explicitly doesn’t include risk premiums or owner’s cost, though it probably includes construction-period interest. If my estimates of their basket are close, they indicate a 2000 to 2008 Q1 cost increase of around 400% for nuclear and about 180% for non-nuclear.
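The arithmetic behind that inference can be made explicit. The 22% nuclear share and the index values are the assumptions from this thread, not CERA's published breakdown:

```python
# If 22% of the PCCI basket is nuclear, the overall index stands at
# ~230 (2000 = 100), and nuclear sits at ~400, what index is implied
# for the non-nuclear plants in the basket?
share_nuclear = 0.22
pcci_overall = 230.0
nuclear_index = 400.0

non_nuclear = (pcci_overall - share_nuclear * nuclear_index) / (1.0 - share_nuclear)
print(f"implied non-nuclear index: {non_nuclear:.0f}")  # -> 182, i.e. "about 180%"
```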

I haven’t found a source to build up that figure from first principles – so I can’t confirm the PCCI index. I sat through the one hour CERA web-conference presentation of June 6 – hoping to learn the details. They do have a nuclear index, but didn’t present it. It is part of the distribution package sent to members.

Cheers, Steve

  1. JimHopf on August 28, 2008 at 4:51 pm said:

I’d just like to add a bit to what Rod said earlier. Not only does the CBO study assume a natural gas price of $5 (or $6?) per MBTU, which is lower than the price even today, but they assume that it will remain at $5/6 even if we use gas for all new power plants (and possibly also replace existing coal plants with gas plants to meet CO2 reduction requirements). In other words, they assume that the price will remain fixed at a (low) value of $5/6, no matter how high the demand for gas gets!

They simply state that for CO2 prices between $6 and $45 per ton, gas will be the cheapest source, thereby implying that it will be chosen for all new generation. They ignore all feedback effects. In the real world, as more and more gas is chosen, the price of gas goes up until the price advantage disappears. In fact, the real truth is that, for baseload power, gas will not be an option, as it will be the most, not the least, expensive in the future (even w/ little or no CO2 price), since future gas costs will be way above $6. For that reason, utility executives are not even really thinking about gas as a future baseload option. There simply is not enough gas to go around to waste it on something like baseload generation. The choice will be between nuclear and coal.
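The feedback being described can be illustrated with a toy model; every number in it (heat rate, non-fuel cost, the supply curve) is an invented placeholder, not from the CBO study:

```python
# Toy model of the feedback the CBO analysis is said to ignore:
# gas-fired LCOE rises with the gas price, and the gas price rises
# as more new capacity is built as gas. All numbers illustrative.
HEAT_RATE = 7.0       # MMBtu per MWh for a gas plant (assumed)
GAS_NONFUEL = 20.0    # $/MWh non-fuel cost of gas generation (assumed)
NUCLEAR_LCOE = 80.0   # $/MWh (assumed)

def gas_price(share):
    """Assumed rising supply curve: $5/MMBtu plus $10 per unit of share."""
    return 5.0 + 10.0 * share

share = 0.0
while GAS_NONFUEL + HEAT_RATE * gas_price(share) < NUCLEAR_LCOE and share < 1.0:
    share += 0.01     # gas keeps winning new builds; demand pushes the price up

print(f"gas advantage disappears at ~{share:.0%} share, "
      f"price ${gas_price(share):.2f}/MMBtu")
```

Holding the gas price fixed at $5-6 regardless of share, as the study does, never triggers the stopping condition, which is exactly the objection above.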

The real question is what CO2 price is required to make nuclear cheaper than coal. This price is about $20 to $25 per ton of CO2.
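That figure follows from a simple breakeven: the carbon price has to cover the levelized-cost gap between nuclear and coal, divided by coal's emissions intensity (roughly one tonne of CO2 per MWh for pulverized coal). A sketch with illustrative cost numbers, not figures from the comment:

```python
# Breakeven CO2 price: the levelized-cost gap divided by coal's
# emissions intensity. Cost inputs below are illustrative assumptions.
COAL_EMISSIONS = 1.0  # tCO2 per MWh, typical for pulverized coal

def breakeven_co2(nuclear_lcoe, coal_lcoe, intensity=COAL_EMISSIONS):
    """$/tCO2 at which coal-plus-carbon-price equals nuclear."""
    return (nuclear_lcoe - coal_lcoe) / intensity

# e.g. nuclear at $75/MWh vs coal at $52/MWh
print(f"${breakeven_co2(75.0, 52.0):.0f}/tCO2")  # -> $23/tCO2
```

Note how sensitive the result is to the gap: move either LCOE by a few dollars and the breakeven shifts accordingly, which is the point raised in the reply below about capital costs.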

  1. Steve Darden on August 28, 2008 at 6:51 pm said:


Thanks – I agree with all your points.

This price is about $20 to $25 per ton of CO2.

Doesn’t that depend on the capital cost? At 2005 CAPEX I thought $25 per ton CO2 would do it. At 4 x 2005 costs?

I’m confident new plant costs will come down. I’m optimistic that in a decade, constant dollars, that costs per MW will be lower than the 2005 CERA index.

But what do utility execs believe are the levelized costs?

  1. JimHopf on August 28, 2008 at 11:20 pm said:


Well, the capital cost of coal plants has also gone up since then, as well as the price of coal itself (which has almost doubled). That said, the price of nuclear has gone up even more, if some of the latest estimates are to be believed. Thus, it could be that it would require ~$30 or more (but only for the first set of plants).

Of course, under any cap-and-trade system with hard (declining) CO2 limits, the CO2 price will rise to whatever it has to be to make nuclear cheaper than coal (given that renewables’ contribution is limited by intermittency, and that both gas and coal with CO2 sequestration will be more expensive than nuclear).

  1. Steve Darden on August 29, 2008 at 3:31 pm said:

Thanks Jim – two important concepts in your comments

(1) but only for the first set of plants – because once deployment gets well underway the capital costs will come down. Probably operating costs as well.

(2) the CO2 price will rise to whatever it has to be to make nuclear cheaper than coal – because that is the new equilibrium, as existing coal utilities bid up permits until it becomes cheaper to build replacement nuclear than to keep paying for permits.

Regarding (2) I still prefer a revenue-neutral carbon tax over cap-and-trade. Most importantly because it gives utilities a predictable and stable future cost environment. Secondly, because it prevents government from getting its hands on a new revenue stream, while avoiding a rich growth medium for corruption and complexity.

What’s your view on that choice?

PS – I just finished a post on “Greens make the case for nuclear power”.

Comments are closed.

Morgan Stanley predicts driverless car ‘utopia’ by 2026

Morgan Stanley Research

Adam Jonas isn’t Nostradamus, but the Morgan Stanley analyst is predicting the road to complete vehicle autonomy will begin in 2026. What’s more, he says the technology will eventually reach 100 percent market penetration two decades later.

Wall Street forecasts aren’t generally superior to dart-throwing monkeys, but they do create great charts. At end-2018 we can reality-test the Adam Jonas report: there should be at least a couple of fully autonomous offerings on the market.

The majority of media reports I see on robocars are focused on the idea that Danny-the-driver sells his Prius, buys his first robocar — but continues to drive to work every day. The “big change” is that now Danny can TXT while not-driving instead of his usual TXTing-while-driving, otherwise known as DWD, Driving While Distracted. That will save a lot of lives, including those of cyclists like myself.

But that isn’t the significant revolution. The real revolution will be most obvious in high density cities like New York or San Francisco. High robocar penetration will happen there first because of the appeal of “whistle-cars”. We are already seeing this in the explosive growth of Uber, Lyft and similar services. Urbanites are proving they prefer to summon a just-in-time ride on their iPhone. Robocars will make this service even more convenient and a LOT cheaper. That’s pretty much the end of the self-owned urban car market.

Even earlier than the whistle-car we will see delivery-bot vehicles. This is where Amazon is going with their drone development program. The drones will be the “last block” of the delivery web. The delivery-bots will handle the larger, heavier delivery loads and supply the drones and the human powered deliveries.

American cities typically have 40 to 50% of their usable area eaten by automobiles, in the form of streets and parking. Most of that space can be released to productive use once robocars and delivery-bots reach full penetration. Perhaps cities will even allow building again – so the acute shortage of affordable housing can be eliminated. Or maybe not – the same old status-quo people will probably still control the city governments. And they already have their multi-million dollar positions – no “housing crisis” for them.

For a deep dive into the implications of robocars, I recommend Brad Templeton, now a consultant to Google’s robocar program. See Brad’s main robocar page: Where Robot Cars (Robocars) Can Really Take Us. There’s lots more here on Seekerblog

Why did nuclear plant construction costs quadruple from 1972 to 1988?

The short answer is Greenpeace and their cronies such as Friends of the Earth (FOE):

A major source of cost escalation in some plants was delays caused by opposition from well-organized “intervenor” groups that took advantage of hearings and legal strategies to delay construction. The Shoreham plant on Long Island was delayed for 3 years by intervenors who turned the hearings for a construction permit into a circus. The intervenors included a total imposter claiming to be an expert with a Ph.D. and an M.D. There were endless days of reading aloud from newspaper and magazine articles, interminable “cross examination” with no relevance to the issuance of a construction permit, and an imaginative variety of other devices to delay the proceedings and attract media attention.

That quote is from Chapter 9 COSTS OF NUCLEAR POWER PLANTS — WHAT WENT WRONG? of the online version of the book The Nuclear Energy Option by physicist Bernard L. Cohen, University of Pittsburgh. The book was published by Plenum Press, 1990, so it is slightly dated with respect to recent developments in modular mass-manufactured reactors (SMR), etc. Other than that it is a terrific resource — a concise handbook that covers all the high priority questions about nuclear power [risk/safety, radiation, costs, nuclear "waste", proliferation].

Prof. Cohen was there, on the scene so to speak, during the 1970s and 1980s when Regulatory Turbulence, Regulatory Ratcheting and Intervenors quadrupled the cost of a nuclear power plant. Here’s an excerpt from Chapter 9 covering Regulatory Ratcheting and Regulatory Turbulence:

The Nuclear Regulatory Commission (NRC) and its predecessor, the Atomic Energy Commission Office of Regulation, as parts of the United States Government, must be responsive to public concern. Starting in the early 1970s, the public grew concerned about the safety of nuclear power plants: the NRC therefore responded in the only way it could, by tightening regulations and requirements for safety equipment.

Make no mistake about it, you can always improve safety by spending more money. Even with our personal automobiles, there is no end to what we can spend for safety — larger and heavier cars, blowout-proof tires, air bags, passive safety restraints, rear window wipers and defrosters, fog lights, more shock-absorbent bumpers, antilock brakes, and so on. In our homes we can spend large sums on fireproofing, sprinkler systems, and smoke alarms, to cite only the fire protection aspect of household safety. Nuclear power plants are much more complex than homes or automobiles, leaving innumerable options for spending money to improve safety. In response to escalating public concern, the NRC began implementing some of these options in the early 1970s, and quickened the pace after the Three Mile Island accident.

This process came to be known as “ratcheting.” Like a ratchet wrench which is moved back and forth but always tightens and never loosens a bolt, the regulatory requirements were constantly tightened, requiring additional equipment and construction labor and materials. According to one study,4 between the early and late 1970s, regulatory requirements increased the quantity of steel needed in a power plant of equivalent electrical output by 41%, the amount of concrete by 27%, the lineal footage of piping by 50%, and the length of electrical cable by 36%. The NRC did not withdraw requirements made in the early days on the basis of minimal experience when later experience demonstrated that they were unnecessarily stringent. Regulations were only tightened, never loosened. The ratcheting policy was consistently followed.

In its regulatory ratcheting activities, the NRC paid some attention to cost effectiveness, attempting to balance safety benefits against cost increases. However, NRC personnel privately concede that their cost estimates were very crude, and more often than not unrealistically low. Estimating costs of tasks never before undertaken is, at best, a difficult and inexact art.


Clearly, the regulatory ratcheting was driven not by new scientific or technological information, but by public concern and the political pressure it generated. Changing regulations as new information becomes available is a normal process, but it would normally work both ways. The ratcheting effect, only making changes in one direction, was an abnormal aspect of regulatory practice unjustified from a scientific point of view. It was a strictly political phenomenon that quadrupled the cost of nuclear power plants, and thereby caused no new plants to be ordered and dozens of partially constructed plants to be abandoned.

Regulatory Turbulence

We now return to the question of wildly escalating labor costs for construction of nuclear plants. They were not all directly the result of regulatory ratcheting, as may be seen from the fact that they did not occur in the “best experience” projects. Regulatory ratcheting applied to new plants about to be designed is one thing, but this ratcheting applied to plants under construction caused much more serious problems. As new regulations were issued, designs had to be modified to incorporate them. We refer to effects of these regulatory changes made during the course of construction as “regulatory turbulence,” and the reason for that name will soon become evident.

As anyone who has tried to make major alterations in the design of his house while it was under construction can testify, making these changes is a very time-consuming and expensive practice, much more expensive than if they had been incorporated in the original design. In nuclear power plant construction, there were situations where the walls of a building were already in place when new regulations appeared requiring substantial amounts of new equipment to be included inside them. In some cases this proved to be nearly impossible, and in most cases it required a great deal of extra expense for engineering and repositioning of equipment, piping, and cables that had already been installed. In some cases it even required chipping out concrete that had already been poured, which is an extremely expensive proposition.

Constructors, in attempting to avoid such situations, often included features that were not required in an effort to anticipate rule changes that never materialized. This also added to the cost. There has always been a time-honored tradition in the construction industry of on-the-spot innovation to solve unanticipated problems; the object is to get things done. The supercharged regulatory environment squelched this completely, seriously hurting the morale of construction crews. For example, in the course of many design changes, miscalculations might cause two pipes to interfere with one another, or a pipe might interfere with a valve. Normally a construction supervisor would move the pipe or valve a few inches, but that became a serious rule violation. He now had to check with the engineering group at the home office, and they must feed the change into their computer programs for analyzing vibrations and resistance to earthquakes. It might take many hours for approval, and in the meanwhile, pipefitters and welders had to stand around with nothing to do.

Requiring elaborate inspections and quality control checks on every operation frequently held up progress. If an inspector needed extra time on one job, he was delayed in getting to another. Again, craft labor was forced to stand around waiting. In such situations, it sometimes pays to hire extra inspectors, who then have nothing to do most of the time. I cannot judge whether all of these new safety procedures were justifiable as safety improvements, but there was a widespread feeling among those involved in implementing them that they were not. Cynicism became rampant and morale sagged.
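Cohen's ratcheting percentages can be combined with the per-MW material quantities and 2008 commodity prices quoted earlier in this post. Pairing the two sources is my own back-of-envelope, but it shows how little of the quadrupling raw materials alone can explain:

```python
# Applying Cohen's reported 1970s regulatory increases (steel +41%,
# concrete +27%) to the per-MW quantities and March 2008 prices
# quoted in the Peterson answer above. Purely illustrative pairing.
steel_t, concrete_m3 = 32.7, 75.0            # per MW (Bryan & Dudley)
steel_price, concrete_price = 601.0, 98.0    # $/t, $/m3

before = steel_t * steel_price + concrete_m3 * concrete_price
after = steel_t * 1.41 * steel_price + concrete_m3 * 1.27 * concrete_price

print(f"commodity cost per kW: ${before/1000:.1f} -> ${after/1000:.1f}")
# a few extra dollars per kW -- the quadrupling came from labor,
# delay, and financing, not from the materials themselves
```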

Prof. Cohen goes on to document the history of how Greenpeace and friends managed to destroy the Shoreham, Long Island plant — which was eventually sold to NY state for $1.


But the worst delay came after the Shoreham plant was completed. The NRC requires emergency planning exercises for evacuation of the nearby population in the event of certain types of accidents. The utility provides a system of warning horns and generally plans the logistics, but it is necessary to obtain cooperation from the local police and other civil authorities. Officials in Suffolk County, where Shoreham is located, refused to cooperate in these exercises, making it impossible to fulfill the NRC requirement. After years of delay, the NRC changed its position and ruled that in the event of an actual accident, the police and civil authorities would surely cooperate. It therefore finally issued an operating license. By this time the situation had become a political football, with the governor of New York deeply involved. He apparently decided that it was politically expedient to give in to the opponents of the plant. The state of New York therefore offered to “buy” the plant from the utility for $1 and dismantle it, with the utility receiving enough money from various tax savings to compensate for its construction expenditures. This means that the bill would effectively be footed by U.S. taxpayers. As of this writing, there are moves in Congress to prevent this. The ironic part of the story is that Long Island very badly needs the electricity the Shoreham plant can produce.

“Safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world”


“I have studied a lot about what I think is sort of the best use of my time and money and what I think will help the world the most. And I really do believe that safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world.” — SAM ALTMAN

If you listen to the Econtalk interview I think you will agree that Sam has done his homework. Not surprisingly I think his conclusions are indicators of an open, inquiring mind:

“There are two nuclear energy companies in this batch. I believe that the 20th century was clearly the carbon century. And I believe the 22nd century is going to be the atomic power century. I’m very convinced of that. It’s just a question of how long it takes us.”

Y Combinator is well-positioned to harvest the rewards of innovations that require a long development cycle and heaps of capital. Unlike the typical 10 year venture fund, YC makes a large number of small ($120k) bets, 700+ such bets since Paul Graham launched YC in 2005. New nuclear generation is obviously a very long-term bet.

Question: will the NRC license a new design that isn’t just a variation of existing PWR designs? How is it possible to innovate in this regulatory environment?

I think it will take way too long and too much capital to launch a new design based on NRC licensing. So Sam Altman’s new ventures will almost certainly have to move to a friendly-regulator nation for the initial licensing. Note: Sam is more optimistic  than I am about the NRC. That said, if I were talking my book publicly I would be carefully deferential to the NRC.

Update: I found one of the two YC S14 batch nuclear companies. It is Helion Energy, which is building an SMR concept. But it is FUSION, not fission:

Helion is making a fusion engine 1,000 times smaller, over 500 times cheaper, and realizable 10 times faster than other projects.

Obi Wan: could this be the one that works? There’s a bit more at TechCrunch. Enjoy the Sam Altman interview – it’s not your everyday porridge.

Why is Econtalk interviewing all these Silicon Valley entrepreneurs and VCs? Since Russ Roberts is now full time at Stanford’s Hoover Institution, he has been spending more of his time at the vortex of the Silicon Valley innovation cluster. One of the benefits is that he is becoming progressively more involved with and excited about the innovation culture. So his Econtalk guests include a growing number of Silicon Valley insiders. In July Russ interviewed Sam Altman, CEO of the iconic accelerator Y Combinator (YC). Sam confessed in the interview that he doesn’t filter himself very well – meaning it was a refreshingly frank discussion.

The Efficient Revolution, the Malthusian Trap and the Circle of Trust

Dear Reader, I hope you’ve experienced joy today. I have experienced the special joy of discovering a fine mind and writer. For me it’s the joy of wanting to know more about their ideas and their progress on an important project.

My discovery was Finnish author Lauri Muranen, proprietor of The Efficient Revolution. I became aware of Lauri recently when I noticed smart tweets originating from one @LauriMuranen. That meant we were following some of the same people — an indication that Lauri might be someone I would profit from following, a source of fresh ideas and perspectives.

So, what happened today was that Lauri replied to Richard Tol’s citation — a citation of what I took to be another tiresome “peak planet” piece. I got only as far as the Guardian headline; that was enough for me.

Limits to Growth was right. New research shows we’re nearing collapse: Four decades after the book was published, Limits to Growth’s forecasts have been vindicated by new Australian research. Expect the early stages of global collapse to start appearing soon.

Minutes later Lauri tweeted to Richard “I suppose it didn’t occur to the authors that most of the indicators they present paint a rather positive picture”. Hmm… I conclude that I need to go back and actually read that article. Lauri was precisely correct — read the linked piece, you’ll see why I was motivated to go one more step — to see where else Lauri is writing. My good fortune today was that Lauri’s topmost post was The World Overshoot Day, headed by the above Earth Overshoot Days graphic. That “closes the sale”, for now I know I need to read further. Looking for Lauri’s RSS feed, I don’t find it on the homepage, so I open the homepage source to search for the usual RSS/feed keywords. Ah, there it is — so Lauri is added to my Energy Policy feeds, then I go back to reading About The Efficient Revolution.
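That manual view-source hunt can be automated: RSS and Atom feeds are conventionally advertised with a link rel="alternate" tag in the page head. A minimal standard-library sketch — the sample HTML is made up, not Lauri's actual homepage:

```python
# Find feed URLs advertised in a page's <head> via the conventional
# <link rel="alternate" type="application/rss+xml" href="..."> tag.
from html.parser import HTMLParser

class FeedFinder(HTMLParser):
    FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and a.get("rel") == "alternate"
                and a.get("type") in self.FEED_TYPES and "href" in a):
            self.feeds.append(a["href"])

# Made-up sample page; in practice you would feed it the fetched HTML.
sample = """<html><head>
<link rel="alternate" type="application/rss+xml" href="/feed/">
</head><body>...</body></html>"""

finder = FeedFinder()
finder.feed(sample)
print(finder.feeds)  # -> ['/feed/']
```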

May I explain why I’ve bothered to write about today’s discovery? That’s because this is a good example of the discovery process by which I grow my “circle of trust”. That’s a terrible name, but it’s what I have for many years called the group of thinkers that get a share of my attention. My attention is just about the most valuable thing I have, so I try to squeeze the most value I can from the scarce minutes of my attention. 

So what happened here is very simple. I follow on Twitter a very clever Dutch energy economist, Richard Tol or @RichardTol. I do that because I learn new things from Richard, while he is very careful not to waste my attention by, say, tweeting 50 items per day. By sharing those citations with me, Richard curates a part of his world. For me the signal-to-noise ratio of Richard’s transmission is very high, so we have a deal.

All the members of my “circle of trust” are like Richard, in that they are much more clever than I, so devoting some of my attention to their signal rewards me highly for the fragments of time I’m able to spend with them. As you’ve probably surmised, everyone in my “circle of trust” got there via referral by earlier members. After reading some of a new candidate’s work I may decide to give them a probationary membership. They keep the membership so long as they hold up their end: very high quality and a high signal-to-noise ratio.

I hope you are still with me, because my objective today is to persuade you to follow @LauriMuranen, to read The Efficient Revolution, and hopefully to contribute to Lauri’s project. I think I can “close the sale” if you’ll hang in there to read just a few paragraphs from his About the Efficient Revolution

The efficient revolution is my attempt to write a ‘crowd sourced’ book about the story of how humanity has been able to cut its chains of virtual slavery to the finite boundaries of earth. The success has been achieved by circumventing those boundaries with efficiency – by getting more from less. Moreover, there is plenty of evidence suggesting that this will be the way we can escape our current predicament.

Let me explain why this particular story and why crowd sourced.

One often hears that we are on the brink of peak this or verge of that. Be it oil, phosphorus, fresh water, employment or common sense. In effect, we are told that we are overshooting the environment’s capacity to replenish resources on par with our consumption.

While overconsumption does present major challenges, I would argue that this line of thinking constitutes a Malthusian thought trap.

Reverend Thomas Robert Malthus was an English economist who predicted, in his 1798 classic An Essay on the Principle of Population, that England would soon face a severe food crisis due to a quickly rising population. The idea was that while human population grows exponentially (1, 2, 4, 8…), food production grows only linearly (1, 2, 3, 4…). The inevitable consequence of such development is that at some point food consumption will exceed food production and hunger will result.

This dilemma is known as a Malthusian trap.

What I call a Malthusian thought trap is the failure to appreciate the dynamics of developed human societies to innovate their way out of such traps, as happened in England in the 1800s and as is happening in the world today. A Malthusian prediction, such as the famous Club of Rome prediction on the depletion of world’s resources, assumes that societies stand idly by as the proverbial house around them is on fire.

This is not the case of course.


The central argument of this book/blog is that human societies have been far better able to escape the traps of finite resources and environmental constraints (amid growing populations) than they get credit for.
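The exponential-versus-linear race in the quoted passage takes only a few lines to demonstrate. The units are abstract; only the inevitable crossing matters:

```python
# Malthus's race as quoted above: population doubling each generation
# (1, 2, 4, 8...) while food production grows by one unit per
# generation (1, 2, 3, 4...).
population, food = 1.0, 1.0
generation = 0
while population <= food:
    generation += 1
    population *= 2  # exponential growth
    food += 1        # linear growth

print(f"population first exceeds food in generation {generation}")
```

With these starting values the crossing comes almost immediately; the "thought trap" in Lauri's argument is assuming the two growth laws stay fixed while societies stand idly by.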

Sold? Excellent — you can see how I got hooked :-) Not sold? Well, did you see how Lauri introduces the Comment area of each post?


This makes it completely clear why he terms this project a ‘crowd sourced’ book. I see a resonance there with my SeekerBlog tagline “Many of the things we think are true are not. Together we can fix that.”