CERA: Construction costs for new nuclear plants up over 230% since 2000

UPDATE: I have republished this 2008 post and comments to bring it “up front” with our ongoing discussion of new nuclear construction costs. At the end I’ve incorporated the 2008 comments. 

UPDATE: Per Peterson, professor and former chair of the Department of Nuclear Engineering at the University of California, Berkeley, was kind enough to comment on yesterday’s post on the CBO study. Dr. Peterson noted that only about 1% of new nuclear plant construction cost is construction materials, so the theme attributing the rapid cost rises to commodity prices has no basis. Wind turbine construction and installation, by contrast, require at least 10x the materials input per kilowatt, and so are far more sensitive to the price and availability of steel, concrete, copper, etc. I cannot accurately summarize his comments in fewer words than he used, so I recommend you read them carefully.

Dan Yergin’s Cambridge Energy Research Associates (CERA) maintains the Power Capital Costs Index (PCCI), depicted in the chart as of May 2008. In brief, the PCCI shows that a power plant that cost $1 billion in 2000 would, on average, cost $2.31 billion in May 2008 [in constant 2000 dollars].

You can infer that the cost of new nuclear plant construction has risen by more than that overall 230%. As you can see in the PCCI chart, non-nuclear costs are up 180%. The PCCI is assembled from data on a basket of 30 power generation facilities in North America. I don’t know what percentage of the capital base is nuclear, so I’ll speculate that it’s similar to the 22% that nuclear currently contributes to US generation. That implies nuclear construction costs are up about 400% since 2000.
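Here is the back-of-the-envelope arithmetic behind that inference, as a quick sketch; the 22% nuclear weighting is my guess, not a CERA figure:

```python
# Back out the implied nuclear cost index from the overall PCCI,
# assuming the basket is 22% nuclear (my guess) and 78% non-nuclear.
pcci_all = 231         # overall index, 2000 = 100 (a $1B plant now costs $2.31B)
idx_non_nuclear = 180  # non-nuclear index read from the PCCI chart
w_nuclear = 0.22       # assumed nuclear share of the basket

idx_nuclear = (pcci_all - (1 - w_nuclear) * idx_non_nuclear) / w_nuclear
print(round(idx_nuclear))  # ~412, i.e. roughly 4x the 2000 cost
```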

I may be able to get more background from the CERA Capital Cost Analysis Forum – Power. But I discovered that viewing the replay of the June 6 webconference call required IE6, so I’ll need to fire up a Windows PC to access it.

On factors driving the PCCI increases since 2000, CERA writes:

…Demand for new power generation facilities remains high worldwide, leading to continued tightness in equipment markets. Cost increases, supply issues and longer delivery times are exacerbated as manufacturers struggle to keep up with demand. The weakening U.S. dollar also increases the costs of global procurement for equipment and materials.

The number of engineers in the workforce is also declining as older workers retire and are not being replaced. The resulting shortages in plant design teams add additional delays to construction schedules. The current increase in construction for nuclear power generation and the dearth of experienced nuclear engineers in North America has been a key driver behind cost escalation.

Recent cancellations of proposed coal plants in the United States due to uncertainty over environmental regulations have provided some slowing in cost increases in the U.S. coal industry. However, international competition for coal boilers, particularly in Southeast Asia, is keeping the equipment order books very active.

Concerns over a looming U.S. recession and subsequent cutbacks in residential construction have offered little relaxation to power construction. The residential slump does not free up the skilled workers required in the power industry and there is no overlap of the specialist metals and equipment required.


Upstream Capital Cost Index (UCCI) Courtesy IHS

I wonder if we are looking at market reactions to an impulse in demand. In the short run [say 5 years] the supply of new nuclear plants is inelastic. Demand has increased considerably beyond expectations, so equilibrium is only achieved by higher prices. We are seeing similar supply/demand responses in several energy sectors. The headlines hammer on oil prices. Note that the UCCI is only 10% less than the PCCI.

The UCCI is based upon a portfolio of 28 upstream oil and gas projects, so it represents the overnight cost of investment in both oil & gas field development and transportation. It may include finding costs, but I’m not sure. I do know that the cost per barrel-equivalent of finding + development costs has been increasing about as fast as oil companies have been able to ramp up their investments. The net result so far is no increase in reserve-additions, which are still lagging depletion.

2 thoughts on “CERA: Construction costs for new nuclear plants up over 230% since 2000”

  1. Paul on December 4, 2008 at 1:44 pm said:

“only about 1% of new nuclear plant construction cost is construction materials” – sorry, I don’t think so. More like 30% at least.

  1. Steve Darden on December 4, 2008 at 7:02 pm said:

_More like 30% at least._

Paul, thanks heaps for your comments. Here’s the relevant part of Dr. Peterson’s comment on commodity inputs [he gives the references as well]:

_While it is widely understood that nuclear energy costs have quite low sensitivity to the cost of uranium, it is not widely appreciated that the same applies to construction materials. If one takes the total quantity of steel, concrete, copper, and other materials required to build a light water reactor similar to those operating today [1], and then multiplies these quantities by the respective current commodity prices, the total contribution of commodity inputs is $36 per kilowatt of generation capacity [2], out of total construction prices that are estimated today to range from $3000 to $5000 per kilowatt. The dominant cost of nuclear construction is instead in the added value that comes from converting these commodities into an operational nuclear power plant. Conversely, wind turbines require approximately a factor of 10 times as much steel and concrete to construct, without considering storage capacity [3], and thus have construction costs that are sensitive to commodity costs and to potential future resource scarcity._

So he gave a range of 36/5000 to 36/3000, or roughly 0.7% to 1.2%.

Can you educate us on the construction cost buildup – also on why quotes have gone up so much since 2000?

CBO Study: Nuclear Power’s Role in Generating Electricity

UPDATE: I have republished this 2008 post and comments to bring it “up front” with our ongoing discussion of new nuclear construction costs. At the end I’ve incorporated the 2008 comments.

I’ve been re-reading the CBO study from May 2008. This is probably the most current objective analysis of base-load electrical generation options. Given the CBO’s levelized costing assumptions, it appears that electric utilities will choose natural gas over 3rd generation nuclear unless they anticipate more than a $45/ton CO2 carbon tax or equivalent:

Adding a carbon dioxide charge of about $45 per metric ton to the levelized cost estimates in the reference scenario would make nuclear power the least expensive source of additional base-load capacity (see the left panel of Figure 3-2). Up to that threshold, at all but the lowest level of charges, conventional natural gas technology would probably be the least costly option. Because coal is more carbon-intense than natural gas, the cost advantage of new capacity based on natural gas technology would grow in relation to coal technology as carbon dioxide charges increased; but the advantage that natural gas technology enjoyed over nuclear technology would shrink and eventually disappear as emission charges reached about $45 per metric ton. Thereafter, the levelized cost advantage of nuclear technology over conventional gas technology would grow. Although carbon dioxide charges would not change the cost of nuclear power plants at all, they would increase the cost of innovative fossil-fuel alternatives; as a result, the cost advantage that nuclear technology held over those technologies would increase with carbon dioxide charges but at a slower rate than that observed with conventional fossil-fuel technologies.
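To make the crossover arithmetic concrete, here is a rough sketch with round numbers of my own choosing; these are not the CBO’s reference-scenario values, and the gas emission intensity is only an approximate figure for combined-cycle plants:

```python
# Illustrative only: how a CO2 charge shifts the levelized cost comparison.
lcoe_gas = 55      # $/MWh before any CO2 charge (my assumption)
lcoe_nuclear = 72  # $/MWh, essentially unaffected by a CO2 charge (my assumption)
gas_co2 = 0.37     # metric tons CO2 per MWh, roughly combined-cycle gas

for charge in (0, 15, 30, 45, 60):  # $ per metric ton CO2
    gas_total = lcoe_gas + gas_co2 * charge
    winner = "gas" if gas_total < lcoe_nuclear else "nuclear"
    print(f"${charge}/t CO2: gas ~${gas_total:.0f}/MWh -> {winner} is cheaper")
# With these assumed numbers the crossover lands near $45-50 per metric ton.
```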

We know that construction costs for all types of generation have been rising rapidly along with the costs of steel, concrete, and other inputs. Nuclear is the most sensitive to construction costs, simply because nuclear fuel costs are negligible (conversely, nuclear is insensitive to future fuel cost increases, while natural gas is extremely sensitive). The CBO chart shows the relative sensitivities to lower or higher construction costs, again in levelized 2006 dollars per megawatt hour.

The CBO study of course has to stick with already-built or on-order nuclear technology. But this may lead to drawing the wrong conclusions. Remember how much autos cost when each one was custom built? And the lousy quality?

That’s our experience of nuclear construction — custom design, custom built, custom approvals. But, given certainty of future CO2 charges, I believe that a competitive market will transform nuclear generation into a mass produced, modular product — and that costs will come down dramatically compared to alternatives.

We don’t know what future innovations will emerge, but as of today, the modular pebble-bed reactor [PBMR] technology looks very promising. Key advantages are safety by design (even chimps as operators can’t cause a nuclear accident), no proliferation worries, and perhaps most important – the design is MODULAR. That means industrial-scale mass production is possible, with all the attendant benefits. One of the most important benefits is the slashing of the financial risk of regulatory delays before a new plant is allowed to start up.

For more background on the Modular Pebble-bed design, see MIT’s study “The Future of Nuclear Power” [1], MIT prof. Andrew C. Kadak’s presentation “What Will it Take to Revive Nuclear Energy?” [PDF] [2], and his Pebble-bed presentation [PDF] [2a]. China is placing big bets here, see Wired’s “Let a Thousand Reactors Bloom” [3].

10 thoughts on “CBO Study: Nuclear Power’s Role in Generating Electricity”

  1. Rod Adams on August 26, 2008 at 8:06 pm said:

Steve:

It is always important to check the assumptions. The CBO study that you pointed to, though completed in 2008, apparently used a fuel price table that stopped with 2005 fuel prices. It thus assumed a base case of natural gas costing about $5.00 per million BTU.

Since the cost of fuel is about 93% of the levelized cost of electricity from a natural gas fired power plant, underestimating the cost of gas would tend to sway the computed decision in the wrong direction compared to less fuel intensive alternatives like nuclear power.

Nuclear can compete without a carbon tax against gas at current market prices – which are about $8.50 per million BTU and have been as high as $13 in the recent past and may get there again with a cold winter.

Luckily for gas buyers, it has been a fairly mild summer.

  1. Steve Darden on August 26, 2008 at 9:10 pm said:

Rod – thanks for the data update. Does the increase in construction costs since the timestamp on the report data offset the underestimated natural gas prices? I.e., gas operating costs up, nuclear plant construction costs up.

I added PBMR to this post – since folks search for this acronym.

  1. Per Peterson on August 27, 2008 at 10:48 am said:

While it is widely understood that nuclear energy costs have quite low sensitivity to the cost of uranium, it is not widely appreciated that the same applies to construction materials. If one takes the total quantity of steel, concrete, copper, and other materials required to build a light water reactor similar to those operating today [1], and then multiplies these quantities by the respective current commodity prices, the total contribution of commodity inputs is $36 per kilowatt of generation capacity [2], out of total construction prices that are estimated today to range from $3000 to $5000 per kilowatt. The dominant cost of nuclear construction is instead in the added value that comes from converting these commodities into an operational nuclear power plant. Conversely, wind turbines require approximately a factor of 10 times as much steel and concrete to construct, without considering storage capacity [3], and thus have construction costs that are sensitive to commodity costs and to potential future resource scarcity.

Right now demand for new reactors is clearly outstripping supply. While this current supply chain inelasticity will ease in 5 to 10 years, inelasticity in supply always results in higher prices. Thus we can expect nuclear construction prices to drop over the coming decade, but the main question is by how much. While it will never get down to the $36/kW cost of the commodity inputs, there is still potential that prices could drop greatly from the current values if modular construction and factory-based computer aided manufacturing are applied more broadly in the construction.

References:

  1. From R.H. Bryan and I.T. Dudley, Oak Ridge National Laboratory, TM-4515, June 1974, current pressurized water reactors use 32.7 t of steel, 75 m3 of concrete, 0.69 t of copper, and smaller amounts of other materials per megawatt of capacity
  2. On March 25, 2008, the commodity prices of steel, concrete, and copper (which constitute 90% of the total commodities costs for a nuclear plant) were $601/t, $98/m3, and $7,634/t respectively.
  3. Wind requires 115 MT of steel and 218 m3 of concrete per megawatt, but has higher commodity input per unit of electricity generated due to a lower capacity factor (~25%) compared to nuclear (~90%), S. Pacca and A. Horvath, Environ. Sci. Technol., 36, 3194-3200 (2002).
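Dr. Peterson’s ~$36/kW figure can be reproduced from the quantities and prices in references 1 and 2; the only assumption added here is treating the three listed commodities as exactly 90% of the total, as reference 2 states:

```python
# Reproduce the ~$36/kW commodity-input figure from references 1 and 2.
steel    = 32.7 * 601    # t per MW  x  $ per t
concrete = 75.0 * 98     # m3 per MW x  $ per m3
copper   = 0.69 * 7634   # t per MW  x  $ per t

big_three = steel + concrete + copper  # ~$32,270 per MW of capacity
total = big_three / 0.90               # steel, concrete, copper are ~90% of total
print(round(total / 1000, 1))          # ~35.9, i.e. about $36 per kW
```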

    1. Rod Adams on August 27, 2008 at 2:26 pm said:

The interesting thing about the numbers that are being bandied about with regard to nuclear construction costs is that they also include rather substantial allowances for risk premiums, interest costs, and inflation uncertainties.

Those costs can represent half of the final delivered price computation.

  1. Steve Darden on August 27, 2008 at 5:10 pm said:

Dr. Peterson,

Thanks for taking the time to set us straight on the material inputs. At 1%, nuclear plant costs are highly insensitive to that component. The CBO study bypassed the sources of cost increases in its sensitivity analysis, simply assuming a -50% / +100% range.

Today I wrote a related post on the CERA index of power plant construction costs. Back of the envelope, assuming 22% of CERA’s basket of 30 plants is nuclear, I drew the inference that nuclear plant construction costs have increased around 400% since 2000, versus the PCCI average of 230% across all modes of generation.

Similar to your comments, CERA attributes the increases to the surge in demand and the “dearth of experienced nuclear engineers in North America.”

CERA is tracking similar (210%) increases in the cost of upstream oil & gas projects – the UCCI having a similar 2005 takeoff. Much more depth on energy demand over-running supply can be found in the really excellent CIEP study “Oil turbulence in the next decade – An Essay on High Oil Prices in a Supply-constrained World”, Jan-Hein Jesse and Coby van der Linde, Clingendael International Energy Programme. They conclude that the next decade or so will see high volatility in oil markets – oscillating between marginal cost and user value.

Please advise if you have any references to recommend on the potential for nuclear costs to drop in an industry transformation to “mass production”, relatively speaking, of modular reactor components. Presumably, such standardized components would be pre-certified, so that on site certification would be reduced to a process more like inspections of other industrial facilities?

  1. Steve Darden on August 27, 2008 at 10:31 pm said:

Rod,

Well, it’s interesting that the CERA index explicitly doesn’t include risk premiums, or owner’s cost. It probably includes construction period interest. If my estimates of their basket are close it indicates a 2000 to 2008 Q1 cost increase of around 400% for nuclear and about 180% for non-nuclear.

I haven’t found a source to build up that figure from first principles – so I can’t confirm the PCCI index. I sat through the one hour CERA web-conference presentation of June 6 – hoping to learn the details. They do have a nuclear index, but didn’t present it. It is part of the distribution package sent to members.

Cheers, Steve

  1. JimHopf on August 28, 2008 at 4:51 pm said:

I’d just like to add a bit to what Rod said earlier. Not only does the CBO study assume a natural gas price of $5 (or $6?) per MBTU, which is lower than the price even today, but they assume that it will remain at $5/6 even if we use gas for all new power plants (and possibly also replace existing coal plants with gas plants to meet CO2 reduction requirements). In other words, they assume that the price will remain fixed at a (low) value of $5/6, no matter how high the demand for gas gets!

They simply state that for CO2 prices between $6 and $45 per ton, gas will be the cheapest source, thereby implying that it will be chosen for all new generation. They ignore all feedback effects. In the real world, as more and more gas is chosen, the price of gas goes up until the price advantage disappears. In fact, the real truth is that, for baseload power, gas will not be an option, as it will be the most, not the least, expensive in the future (even w/ little or no CO2 price), since future gas costs will be way above $6. For that reason, utility executives are not even really thinking about gas as a future baseload option. There simply is not enough gas to go around to waste it on something like baseload generation. The choice will be between nuclear and coal.

The real question is what CO2 price is required to make nuclear cheaper than coal. This price is about $20 to $25 per ton of CO2.
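A rough sketch of where a figure in that range can come from, using illustrative numbers of my own rather than Jim’s: coal emits roughly one ton of CO2 per MWh, so the required charge is approximately the nuclear-minus-coal levelized cost gap divided by that intensity.

```python
# Illustrative only: the CO2 charge that closes a levelized-cost gap with coal.
coal_co2 = 0.95   # metric tons CO2 per MWh for a coal plant (approximate)
lcoe_gap = 20.0   # $/MWh by which nuclear exceeds coal (my assumption)

required_charge = lcoe_gap / coal_co2
print(round(required_charge))  # ~21 $/t CO2; a $20-25/MWh gap maps to ~$21-26/t
```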

  1. Steve Darden on August 28, 2008 at 6:51 pm said:

Jim,

Thanks – I agree with all your points.

This price is about $20 to $25 per ton of CO2.

Doesn’t that depend on the capital cost? At 2005 CAPEX I thought $25 per ton CO2 would do it. At 4 x 2005 costs?

I’m confident new plant costs will come down. I’m optimistic that in a decade, in constant dollars, costs per MW will be lower than the 2005 CERA index.

But what do utility execs believe are the levelized costs?

  1. JimHopf on August 28, 2008 at 11:20 pm said:

Steve,

Well, the capital cost of coal plants has also gone up since then, as well as the price of coal itself (which has almost doubled). That said, the price of nuclear has gone up even more, if some of the latest estimates are to be believed. Thus, it could be that it would require ~$30 or more (but only for the first set of plants).

Of course, under any cap-and-trade system with hard (declining) CO2 limits, the CO2 price will rise to whatever it has to be to make nuclear cheaper than coal (given that the renewables contribution is limited by intermittency, and that both gas and coal with CO2 sequestration will be more expensive than nuclear).

  1. Steve Darden on August 29, 2008 at 3:31 pm said:

Thanks Jim – two important concepts in your comments:

(1) but only for the first set of plants – because once deployment gets well underway the capital costs will come down. Probably operating costs as well.

(2) the CO2 price will rise to whatever it has to be to make nuclear cheaper than coal – because that is the new equilibrium, as existing coal utilities bid up permits until it becomes cheaper to build replacement nuclear than to keep paying for permits.

Regarding (2) I still prefer a revenue-neutral carbon tax over cap-and-trade. Most importantly because it gives utilities a predictable and stable future cost environment. Secondly, because it prevents government from getting its hands on a new revenue stream, while avoiding a rich growth medium for corruption and complexity.

What’s your view on that choice?

PS – I just finished a post on “Greens make the case for nuclear power”.


Why did nuclear plant construction costs quadruple from 1972 to 1988?

The short answer is Greenpeace and their cronies such as Friends of the Earth (FOE):

A major source of cost escalation in some plants was delays caused by opposition from well-organized “intervenor” groups that took advantage of hearings and legal strategies to delay construction. The Shoreham plant on Long Island was delayed for 3 years by intervenors who turned the hearings for a construction permit into a circus. The intervenors included a total imposter claiming to be an expert with a Ph.D. and an M.D. There were endless days of reading aloud from newspaper and magazine articles, interminable “cross examination” with no relevance to the issuance of a construction permit, and an imaginative variety of other devices to delay the proceedings and attract media attention.

That quote is from Chapter 9 COSTS OF NUCLEAR POWER PLANTS — WHAT WENT WRONG? of the online version of the book The Nuclear Energy Option by physicist Bernard L. Cohen, University of Pittsburgh. The book was published by Plenum Press, 1990, so it is slightly dated with respect to recent developments in modular mass-manufactured reactors (SMR), etc. Other than that it is a terrific resource — a concise handbook that covers all the high priority questions about nuclear power [risk/safety, radiation, costs, nuclear "waste", proliferation].

Prof. Cohen was there, on the scene so to speak, during the 1970s and 1980s when Regulatory Turbulence, Regulatory Ratcheting and Intervenors quadrupled the cost of a nuclear power plant. Here’s an excerpt from Chapter 9 covering Regulatory Ratcheting and Regulatory Turbulence:

The Nuclear Regulatory Commission (NRC) and its predecessor, the Atomic Energy Commission Office of Regulation, as parts of the United States Government, must be responsive to public concern. Starting in the early 1970s, the public grew concerned about the safety of nuclear power plants: the NRC therefore responded in the only way it could, by tightening regulations and requirements for safety equipment.

Make no mistake about it, you can always improve safety by spending more money. Even with our personal automobiles, there is no end to what we can spend for safety — larger and heavier cars, blowout-proof tires, air bags, passive safety restraints, rear window wipers and defrosters, fog lights, more shock-absorbent bumpers, antilock brakes, and so on. In our homes we can spend large sums on fireproofing, sprinkler systems, and smoke alarms, to cite only the fire protection aspect of household safety. Nuclear power plants are much more complex than homes or automobiles, leaving innumerable options for spending money to improve safety. In response to escalating public concern, the NRC began implementing some of these options in the early 1970s, and quickened the pace after the Three Mile Island accident.

This process came to be known as “ratcheting.” Like a ratchet wrench which is moved back and forth but always tightens and never loosens a bolt, the regulatory requirements were constantly tightened, requiring additional equipment and construction labor and materials. According to one study,4 between the early and late 1970s, regulatory requirements increased the quantity of steel needed in a power plant of equivalent electrical output by 41%, the amount of concrete by 27%, the lineal footage of piping by 50%, and the length of electrical cable by 36%. The NRC did not withdraw requirements made in the early days on the basis of minimal experience when later experience demonstrated that they were unnecessarily stringent. Regulations were only tightened, never loosened. The ratcheting policy was consistently followed.

In its regulatory ratcheting activities, the NRC paid some attention to cost effectiveness, attempting to balance safety benefits against cost increases. However, NRC personnel privately concede that their cost estimates were very crude, and more often than not unrealistically low. Estimating costs of tasks never before undertaken is, at best, a difficult and inexact art.

(…)

Clearly, the regulatory ratcheting was driven not by new scientific or technological information, but by public concern and the political pressure it generated. Changing regulations as new information becomes available is a normal process, but it would normally work both ways. The ratcheting effect, only making changes in one direction, was an abnormal aspect of regulatory practice unjustified from a scientific point of view. It was a strictly political phenomenon that quadrupled the cost of nuclear power plants, and thereby caused no new plants to be ordered and dozens of partially constructed plants to be abandoned.

Regulatory Turbulence

We now return to the question of wildly escalating labor costs for construction of nuclear plants. They were not all directly the result of regulatory ratcheting, as may be seen from the fact that they did not occur in the “best experience” projects. Regulatory ratcheting applied to new plants about to be designed is one thing, but this ratcheting applied to plants under construction caused much more serious problems. As new regulations were issued, designs had to be modified to incorporate them. We refer to effects of these regulatory changes made during the course of construction as “regulatory turbulence,” and the reason for that name will soon become evident.

As anyone who has tried to make major alterations in the design of his house while it was under construction can testify, making these changes is a very time-consuming and expensive practice, much more expensive than if they had been incorporated in the original design. In nuclear power plant construction, there were situations where the walls of a building were already in place when new regulations appeared requiring substantial amounts of new equipment to be included inside them. In some cases this proved to be nearly impossible, and in most cases it required a great deal of extra expense for engineering and repositioning of equipment, piping, and cables that had already been installed. In some cases it even required chipping out concrete that had already been poured, which is an extremely expensive proposition.

Constructors, in attempting to avoid such situations, often included features that were not required in an effort to anticipate rule changes that never materialized. This also added to the cost. There has always been a time-honored tradition in the construction industry of on-the-spot innovation to solve unanticipated problems; the object is to get things done. The supercharged regulatory environment squelched this completely, seriously hurting the morale of construction crews. For example, in the course of many design changes, miscalculations might cause two pipes to interfere with one another, or a pipe might interfere with a valve. Normally a construction supervisor would move the pipe or valve a few inches, but that became a serious rule violation. He now had to check with the engineering group at the home office, and they must feed the change into their computer programs for analyzing vibrations and resistance to earthquakes. It might take many hours for approval, and in the meanwhile, pipefitters and welders had to stand around with nothing to do.

Requiring elaborate inspections and quality control checks on every operation frequently held up progress. If an inspector needed extra time on one job, he was delayed in getting to another. Again, craft labor was forced to stand around waiting. In such situations, it sometimes pays to hire extra inspectors, who then have nothing to do most of the time. I cannot judge whether all of these new safety procedures were justifiable as safety improvements, but there was a widespread feeling among those involved in implementing them that they were not. Cynicism became rampant and morale sagged.

Prof. Cohen goes on to document the history of how Greenpeace and friends managed to destroy the Shoreham, Long Island plant — which was eventually sold to NY state for $1.


But the worst delay came after the Shoreham plant was completed. The NRC requires emergency planning exercises for evacuation of the nearby population in the event of certain types of accidents. The utility provides a system of warning horns and generally plans the logistics, but it is necessary to obtain cooperation from the local police and other civil authorities. Officials in Suffolk County, where Shoreham is located, refused to cooperate in these exercises, making it impossible to fulfill the NRC requirement. After years of delay, the NRC changed its position and ruled that in the event of an actual accident, the police and civil authorities would surely cooperate. It therefore finally issued an operating license. By this time the situation had become a political football, with the governor of New York deeply involved. He apparently decided that it was politically expedient to give in to the opponents of the plant. The state of New York therefore offered to “buy” the plant from the utility for $1 and dismantle it, with the utility receiving enough money from various tax savings to compensate for its construction expenditures. This means that the bill would effectively be footed by U.S. taxpayers. As of this writing, there are moves in Congress to prevent this. The ironic part of the story is that Long Island very badly needs the electricity the Shoreham plant can produce.

“Safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world”

[Photo: Sam Altman]

“I have studied a lot about what I think is sort of the best use of my time and money and what I think will help the world the most. And I really do believe that safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world.” — SAM ALTMAN

If you listen to the Econtalk interview I think you will agree that Sam has done his homework. Not surprisingly I think his conclusions are indicators of an open, inquiring mind:

“There are two nuclear energy companies in this batch. I believe that–the 20th century was clearly the carbon century. And I believe the 22nd century is going to be the atomic power century. I’m very convinced of that. It’s just a question of how long it takes us.

Y Combinator is well-positioned to harvest the rewards of innovations that require a long development cycle and heaps of capital. Unlike the typical 10 year venture fund, YC makes a large number of small ($120k) bets, 700+ such bets since Paul Graham launched YC in 2005. New nuclear generation is obviously a very long-term bet.

Question: will the NRC license a new design that isn’t just a variation of existing PWR designs? How is it possible to innovate in this regulatory environment?

I think it will take way too long and too much capital to launch a new design based on NRC licensing. So Sam Altman’s new ventures will almost certainly have to move to a friendly-regulator nation for the initial licensing. Note: Sam is more optimistic  than I am about the NRC. That said, if I were talking my book publicly I would be carefully deferential to the NRC.

Update: I found one of the two YC S14 batch nuclear companies. It is Helion Energy, which is building an SMR-scale concept. But it is FUSION, not fission:

Helion is making a fusion engine 1,000 times smaller, over 500 times cheaper, and realizable 10 times faster than other projects.

Obi Wan: could this be the one that works? There’s a bit more at TechCrunch. Enjoy the Sam Altman interview – it’s not your everyday porridge.

Why is Econtalk interviewing all these Silicon Valley entrepreneurs and VCs? Since Russ Roberts is now full time at Stanford’s Hoover Institution, he has been spending more of his time at the vortex of the Silicon Valley innovation cluster. One of the benefits is that he is becoming progressively more involved with and excited about the innovation culture. So his Econtalk guests include a growing number of Silicon Valley insiders. In July Russ interviewed Sam Altman, CEO of the iconic accelerator Y Combinator (YC). Sam confessed in the interview that he doesn’t filter himself very well – meaning it was a refreshingly frank discussion.

How can the developing world escape poverty without climate change calamity?

This article is the result of some very interesting discussions below a recent TEC article on the potential of coal, nuclear and wind/solar to supply the rapidly growing energy needs of the developing world. In that article, I estimated that nuclear is roughly an order of magnitude less scalable than coal, but more than double as scalable as wind/solar. These estimations were challenged by both nuclear and wind advocates and, as such critical discussions often do, have prompted much closer investigations into this issue. In particular, data pertaining to the near-term prospects of nuclear energy in China, the nation accounting for fully 43% of nuclear plants currently under construction, has been analysed in more detail. — SCHALK CLOETE

Schalk Cloete’s superpower is the ability to execute and explain exactly the analysis required to penetrate a difficult, controversial topic. And there are a few others – you know who you are. 

Schalk’s recent article Can Nuclear Make a Substantial Near-Term Contribution? supports answers to my “most important questions”: How can we help the large fast-growers to make the transition from fossil to clean energy? For discussion, let’s focus on three key nations:

  1. China
  2. India
  3. Africa

The reason I posed this in terms of three different developing countries is because the support & partnership that the rich countries can offer is different in each case. 

  1. China is already putting more resources than any other nation into building up its nuclear deployment capability. Even so, China can benefit hugely from without-limit contributions of capital, science, and engineering know-how. I left regulatory know-how off that list, though there may be possible contributions there. As it stands today, the US NRC is probably mostly a hindrance to the deployment of advanced nuclear – not because of the NRC staff, but because of the budgetary straitjacket imposed by the US Congress (make the ‘customers’ pay for everything up front).
  2. India is improving its nuclear deployment capability at a slow, deliberate pace, but India too could benefit from external technology contributions. Remember that India was cut off for decades from western nuclear tech as punishment for its indigenous nuclear weapons development.
  3. Africa needs affordable energy-machines that are suited to its infrastructure and operational capabilities. If Africa does not have access to affordable and suitable nuclear, it will have no real choice but to build more and more coal and gas.

Cumulative CO2 avoidance potential over lifetime of investment (Gton CO2)

 

Our affordability challenge is that we need to offer clean, reliable electricity at the best price per ton of CO2 avoided. So what can compete economically with coal and natural gas? If you study Schalk’s chart for a few minutes I think you will conclude, as I have, that we need to pull out all the stops to accelerate deployment of mass-manufactured “nuclear batteries”. By “batteries” I mean simply no-maintenance energy-machines that can be rapidly installed by underground burial, connected to the grid, then left alone for up to four decades until the maintenance crew arrives to replace the “battery”, trundling the original off to the factory for refueling.

China is training up to build and staff Western-style plants like the AP1000 – which China will be building internally on Chinese-owned IP. That is not going to happen very soon, or at scale, in Africa, and my guess is that India will need some time to develop its skill base and supply chain. Sadly, Greenpeace has succeeded in preventing availability of the simple plants that Africa wants to purchase. Given the reality of the nuclear supply chain, it will be close to two decades before vendors are manufacturing and installing plants suitable for most low-tech nations.

Africa isn’t waiting for someone to make a clean generation option available to energize its growth. Currently seven of the ten fastest growing economies are in Africa. Sadly, the massive scale of African urbanization and growth is going to be enabled the same way it happened in Europe and North and South America – by building relatively cheap coal and gas plants as fast as they can be built. That trajectory will end very badly unless we get serious about what happens next. We can create a happy ending if, inside the next two decades, we achieve the capability to produce affordable nuclear plants that can be installed and operated without losing two additional decades developing a deeply-trained nuclear workforce and local supply chain. By 2050 Africa’s urban population is expected to triple [UN World Urbanization Prospects: The 2011 Revision].

It’s obvious that these SMR designs must be substitutable for the fossil thermal machines that got built in the first phase of dirty industrialization. It will be a lot easier and cheaper if the first-stage dirty plants are designed for such an evolution: rip the dirty heat out, stick the clean heat in.

There’s heaps more to be learned by studying Schalk’s essay, so get on over there. If you find any flaws in his work, please contribute to the dialogue there on TEC (I am subscribed to those comments).

Footnotes from Schalk’s essay: why China’s nuclear avoidance potential is actually greater than the above chart shows.

[1] It should also be mentioned that the Chinese tariff system favors wind over nuclear by paying a fixed feed-in tariff of $83–100/MWh to wind and $70/MWh to nuclear. Another important factor to consider is the reduced value of wind relative to nuclear due to the variability of wind power (see my previous articles on this subject here and here). Wind power also requires expensive high voltage transmission networks to transport power from good wind locations to population centres, something which is creating substantial challenges. Thus, if the playing field were to be leveled, the difference between nuclear and wind scaling rates should increase substantially.

LNT, UNSCEAR and the NRC “State-of-the-Art Reactor Consequence Analyses”

UNSCEAR 2012 “Therefore, the Scientific Committee does not recommend multiplying very low doses by large numbers of individuals to estimate numbers of radiation-induced health effects within a population exposed to incremental doses at levels equivalent to or lower than natural background levels;”

The main NRC SOARCA page indexes the definitive 2012 NRC severe accident study. This study is large, so I’ll rely on the NRC’s own words of summary:

SOARCA’s main findings fall into three basic areas: how a reactor accident progresses; how existing systems and emergency measures can affect an accident’s outcome; and how an accident would affect the public’s health. The project’s preliminary findings include:

  • Existing resources and procedures can stop an accident, slow it down or reduce its impact before it can affect public health;
  • Even if accidents proceed uncontrolled, they take much longer to happen and release much less radioactive material than earlier analyses suggested; and
  • The analyzed accidents would cause essentially zero immediate deaths and only a very, very small increase in the risk of long-term cancer deaths.

Rod Adams posted his thorough analysis of SOARCA here, which he summarizes thusly:

  • The individual early fatality risk from SOARCA scenarios is essentially zero.
  • Individual LCF risk from the selected specific, important scenarios is thousands of times lower than the NRC Safety Goal and millions of times lower than the general cancer fatality risk in the United States from all causes, even assuming the LNT dose-response model.

If I may underscore that last point: even assuming the LNT dose-response model. For more plain English, here’s UK environmentalist Mark Lynas in Why Fukushima death toll projections are based on junk science:

As the Health Physics Society explains[1] in non-scientific language anyone can understand:

…the concept of collective dose has come under attack for some misuses. The biggest example of this is in calculating the numbers of expected health effects from exposing large numbers of people to very small radiation doses. For example, you might predict that, based on the numbers given above, the population of the United States would have about 40,000 fatal cancers from background radiation alone. However, this is unlikely to be true for a number of reasons. Recently, the International Council on Radiation Protection issued a position statement saying that the use of collective dose for prediction of health effects at low exposure levels is not appropriate. The reason for this is that if the most highly exposed person receives a trivial dose, then everyone’s dose will be trivial and we can’t expect anyone to get cancer. [my emphasis]

The HPS illustrates this commonsensical statement with the following analogy:

Another way to look at it is that if I throw a 1-gram rock at everyone in the United States then, using the collective dose model, we could expect 270 people to be crushed to death because throwing a one-ton rock at someone will surely kill them. However, we know this is not the case because nobody will die from a 1-gram rock. The Health Physics Society also recommends not making risk estimates based on low exposure levels.

James Conca explains the UNSCEAR 2012 report, which finally drove a stake into the heart of LNT:

The United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) (UNSCEAR 2012) submitted the report that, among other things, states that uncertainties at low doses are such that UNSCEAR “does not recommend multiplying low doses by large numbers of individuals to estimate numbers of radiation-induced health effects within a population exposed to incremental doses at levels equivalent to or below natural background levels.” (UNDOC/V1255385)

You know, like everyone’s been doing since Chernobyl. Like everyone’s still doing with Fukushima.

Finally, the world may come to its senses and not waste time on the things that aren’t hurting us and spend time on the things that are. And on the people that are in real need. Like the infrastructure and economic destruction wrought by the tsunami, like cleaning up the actual hot spots around Fukushima, like caring for the tens of thousands of Japanese living in fear of radiation levels so low that the fear itself is the only thing that is hurting them, like seriously preparing to restart their nuclear fleet and listening to the IAEA and the U.S. when we suggest improvements.

The advice on radiation in this report will clarify what can, and cannot, be said about low dose radiation health effects on individuals and large populations. Background doses going from 250 mrem (2.5 mSv) to 350 mrem (3.5 mSv) will not raise cancer rates or have any discernable effects on public health. Likewise, background doses going from 250 mrem (2.5 mSv) to 100 mrem (1 mSv) will not decrease cancer rates or effect any other public health issue.

Note – although most discussions are for acute doses (all at once) the same amount as a chronic dose (metered out over a longer time period like a year) is even less effecting. So 10 rem (0.1 Sv) per year, either as acute or chronic, has no observable effect, while 10 rem per month might.

UNSCEAR also found no observable health effects from last year’s nuclear accident in Fukushima. No effects.

The Japanese people can start eating their own food again, and moving back into areas only lightly contaminated with radiation levels that are similar to background in many areas of the world like Colorado and Brazil.

Low-level contaminated soil, leaves and debris in Fukushima Prefecture piling up in temporary storage areas. (Photo by James Hackett, RJLee Group)

The huge waste of money that is passing for clean-up now by just moving around dirt and leaves (NYTimes) can be focused on clean-up of real contamination near Fukushima using modern technologies. The economic and psychological harm wrought by the wrong-headed adoption of linear no-threshold dose effects for doses less than 0.1 Sv (10 rem) has been extremely harmful to the already stressed population of Japan, and to continue it would be criminal.

To recap LNT, the Linear No-Threshold Dose hypothesis is a supposition that all radiation is deadly and there is no dose below which harmful effects will not occur. Double the dose, double the cancers. First put forward after WWII by Hermann Muller, and adopted by the world body, including UNSCEAR, its primary use was as a Cold War bargaining chip to force cessation of nuclear weapons testing. The fear of radiation that took over the worldview was a side-effect (Did Muller Lie?).

(…snip…)

In the end, if we don’t reorient ourselves on what is true about radiation and not on the fear, we will fail the citizens of Japan, Belarus and the Ukraine, and we will continue to spend time and money on the wrong things…

That’s just Jim’s summary – please read his complete essay for the charts, tables and implications for Japan. And did Muller Lie? The evidence seems pretty conclusive that all this enormous waste of resources was based on a lie. Not to mention the fear, and in the case of Fukushima at least a thousand unnecessary deaths due to the panic and mismanagement of the evacuation.

Footnotes:

[1] While link testing, I found that Mark’s HPS link fails – that’s the Internet. Here’s the most recent HPS position statement I could find this morning. Radiation Risk In Perspective: Position Statement Of The Health Physics Society (updated 2010) 

In accordance with current knowledge of radiation health risks, the Health Physics Society recommends against quantitative estimation of health risks below an individual dose1 of 50 millisievert (mSv) in one year or a lifetime dose of 100 mSv above that received from natural sources. Doses from natural background radiation in the United States average about 3 mSv per year. A dose of 50 mSv will be accumulated in the first 17 years of life and 0.25 Sv in a lifetime of 80 years. Estimation of health risk associated with radiation doses that are of similar magnitude as those received from natural sources should be strictly qualitative and encompass a range of hypothetical health outcomes, including the possibility of no adverse health effects at such low levels.

There is substantial and convincing scientific evidence for health risks following high-dose exposures. However, below 50– 100 mSv (which includes occupational and environmental exposures), risks of health effects are either too small to be observed or are nonexistent.

[2] Environmentalist Stewart Brand on the retirement of LNT.

[3] Report of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) Fifty-ninth session (21-25 May 2012) [PDF]. 

[4] EPA’s decision to allow risk-based decisions to guide responses to radiological events

The more you know about nuclear power the more you like it, Part 2

This is a sequel to The more you know about nuclear power the more you like it, Part 1, where I promised to look at the relative nuclear support amongst print and TV media, scientists and the public. A personal favorite technical source on nuclear power is prof. Bernard Cohen’s textbook The Nuclear Energy Option. While the book is out of print there is a very well-executed online version. For this post we need Chapter 4 Is The Public Ready For More Nuclear Power?

Prof. Cohen analyzed a broad range of opinion surveys that were available at the time of writing, circa 1990. Here I just want to focus on the hypothesis that “The more you know about nuclear power the more you like it.” If we collected fresh surveys today we might find the absolute levels a bit different, but I claim the relative proportions should be very similar. Here are the relevant paragraphs from Chapter 4:

While public support of nuclear power has only recently been turning favorable, the scientific community has always been steadfastly supportive. In 1980, at the peak of public rejection, Stanley Rothman and Robert Lichter, social scientists from Smith College and Columbia University, respectively, conducted a poll of a random sample of scientists listed in American Men and Women of Science, The “Who’s Who” of scientists.1 They received a total of 741 replies. They categorized 249 of these respondents as “energy experts” based on their specializing in energy-related fields rather broadly defined to include such disciplines as atmospheric chemistry, solar energy, conservation, and ecology. They also categorized 72 as nuclear scientists based on fields of specialization ranging from radiation genetics to reactor physics. Some of their results are listed in Table 1.

[Table 1]

From Table 1 we see that 89% of all scientists, 95% of scientists involved in energy-related fields, and 100% of radiation and nuclear scientists favored proceeding with the development of nuclear power. Incidentally, there were no significant differences between responses from those employed by industry, government, and universities. There was also no difference between those who had and had not received financial support from industry or the government.

Another interesting question was whether the scientists would be willing to locate nuclear plants in cities in which they live (actually, no nuclear plants are built within 20 miles of heavily populated areas). The percentage saying that they were willing was 69% for all scientists, 80% for those in energy-related sciences, and 98% for radiation and nuclear scientists. This was in direct contrast to the 56% of the general public that said it was not willing.

Rothman and Lichter also surveyed opinions of various categories of media journalists and developed ratings for their support of nuclear energy. Their results are shown in Table 2. [which I've rendered in chart form]

[Table 2 chart]

We see that scientists are much more supportive of nuclear power than journalists, and press journalists are much more supportive than the TV people who have had most of the influence on the public, even though they normally have less time to investigate in depth. There is also a tendency for science journalists to be more supportive than other journalists.

In summary, these Rothman-Lichter surveys show that scientists have been much more supportive of nuclear power than the public or the TV reporters, producers, and journalists who “educate” them. Among scientists, the closer their specialty to nuclear science, the more supportive they are. This is not much influenced by job security considerations, since the level of support is the same for those employed by universities, where tenure rules protect jobs, as it is for those employed in industry. Moreover, job security for energy scientists is not affected by the status of the nuclear industry because they are largely employed in enterprises competing with nuclear energy. In fact, most nuclear scientists work in research on radiation and the ultimate nature of matter, and are thus not affected by the status of the nuclear power industry. Even among journalists, those who are most knowledgeable are the most supportive. The pattern is very clear — the more one knows about nuclear power, the more supportive one becomes.

For the 2014 perspective, please read Geoff Russell’s wonderful new book GreenJacked! The derailing of environmental action on climate change

Geoff articulates how Greenpeace, Friends of the Earth, Sierra Club and the like thwarted the substitution of clean nuclear for dirty coal. Those organizations could not admit today what will be completely obvious after reading Greenjacked!: that if they had supported nuclear power from the 1960s to today, then all of the developed world could easily have been like France, Sweden and Ontario province — powering advanced societies with nearly carbon-free nuclear energy.

The more you know about nuclear power the more you like it, Part 1


Image and caption credit Chattanooga Times Free Press: Houses in the Hunter Trace subdivision in north Hamilton County are within a few hundred yards of the Sequoyah Nuclear Power Plant near Soddy-Daisy. Neighbors to the nuclear plant say they don’t mind living close to the TVA plant. Staff Photo by Dave Flessner

In 2002 I started looking into our low-carbon energy options. Over the next two years I learned there is no perfect-zero-carbon energy option. I learned that realistic low-carbon energy policy is about deploying scaleable and affordable electricity generation. To my surprise, like the five environmentalists of Pandora’s Promise, I discovered that my anti-nuclear view was based on fictions. I had carried around “The Washington Post accepted” wisdom for decades without ever asking “Why is that true?”

As I was studying the nuclear option, it became blindingly obvious that the people who feared nuclear knew essentially nothing about the subject. Conversely the people who were most knowledgeable about nuclear supported large-scale nuclear deployment as a practical way to replace coal.

And, very interestingly, the people who live in the neighborhoods of existing nuclear plants tend to be very favorable to building more nuclear, including new plants constructed literally “in their own back yard” – a reversal of the expected NIMBY attitude. Of course there are economic benefits to the neighbors of a plant, including the taxes paid to the regional government entity. The economic incentives gave people a reason to want to be there, so they were motivated to ask some serious questions:

  • “Should I buy a home near that nuclear plant?”
  • “Will my children be harmed?”
  • “What if there is an accident?”

From reading the recent NEI annual polls I developed an untested hypothesis: the more contact you have with people who work at a nearby nuclear plant, the less you fear nuclear and the more you appreciate the benefits of clean electricity. It’s easy to informally ask your neighbors “what’s the truth?” about things that worry you. And you learn the people who operate the plant are just as devoted to their children as you are.

Here is another encouraging trend: there are significant numbers “voting with their feet” by moving into nuclear plant neighborhoods.

USA 2010 census: the population living within 10 miles of nuclear power plants rose by 17 percent in the past decade.

And if you read the same surveys that I did you will see how strongly the neighbors’ attitudes contrast to the typical media fear-mongering. Examples:

Neighbor of the Sequoyah Nuclear Power Plant: “This is a safer neighborhood than most areas and I really don’t think much about the plant, other than it provides a great walking area for me,” said Blanche DeVries, who moved near Sequoyah three years ago.

NEI 2013 survey (similar to 2005, 2007, 2009, and 2011): “familiarity with nuclear energy leads to support.”

NEI 2013 survey: “80 percent agree with keeping the option to build more nuclear power plants in the future”

BBC: Living near a nuclear power station

  • Q: “What’s it like to have a reactor on the doorstep?”
  • A: “I live not more than 100 yards…and it doesn’t worry me.”

NEI survey 2009: “Eighty-four percent of Americans living near nuclear power plants favor nuclear energy, while an even greater number—90 percent―view the local power station positively, and 76 percent support construction of a new reactor near them, according to a new public opinion survey of more than 1,100 adults across the United States.”

NEI survey 2013 [PDF]: “81 percent of residents near commercial reactors favor the use of nuclear energy, 47 percent strongly.”


UK 2013 Why we love living next to a nuclear power plant: “It’s cheap, it’s quiet and, say the residents of Dungeness, blissfully safe”. “Here, by contrast, everyone I talk to enthuses about a strong feeling of security and a rare kind of community spirit. Put simply, they live in houses that happen to be next door to a nuclear power station because it makes them feel safe.”

Next we will look at the relative nuclear support amongst print and TV media, scientists and the public in The more you know about nuclear power the more you like it, Part 2.

Biomass, solar and wind cannot sustain an advanced society


EROIs of all energy techniques with economic “threshold”. Biomass: Maize, 55 t/ha per year harvested (wet). Wind: Location is Northern Schleswig-Holstein (2000 full-load hours). Coal: Transportation not included. Nuclear: Enrichment 83% centrifuge, 17% diffusion. PV: Roof installation. Solar CSP: Grid connection to Europe not included. Source: Weißbach et al., Energy 52 (2013) 210

Since I first read the Weißbach et al. paper, I’ve been eagerly awaiting publication of John Morgan’s article, first published in Chemistry in Australia. Fortunately Barry Brook has republished John’s article as a guest post. Here’s a paraphrased summary:

Wind and solar cannot sustain an OECD level society. Adding energy storage buffers the variability, but further reduces the EROI below the economic limit. Therefore solar and wind can reduce the emissions of fossil fuels, but cannot eliminate them. They offer mitigation, but not replacement.

If we want to cut emissions and replace fossil fuels, it can be done, and the solution is to be found in the upper right of the figure. France and Ontario, two modern, advanced societies, have all but eliminated fossil fuels from their electricity grids, which they have built from the high-EROI sources of hydroelectricity and nuclear power. Ontario in particular recently burnt its last tonne of coal, and each jurisdiction uses just a few percent of gas-fired power. This is a proven path to a decarbonized electricity grid.

But the idea that advances in energy storage will enable renewable energy is a chimera – the Catch-22 is that in overcoming intermittency by adding storage, the net energy is reduced below the level required to sustain our present civilization.

I suggest you go straight over to Brave New Climate: The Catch-22 of Energy Storage. And follow the comments – there are already some excellent contributions and additional resource links. One important resource is included in the supplemental materials of the Weißbach et al. paper – that’s the spreadsheet containing all the materials reference data, assumptions, and the EROI and EMROI computations. Total transparency – after several hours working through the spreadsheets I cannot find anything to criticize. If I do find some issues I’ll add updates here.
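
To make the buffering arithmetic concrete, here is a minimal Python sketch of the kind of calculation the spreadsheet performs: lifetime energy returned divided by lifetime energy invested, with storage adding embodied energy to the denominator and round-trip losses to the numerator. The structure follows the paper’s logic, but the specific numbers below are illustrative placeholders, not Weißbach et al.’s inputs.

```python
# Toy EROI-with-buffering calculation, in the spirit of Weißbach et al. (2013).
# All numbers below are illustrative placeholders, NOT values from the paper.

def eroi(e_returned, e_invested):
    """Energy returned on energy invested over the plant lifetime."""
    return e_returned / e_invested

def buffered_eroi(e_out, e_plant, storage_fraction, roundtrip_eff, e_storage):
    """EROI after routing part of the output through storage.

    storage_fraction : share of output that must pass through storage
    roundtrip_eff    : storage round-trip efficiency (e.g. pumped hydro ~0.75)
    e_storage        : embodied energy of the storage system itself
    """
    delivered = e_out * (1 - storage_fraction) + e_out * storage_fraction * roundtrip_eff
    return delivered / (e_plant + e_storage)

# Hypothetical wind farm: lifetime output of 16 units per unit of plant energy input.
E_PLANT = 1.0
E_OUT = 16.0
print(eroi(E_OUT, E_PLANT))       # unbuffered EROI -> 16.0

# Add storage: half the output is time-shifted at 75% round-trip efficiency,
# and the storage hardware embodies twice the plant's own energy input.
print(buffered_eroi(E_OUT, E_PLANT, storage_fraction=0.5,
                    roundtrip_eff=0.75, e_storage=2.0))   # -> ~4.7
```

Even in this toy model the Catch-22 is visible: the storage penalty hits both the numerator (conversion losses) and the denominator (embodied energy), so a source that starts near the economic threshold cannot buy its way out by adding more storage.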

UPDATE: Keith Pickering wrote an analysis of Weißbach et al. here: GETTING TO ZERO: Is renewable energy economically viable? I liked Keith’s summary of how wind dilutes the higher EROI of higher-value sources like hydro:

Wind is a tricky case. If you ask most people, they will tell you that we don’t currently have energy storage for wind. In fact we do, but the buffering for wind comes from natural gas powerplants, which are typically built at the same time wind is deployed. When the wind dies, the backup gas plants are turned on, to keep the grid power reliable. Thus the energy storage for wind is embodied in the natural gas that isn’t burned when the wind turbine is producing peak output.

This means that wind, as it’s used now in the US, isn’t really zero-fossil. It’s a hybrid system that’s part wind, part natural gas. And considering the availability of wind (30% is typical for a wind turbine), most of the energy actually comes from the fossil side of the equation. We’re using the wind to offset some of the CO2 emissions from the gas plant (which is good), but instead of getting to zero, we’re just walking toward the cliff instead of running toward it.

Denmark currently is one of the most wind-energy-intensive countries in the world, which works because they buffer their wind energy against hydroelectric power from Norway and Sweden. When the wind is blowing in Denmark, they export electricity to Sweden, which then can turn down its hydro plants (thus keeping more water stored in the reservoirs behind the dam). When the wind dies, Sweden turns up the taps on the hydroelectric production, and exports that stored energy back to Denmark. It’s a great zero-fossil system, but it’s only possible because of the unique geography that places a flat windy country right next to a couple of wet mountainous countries.

Finally, it’s important to note that the grid-buffering sources for wind (hydro in Denmark, gas in the US) both have a higher EROI than wind itself. Thus these hybrid systems do make economic sense, but that’s partly because the buffering portion makes economic sense on its own. Essentially, these hybrid systems dilute the EROI of hydro or gas in order to subsidize the EROI of the wind portion of the system. For the hybrid gas system that makes sense, because the reduction in CO2 is worth it. For the hydro-buffered system, the question is more problematic. In any case, it’s clear that if wind had to be buffered with a non-generating storage-only system, the economics would be difficult to justify.
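
Keith’s “dilution” point can be checked with simple blended-EROI arithmetic. The sketch below is mine, not his: the component EROI values (wind ≈ 16, gas ≈ 28) are rough stand-ins in the spirit of the unbuffered figures in the chart above, and the 30/70 output split just reflects a ~30% wind share of delivered energy. Treat all of them as assumptions.

```python
# Blended EROI of a hybrid wind + gas system (illustrative numbers only).

def blended_eroi(shares_and_erois):
    """EROI of a mix of sources.

    shares_and_erois: list of (output_share, eroi) pairs; shares sum to 1.
    Energy invested per unit delivered is the share-weighted sum of 1/EROI.
    """
    invested_per_unit = sum(share / e for share, e in shares_and_erois)
    return 1.0 / invested_per_unit

EROI_WIND = 16.0   # assumption: roughly an unbuffered onshore wind figure
EROI_GAS  = 28.0   # assumption: roughly a combined-cycle gas figure

# ~30% of delivered energy from wind, ~70% from the gas backup.
print(blended_eroi([(0.3, EROI_WIND), (0.7, EROI_GAS)]))   # -> ~22.9
```

Under these assumptions the blend lands well below gas alone, which is exactly the dilution Keith describes: the gas plant’s EROI is being spent to prop up the wind share.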

Keith also has a very concise summary of the increasing EROI of nuclear fission:

One reason previous studies on nuclear have been all over the map is that it’s a moving target: the EROI of nuclear has been rising rapidly during the past 20 years (and will continue to rise) as the industry switches from gas-diffusion enrichment of uranium to centrifuge enrichment (which is 35 times more energy efficient). Since uranium enrichment is a major part of the energy input, this makes a huge difference. A nuclear plant using 100% gas diffusion would have an EROI of 31, EMROI of 34, comparable to coal. Weißbach’s numbers above are based on 83% centrifuge, 17% diffusion. The World Nuclear Association projection is that there will be no more diffusion enrichment anywhere in the world by 2017. With 100% centrifuge, nuclear will have an EROI of 106, EMROI of 166 according to Weißbach’s analysis. In other words, the switch from diffusion to centrifuge roughly quadruples the overall energy efficiency of nuclear power.

Beyond that, there is a new laser enrichment process being developed called SILEX which promises to be 10 times more energy efficient than centrifuge. And even beyond that, some Gen IV reactor designs (the fast neutron reactor, and the liquid-fuel thorium reactor, or LFTR) don’t use enrichment at all, and could therefore come in at EROI of about 114, EMROI of 187.

Keith used the Weißbach et al. supplementary spreadsheets to do these calculations.
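
The enrichment-mix arithmetic can be reproduced from just the figures quoted above. The sketch below is my own back-of-envelope reconstruction, assuming the only input that changes between the two endpoint cases is the enrichment energy; it is not the spreadsheet’s actual method.

```python
# Back out the enrichment energy term from the two endpoint EROIs quoted above,
# then recompute EROI for an arbitrary centrifuge/diffusion mix.
# Uses only figures from the text: EROI 31 (100% diffusion), EROI 106 (100% centrifuge),
# and centrifuge enrichment being ~35x more energy efficient than diffusion.

E_OUT = 1.0                       # normalize lifetime electricity output to 1
E_IN_DIFFUSION  = E_OUT / 31      # total lifetime energy input, 100% diffusion case
E_IN_CENTRIFUGE = E_OUT / 106     # total lifetime energy input, 100% centrifuge case

# The difference between the two cases is entirely the enrichment term:
# e_diff - e_cent = e_diff * (1 - 1/35)
e_diff = (E_IN_DIFFUSION - E_IN_CENTRIFUGE) / (1 - 1 / 35)
e_cent = e_diff / 35
e_other = E_IN_CENTRIFUGE - e_cent    # all inputs except enrichment

def eroi_for_mix(centrifuge_share):
    """EROI for a given centrifuge fraction of enrichment (remainder is diffusion)."""
    e_enrich = centrifuge_share * e_cent + (1 - centrifuge_share) * e_diff
    return E_OUT / (e_other + e_enrich)

print(round(eroi_for_mix(0.83)))   # 83% centrifuge / 17% diffusion -> ~75
print(round(eroi_for_mix(1.00)))   # all centrifuge -> 106
```

The 83/17 mix comes out around 75 under this reconstruction, which suggests the quoted endpoints are internally consistent; still, treat it as a sanity check rather than the paper’s own number.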

Sam Altman “I believe the 22nd century is going to be the atomic power century”


“I have studied a lot about what I think is sort of the best use of my time and money and what I think will help the world the most. And I really do believe that safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world.” — SAM ALTMAN

If you listen to the Econtalk interview I think you will agree that Sam has done his homework. Not surprisingly, I think his conclusions reflect an open, inquiring mind. Evidence:

“There are two nuclear energy companies in this batch. I believe that–the 20th century was clearly the carbon century. And I believe the 22nd century is going to be the atomic power century. I’m very convinced of that. It’s just a question of how long it takes us.”

Y Combinator is in a good position to harvest the rewards of innovations that require a long development cycle and heaps of capital. Unlike a typical 10-year venture fund, YC makes a large number of small ($120k) bets – 700+ so far since Paul Graham launched YC in 2005. New nuclear generation is obviously a very long-term bet.

Question: will the NRC license a new design that isn’t just a variation of existing PWR designs? How is it possible to innovate in this regulatory environment?

I think it will take way too long and too much capital to launch a new design based on NRC licensing. So Sam Altman’s new ventures will almost certainly have to move to a friendly-regulator nation for the initial licensing. Sam is more optimistic than I am about the NRC. That said, if I were talking my book publicly I would be extremely deferential to the NRC.

UPDATE: I found one of the two YC S14 batch nuclear companies. It is Helion Energy, which is building an SMR concept. But it is FUSION, not fission:

Helion is making a fusion engine 1,000 times smaller, over 500 times cheaper, and realizable 10 times faster than other projects.

Obi Wan: could this be the one that works? There’s a bit more at TechCrunch. Enjoy the Sam Altman interview – it’s not your everyday porridge.

Why is Econtalk interviewing all these Silicon Valley entrepreneurs and VCs? Since Russ Roberts is now full-time at Stanford’s Hoover Institution, he has been spending more of his time at the vortex of the Silicon Valley innovation cluster. One of the benefits is that he is becoming progressively more involved with and excited about the innovation culture. So his Econtalk guests include a growing number of Silicon Valley insiders. In July Russ interviewed Sam Altman, CEO of the iconic accelerator Y Combinator (YC). Sam confessed in the interview that he doesn’t filter himself very well – meaning it was a refreshingly frank discussion.