Why the Best Path to a Low-Carbon Future is Not Wind or Solar Power

Figure A. Source: The Economist, “Sun, wind and drain: Wind and solar power are even more expensive than is commonly thought”

Figure B. Source: Charles Frank, “The Net Benefits of Low and No-Carbon Electricity Technologies”

Figures A and B summarize some of the conclusions of the recent paper by economist and Brookings senior fellow Charles Frank. The paper might not have attracted much attention outside the usual wonkish energy-policy circles, but The Economist wrote a full-page review which quickly became a lightning rod for shouting by pro-renewables activists. There are three levels at which to study the results, in increasing order of difficulty:

  1. Economist: Sun, wind and drain: Wind and solar power are even more expensive than is commonly thought
  2. Brookings blog post by Charles Frank: Why the Best Path to a Low-Carbon Future is Not Wind or Solar Power
  3. Brookings paper by Charles Frank: The Net Benefits of Low and No-Carbon Electricity Technologies [PDF]

The Economist article will not be a favorite with Angela Merkel, as is nicely summarized in the last paragraph:

The implication of Mr Frank’s research is clear: governments should target emissions reductions from any source rather than focus on boosting certain kinds of renewable energy.

I've read all 182 tedious comments, which I cannot recommend doing, because the majority are unreferenced complaints from renewables boosters. Approximately none of the Economist commenters had read the Frank paper. So my advice: skip #1, read #2 for a good introduction, then work your way through #3.

Figure A is a nice graphic produced from Figure B, which is the “money table” of the Frank paper. I've included Figure B so you can quickly grasp what the Cost vs. Benefit bars in the graphic mean. There's a minor error in the graphic: the Wind cost/benefit bar is missing the mark for “net benefit,” which is a negative $25k/MW, not zero.

What Figures A and B claim to tell us is that, in the USA, new combined-cycle gas plants offer the greatest net benefit, given a large set of assumptions. Dr. Frank's paper is a model of transparency — every assumption and parameter is referenced and further qualified by end-notes. Even though this is a simplified methodology for estimating net benefits, there is still a heap of assumptions that must be understood in order to assess where the results might be applicable. I'll summarize a few that I think are critical:

  • Net benefits are calculated on the assumption that new generation replaces, on average, 22 hours/day of coal non-peak generation and 2 hours/day of single-cycle gas peak generation (a back-of-envelope sketch of what this implies follows this list)
  • This is USA-centric, based upon EIA 2013 data
  • Therefore, unusually low methane (gas) prices
  • Therefore, relatively high insolation and a moderately high wind resource
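
To make the first assumption concrete, here is a back-of-envelope Python sketch (mine, not Frank's; the emission factors are round illustrative values, not his parameters) of what the 22h/2h displacement profile implies for avoided CO2:

    # Back-of-envelope CO2 avoided per MW of new gas combined-cycle capacity
    # under the 22 h/day coal + 2 h/day gas-peaker displacement assumption.
    # Emission factors are round illustrative values, not Frank's parameters.

    COAL_T_PER_MWH = 1.00    # typical coal steam plant, tonnes CO2/MWh
    PEAKER_T_PER_MWH = 0.55  # single-cycle gas turbine
    CCGT_T_PER_MWH = 0.40    # new combined-cycle gas

    HOURS_COAL, HOURS_PEAK = 22, 2

    displaced = 365 * (HOURS_COAL * COAL_T_PER_MWH + HOURS_PEAK * PEAKER_T_PER_MWH)
    emitted = 365 * (HOURS_COAL + HOURS_PEAK) * CCGT_T_PER_MWH
    print(f"CO2 avoided: ~{displaced - emitted:,.0f} tonnes per MW-year")
    # -> roughly 4,900 tonnes per MW-year under these illustrative numbers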

For energy policy wonks I will highlight a few weak spots in the paper:

  • Most important is that Frank's Adjusted Capacity Cost does not fully reflect the negative reliability impact of variable renewable energy (VRE).
  • I will speculate that Dr. Frank chose to avoid the complexity of Capacity Credit to keep the presentation accessible. (Capacity Credit estimates the amount of firm, dispatchable generation that can be replaced by VRE without reducing reliability.)
  • Dr. Frank does not examine how Net Benefits vary with VRE penetration. Detailed modeling shows that increasing VRE has large effects on reliability.
  • Capacity Credit for VRE generation falls as penetration rises: the more wind/solar you build, the less marginal value you get (a toy illustration follows this list).
  • The Frank paper is directed at a future powered by less coal (that's good) but not a zero-carbon future (which we must achieve).
  • If we build a strategy for the goal of Zero Emissions we will still likely build Gas CC in quantity because it is fast to build, relatively cheap and politically acceptable. But looking out a century to achieving Zero will help us focus on ramping up nuclear as fast as feasible and safe. We cannot wait 50 years to get started.
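
To illustrate the capacity-credit point: the toy curve below is my invention (real values come from system reliability / ELCC studies, such as the Hirth work referenced at the end of this post), but it shows the qualitative behavior of marginal capacity credit eroding with penetration.

    # Toy illustration: marginal capacity credit of wind eroding with
    # penetration. The exponential form is illustrative only; real values
    # come from system reliability (ELCC) studies.
    import math

    def capacity_credit(penetration, first_mw_credit=0.35, decay=6.0):
        """Firm capacity each additional MW of wind can replace (fraction)."""
        return first_mw_credit * math.exp(-decay * penetration)

    for p in (0.0, 0.1, 0.2, 0.3):
        print(f"{p:.0%} wind share -> marginal capacity credit {capacity_credit(p):.2f}")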

Why do I think the Frank paper is important? It is a serious effort to help policy-makers understand why subsidies supporting wind and solar are such an expensive and inefficient way to reduce carbon emissions, and Dr. Frank illustrates why traditional levelized cost of energy (LCOE) analysis overvalues wind and solar. Yes, the headline results are US-centric, but there is a serious effort to support generalizing them:

Sensitivity to Carbon Prices: In Tables 9A and 9B, the net benefits for both wind and solar are negative. However, if the carbon price is increased from $50 to $61.87 or above, then the net benefits of wind are positive (as shown in Table 11). Above $185.84, the net benefits of solar are also positive.

My interpretation of that result is that solar costs at least $185/ton CO2 avoided. For a society with finite resources, the cost/ton of CO2 abatement is a rather important number.
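
The arithmetic behind that interpretation: holding everything else fixed, Frank's net benefits are linear in the carbon price (the carbon benefit is just price times tonnes avoided), so the break-even price is exactly the cost per tonne avoided. A Python sketch, using the paper's break-even prices; the tonnage is my placeholder, since I have not re-derived Frank's underlying cost data:

    # Net benefit is linear in the carbon price:
    #   NB(p) = tonnes_avoided * (p - break_even)
    # so the break-even carbon price IS the cost per tonne of CO2 avoided.
    # Break-even prices below are Frank's; the tonnage is a placeholder.

    TONNES_AVOIDED = 3000.0  # illustrative tonnes CO2 avoided per MW-year

    def net_benefit(carbon_price, break_even, tonnes=TONNES_AVOIDED):
        """Net benefit in $ per MW-year at a given carbon price ($/tonne)."""
        return tonnes * (carbon_price - break_even)

    for tech, break_even in (("wind", 61.87), ("solar", 185.84)):
        nb = net_benefit(50.0, break_even)
        print(f"{tech}: breaks even at ${break_even}/t; "
              f"at $50/t the net benefit is ${nb:,.0f}/MW-year")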

Sensitivity to Natural Gas Prices: The results in Tables 9A and 9B are highly sensitive to historically volatile natural gas prices. In the United States, the average annual cost of natural gas to electricity producers reached a high of $9.01 per million Btu in 2008. The average monthly cost reached a low of $2.68 in April 2012 (EIA, November 2013, Table 9.10.). The variation among countries, and the effect on net benefits, is illustrated in Table 12.

Note that nuclear becomes the highest net-benefit policy when gas prices exceed about $9/MBtu. Current UK prices are above that level, which is where US prices were only six years ago.
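
The gas sensitivity is easy to see through the heat rate: fuel cost per MWh is just the gas price times the heat rate. A quick sketch (the heat rate below is a typical published figure for a modern combined-cycle plant, not Frank's parameter):

    # Fuel cost of a gas combined-cycle plant as gas prices swing.
    # The heat rate is a typical published figure, not Frank's parameter.

    CCGT_HEAT_RATE = 6.4  # MMBtu of gas burned per MWh generated

    for gas_price in (2.68, 4.50, 9.01):  # $/MMBtu: 2012 low, mid, 2008 high
        print(f"gas at ${gas_price:.2f}/MMBtu -> "
              f"fuel cost ${gas_price * CCGT_HEAT_RATE:.0f}/MWh")
    # Fuel alone swings from ~$17/MWh to ~$58/MWh, which is why the
    # net-benefit ranking flips toward nuclear above roughly $9/MMBtu.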

My bottom line: this paper is a good starting point. Please keep in mind that the true cost of variability for wind and solar is significantly understated, because the value of VRE falls as penetration increases. Still, I appreciate that adding a complete VRE analysis would have made this paper much more cumbersome.

Fortunately, there has been some very good work on VRE and system LCOE in the past couple of years. In a future post I will get into the research of Lion Hirth et al. and the Potsdam Institute for Climate Impact Research. For the eager, here are three good references for in-depth modeling studies of high-penetration VRE:

  1. Hirth, Lion, The Optimal Share of Variable Renewables. How the Variability of Wind and Solar Power Affects Their Welfare-Optimal Deployment (November 8, 2013). FEEM Working Paper No. 90.2013. Available at SSRN: http://ssrn.com/abstract=2351754 or http://dx.doi.org/10.2139/ssrn.2351754
  2. Ueckerdt, Falko and Hirth, Lion and Luderer, Gunnar and Edenhofer, Ottmar, System LCOE: What are the Costs of Variable Renewables? (January 14, 2013). Available at SSRN: http://ssrn.com/abstract=2200572 or http://dx.doi.org/10.2139/ssrn.2200572
  3. Hirth, Lion and Ueckerdt, Falko and Edenhofer, Ottmar, Why Wind is Not Coal: On the Economics of Electricity (April 24, 2014). FEEM Working Paper No. 39.2014. Available at SSRN: http://ssrn.com/abstract=2428788 or http://dx.doi.org/10.2139/ssrn.2428788

The Energy Collective is attracting serious energy professionals

On The Energy Collective site (EC) I am noticing more and more contributions from people who know what they are talking about. Like Barry Brook’s BraveNewClimate.com, EC is a place where you can read conversations where at least one side has real-world experience with some aspect of the energy business.

E.g., I try to keep up with the writings of Joris van Dorp via his RSS comment feed at EC, and similarly Engineer-Poet via his RSS comment feed. Today that path led to a lightweight post by Sarah Battaglia, a social media person at Energy Curtailment Specialists (a company that begins its website pitch with “We have never sustained a complaint at the Better Business Bureau”). The post used 26 keywords for a 500-word piece. I only encountered it because van Dorp and Engineer-Poet both invested their time to explain to Sarah how things work in the real world. Sarah’s post contained bits of wisdom like this:

Some plants may also use nuclear power to generate electricity, but this method is relatively expensive and may be hazardous to human health and the environment.

Later Sarah comments

You don’t see many solar panels exploding or causing dangerous radiation to local residents.

Joris van Dorp replied:

As a matter of fact, solar panels do cause one thing to explode: energy costs! :)

Here in Europe, Spanish investors in solar power systems are still reeling from the collapse of the subsidy system in their country as a result of its deep financial crisis. Many have seen family fortunes disappear as banks seized their solar farms, which became unprofitable overnight when ‘guaranteed’ subsidies were eliminated in desperate attempts to prevent sovereign default. While Spain has the best sunshine in Europe, there is virtually no PV being installed there since the crisis. PV is simply too costly, and Spain is simply too deep in the hole financially and socially for it to even consider extracting the necessary funding for subsidies from its impoverished tax payers.

I agree that we have to find a way to exist for centuries to come. Fortunately we don’t need to look far. Nuclear energy has the proven potential to provide limitless energy cheaper than coal. While the nuclear option has many other unique benefits to humanity, this particular characteristic of being able to compete (without subsidies) with coal means that it is the only credible hope humanity has of stopping anthropogenic global warming and ever increasing fossil fuel dependence.

“Hot new” energy technologies are exciting and interesting, and I love them, but they are no use. The political will to provide for their permanent (!) heavy subsidy does not exist in developed countries. And in developing countries, the enthusiasm for such permanent subsidies is of course even less. Hence, a thousand coal fired power plants are planned or under construction globally today.

Coal is death. Nuclear power is hope. Renewable energy is a distraction, unfortunately.

Bob Meinetz is another frequent fact-based commenter:

Sarah, it’s true that solar panels don’t explode (most deaths from solar are the result of falls from rooftops during installation). The fact is that while most people harbor an irrational fear of radiation, nuclear is five times safer than solar per unit of energy generated, it occupies a tiny fraction of the land area, and delivers power with six times the capacity factor, day or night, rain or shine.

While powering the world on renewables alone is theoretically possible, from a practical standpoint the chances of doing so are zero. It would cost hundreds of $trillions in 2014, and there’s no evidence that the price will drop fast enough or that a buildout could happen fast enough to keep pace with the exploding global demand for energy. The graph below makes that evidently clear – wind’s contribution to world electricity is the skinny purple line, and solar’s contribution is invisible.

This is what I mean by reinforcing misperceptions. It’s critical that we examine our energy options on a factual basis, and not one of popular culture myths which persist four decades after The China Syndrome if we’re going to have a chance of getting a handle on climate change.

Joris van Dorp replies:

Bob, that graph makes your point perfectly.

Besides, the graph also shows the serious consequences of anti-nuclearism. Between 1991 and 2011, twenty years have passed in which nuclear energy grew only very slightly. If the world had adopted the French nuclear energy strategy, then fossil fuels for electricity generation would have been zero today. Nuclear and hydro together would have covered the whole electricity demand.

Advocates for renewables against nuclear don’t seem to understand that their ideology is no solution to the problem of exponential fossil fuel consumption. On the contrary, as the graph clearly indicates, *their ideology is itself the cause* of historical exponential fossil fuel consumption for electricity generation.


Appeals to the climate consensus can give the wrong impression


Image credit: John Cook, http://www.skepticalscience.com, “Closing the Consensus Gap on Climate Change”

Dr. Will Howard recently published an essay that will appeal to those of you interested in science communication, especially in the challenging and politically charged context of climate change. Dr. Howard makes the extremely important point that the “scientific consensus” on climate change reflects a strong consilience of evidence. I confess that I had to look up “consilience” to learn that it is indeed the perfect term to capture how we have developed confidence in our understanding of the causal connections between human-generated greenhouse gases and climate change.

In public discourse, if we had chosen “consilience of evidence” to describe the accumulation of research, then perhaps people might have understood more readily that we are not talking about the results of an opinion poll or a negotiated statement (yes, the IPCC Summary for Policymakers [PDF] is a negotiated statement, though I don’t know how else such a summary could be produced).

I thought Will’s essay captured this science-communication challenge succinctly, especially how this strong consilience of evidence is separate from the politics of what to do about it:

“Consensus” is understood differently in science compared to politics or society.

Scientists use this word to refer to consilience of multiple lines of evidence that underlie widespread agreement or support a theory.

In the case of climate change, multiple lines of evidence underpin the prevailing view that the climate system is showing decade-on-decade warming over the past 50 years.

In particular, this warming bears temporal and spatial patterns, or “fingerprints”, that point to human causes.

For example, the stratosphere (that part of the atmosphere higher than about 11 km) has been cooling as the lower atmosphere and the ocean warm. This is the pattern we expect from the addition of greenhouse gases and not from, say, changes in the sun’s output.

But in public and especially political discourse, “consensus” tends to imply majority opinion or concurrence. As consensus in this public context is often arrived at by negotiation, saying there’s a scientific “consensus” may imply to the community that prevailing scientific views represent a negotiated outcome. This is the antithesis of science.

Consensus of the non-scientific kind does have a role to play in the climate debate. This includes negotiating whether warming is a “good” or “bad” thing and what, if anything, we should do about it.

These are not scientific questions. These are issues of values, politics, ethics and economics. As a nation and as a global society we need to reach consensus to resolve those questions and to make and implement appropriate public policy.

I’ve nothing to add to Will’s excellent essay, so I recommend that you go directly to The Conversation to read the original and the comments. Some effort is required to weed through the growing number of comments, so I will highlight a segment of the conversation that focuses upon the important question of effective science communication:

John Cook
Climate Communication Research Fellow at University of Queensland

This is an interesting article with many important points. I would be the first person to stress the importance of communicating the many “fingerprints” being observed in our climate (and in fact have created a human fingerprints infographic which I frequently use in public talks http://www.skepticalscience.com/graphics.php?g=32).

However, the article is missing a crucial element to this discussion – what does the evidence tell us about the efficacy of consensus messaging? A number of studies have found that one of the strongest predictors of public support for climate mitigation policies is perception of consensus (i.e., the level of agreement among climate scientists about human-caused global warming). Also, consensus messaging significantly increases acceptance of climate change. A randomised experiment by Stephan Lewandowsky found that informing Australians of the 97% consensus increased their acceptance of human-caused global warming and intriguingly, the increase was greatest amongst conservatives. In this case, consensus neutralised ideology to some degree.

When people think there is still an ongoing debate about human-caused global warming amongst climate scientists, they’re less likely to accept climate change and support climate action. And given the Australian public on average think there is 58% agreement among climate scientists, rather than 97%, then this misconception has serious societal implications. Science communicators need to take into account that people use expert scientific opinion as a heuristic to inform their views on complex scientific issues.

To underscore this fact, I’ve actually tested the human fingerprints message (linked to above) and the consensus message in a randomised experiment. Consensus messaging significantly outperformed the fingerprints message. The lesson here is that we need to understand how laypeople think about complex scientific issues like climate change.

However, I don’t think there need be that much conflict between what social science is telling us and the views of the OP. A recent paper by Ed Maibach tested various forms of consensus messaging and they found the most effective was a message that emphasised both consensus and the evidence-based nature of the scientific method:

“Based on the evidence, 97% of climate scientists have concluded that human-caused climate change is happening”

John Cook
Climate Communication Research Fellow at University of Queensland
In reply to Anna Young

Anna, the problem you raise is exactly why communication like the John Oliver YouTube video embedded in the OP are so powerful. Not only does Oliver communicate the 97% consensus, he also does something equally important – he communicates how people cast doubt on the consensus (in this case, by perpetuating false balance in the media). What Oliver is doing is equipping people with the knowledge and the critical thinking skills so that when they see the mainstream media show a debate between a climate scientist and a tinfoil guy, they can see it for what it is. It’s not only a funny video, it’s brilliant communication. The fact that it’s been viewed millions of times means millions of people have now been “inoculated” against the misinformation of false debate in the mainstream media.

So kudos to Will Howard for embedding the video.

Will Howard
Research scientist at University of Melbourne
In reply to John Cook

Thanks John, for contributing that perspective. The points you raise, I would suggest, may be applicable to many areas of “contested” science, in health, resources (e.g. coal seam gas) and others. 

Whatever is said about the consensus, I do think we need to do a better job of communicating what underpins it. As your co-author Peter Jacobs notes

“to those suggesting that the consensus message is an appeal to authority that ignores evidence – the consensus exists *because of* the overwhelming physical evidence, which is detailed at length in the scientific literature.”

But I wonder about this: both the consensus and the consilience of evidence (my preferred term) seem to be strengthening, yet public support for policies aimed at mitigating climate change seems not to be.

I note polls suggesting climate change and environmental issues have moved down people’s priorities. Here in Australia, our current government was elected with a major plank in its platform being the removal of the carbon tax. (Whether we agree or disagree with their policy, they ran on that issue and were elected.)

Is this because people are skeptical of the science? Is it just that other issues take on more urgency: jobs, the economy, international conflicts, etc.?

John Cook
Climate Communication Research Fellow at University of Queensland
In reply to Will Howard

I like the term “consilience of evidence” also but when I test-drive it in public talks, it tends to inspire blank looks from the audience. It’s a term that scientists love. Laypeople, not so much. Which is why, again, it’s important that we understand our audience when we do science communication.

Why is public support not changing that much? Public concern about climate change does correlate with economic performance hence the drop in climate concern after the GFC. Another predictor of public concern about climate change is cues from our political leaders so you can see why Australia has a problem in that department at the moment. There’s certainly a number of factors that influence people’s attitudes to climate.

But as I said above, several recent studies have found perception of scientific agreement on climate change is one of the biggest factors. And given public perception of consensus is very low (I measured it at 58% on a representative Australian sample), this misconception is definitely a significant problem. It’s not the only factor delaying public support for climate action but it’s a big one.

Also, communicating the 97% consensus is a lot easier to understand than explanations of why greenhouse gases in the upper atmosphere are more efficient at radiating longwave radiation to space, hence contributing to the cooling stratosphere. From a communication point of view, consensus is low-hanging fruit. This is why consensus messaging outperformed fingerprint messaging in my data.

So communicating the 97% consensus can help with removing one roadblock delaying climate action. It won’t fix everything – it’s not a magic bullet. But ignoring the “consensus gap” only serves to give extra life to that stumbling block.

I wrote a post a while back, How to break the climate change gridlock, including a conversation with Andrew Dessler, Professor of Atmospheric Sciences at Texas A&M, about how we might more explicitly get each party’s values and economic interests on the negotiating table.

Will Howard has received funding from the Australian Research Council, the Australian Government Department of Climate Change, the Cooperative Research Centres Program, and the Australian Antarctic Science Program.

This article was originally published on The Conversation. Read the original article.

Sam Altman: “I believe the 22nd century is going to be the atomic power century”


Russ Roberts is now full-time at Stanford’s Hoover Institution, so he has been spending more of his time at the vortex of the Silicon Valley innovation cluster. One of the benefits is that he is becoming progressively more involved with, and excited about, the innovation culture, so his EconTalk guests include a growing number of Silicon Valley insiders. In July Russ interviewed Sam Altman, CEO of the iconic accelerator Y Combinator (YC). Sam confessed in the interview that he doesn’t filter himself very well, meaning it was a refreshingly frank discussion. Here’s Sam:

“I have studied a lot about what I think is sort of the best use of my time and money and what I think will help the world the most. And I really do believe that safe, cheap, clean energy is probably the most important thing an individual can do for the future of the world.”

It was clear to me that Sam has done his homework, and naturally I think his conclusions are indicators of an open, inquiring mind. Evidence:

“There are two nuclear energy companies in this batch. I believe that the 20th century was clearly the carbon century. And I believe the 22nd century is going to be the atomic power century. I’m very convinced of that. It’s just a question of how long it takes us.”

Y Combinator is in a good position to harvest the rewards of innovations that require a long development cycle and heaps of capital. Unlike the typical 10-year venture fund, YC makes a large number of tiny ($120k) bets: 700+ such bets since its launch in 2005. New nuclear is obviously a very long-term bet, even given that the company will almost certainly have to move to a friendly-regulator nation for the initial licensing. Sam is more optimistic than I am about the reality of getting a new non-LWR design licensed by the US NRC. OTOH, if I were talking my book publicly, I would be extremely deferential to the NRC staff.

Please do listen to the Sam Altman interview. Update: I found one of the two YC S14 batch nuclear companies. It is Helion Energy, which is building an SMR concept. But it is FUSION, not fission:

Helion is making a fusion engine 1,000 times smaller, over 500 times cheaper, and realizable 10 times faster than other projects.

There’s a bit more at TechCrunch. Of all the earlier failed fusion experiments, could this be the one that works, Obi-Wan?

Are you worried about overpopulation?

A dear friend is very concerned that global overpopulation is making sustainable resource management impossible. A current example is the California drought and water crisis. Because this subject is definitely not intuitive, I thought I would share some resources that I outlined by email:

Regarding your sincere population worries, here are some possibly useful resources. First, the one-hour BBC-produced talk by Swedish demographer Hans Rosling is a friendly introduction to population dynamics: BBC “Overpopulated”.

The fact that the BBC funded such an expensively produced mini-documentary reflects the reality that many people continue to accept the 1960s perspective voiced by Paul Ehrlich and The Club of Rome. This Malthusian view was what I believed through the 1990s. It was only when I had time to study current population research that I realized how out of date I was.

For a more in-depth, but still easy to follow, lecture, see mathematical biologist Joel Cohen’s Floating University segment “Malthus Miffed”. I think it would be difficult to digest Dr. Cohen and still be frightened about runaway long-term population growth.

For more well-written background on the subject, please read the recent survey article by Stanford’s Martin Lewis: “Population Bomb? So Wrong”. E.g., did you know that India and America fall into the same fertility bucket (TFR of 2 to 3)? Excerpt:

India’s declining fertility rate, now only slightly higher than that of the United States, is part of a global trend of lower population growth. Yet the media and many educated Americans have entirely missed this major development, instead sticking to erroneous perceptions about inexorable global population growth that continue to fuel panicked rhetoric about everything from environmental degradation and immigration to food and resource scarcity.

In a recent exercise, most of my students believed that India’s total fertility rate (TFR) was twice that of the United States. Many of my colleagues believed the same. In actuality, it is only 2.5, barely above the estimated U.S. rate of 2.1 in 2011, and essentially the replacement level. (A more recent study now pegs U.S. fertility at 1.93.) Still, from a global perspective, India and the United States fall in the same general fertility category, as can be seen in the map below.


In today’s world, high fertility rates are increasingly confined to tropical Africa. Birthrates in most so-called Third World countries have dropped precipitously, and some are now well below the replacement rate. Chile (1.85), Brazil (1.81), and Thailand (1.56) now have lower birthrates than France (2.0), Norway (1.95), and Sweden (1.98).

To be sure, moderately elevated fertility is still a problem in several densely populated countries of Asia and Latin America, such as the Philippines (3.1) and Guatemala (3.92).

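To see why a TFR near 2.1 means long-run stabilization, note that each generation scales by roughly TFR divided by the replacement rate. A toy Python projection (it ignores age structure and mortality detail, which real demographic models handle):

    # Toy generational projection: each generation scales by ~TFR/replacement.
    # Ignores age structure and mortality detail handled by real models.

    REPLACEMENT_TFR = 2.1

    def multiplier(tfr, generations):
        """Approximate population multiplier after N generations."""
        return (tfr / REPLACEMENT_TFR) ** generations

    # Four generations is roughly a century at ~25 years per generation.
    for country, tfr in (("India", 2.5), ("USA", 1.93), ("Thailand", 1.56)):
        print(f"{country} (TFR {tfr}): x{multiplier(tfr, 4):.2f} in ~a century")

Even India's TFR of 2.5 implies only a doubling over a century on this crude arithmetic, while sub-replacement countries shrink.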

I highly recommend a careful read of the Martin Lewis essay. E.g., the surprising correlation of TV viewing with TFR (is it causation or coincidence?).

So, that’s a summary of the perspective of academics who make a living worrying about population growth. But we also know very well how to make population growth a problem again. One way is to destroy the engine of economic growth – especially the ongoing improvement of the incomes of the very poor in Africa and Asia. If we did that, then we could blow through the U.N. 2050 forecast of 9 to 10 billion.

It could happen – consider how badly the 2008 global financial crisis was handled by politicians and central bankers (“Never underestimate a politician”). But even with the below-trend growth that we observe in the USA and EU, the Bottom Billion is transitioning out of subsistence farming to urban progress. That progress is essential to the forecasts of better health leading to falling family sizes.

Another way is to allow climate change to develop so much momentum that there are no feasible mitigation or adaptation strategies. That would surely diminish incomes and health in the vulnerable populations, and those are the key drivers of the population turnaround.

Dear reader, if you know of other quality resources, please provide the links. And especially please mention any peer-reviewed work that contradicts the consensus view reflected by professors Rosling, Cohen, and Lewis.

The “Drop”, an iPad-connected smart scale – stressless baking?


Here’s an idea. Check out their promotional video. The seamless baking process looks so painless that I’m thinking Christmas present for somebody special? 

Does Drop deliver $80 of value to a galley that already has a digital scale? Maybe – if the software makes it nearly painless to convert our existing favorites to their gram-based scheme. The Wired review closes with similar thoughts:

There is one significant limitation: In order to ensure a smooth user experience, the hardware, app, and content all need to be mashed together—meaning buyers will be stuck using Drop’s collection of recipes at the outset. This strategy gives budding bakers a tremendous amount of power, but also means there will be a relatively small amount of content. The team is working on an importer tool that will allow cooks to add and share their favorite recipes, but for now buyers will have to trust the taste of Irish techies.

This reminds me a bit of Inkling – very cool interactive books, but ONLY their catalog. No sale here. Regarding Drop: for a young twenty-something workaholic techie, it could be a no-brainer joy.
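
For the curious: converting a volumetric recipe to a gram-based scheme is mostly a density lookup. A toy Python sketch (the densities are rough kitchen approximations, not Drop's proprietary ingredient database):

    # Toy cup-to-gram converter. Densities are rough kitchen approximations,
    # not Drop's ingredient database.

    GRAMS_PER_CUP = {
        "all-purpose flour": 120,
        "granulated sugar": 200,
        "butter": 227,
        "milk": 240,
    }

    def to_grams(ingredient, cups):
        return GRAMS_PER_CUP[ingredient] * cups

    for ingredient, cups in [("all-purpose flour", 2.5), ("butter", 0.5)]:
        print(f"{cups} cup(s) {ingredient} -> {to_grams(ingredient, cups):.0f} g")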

New York-based Placemeter is turning disused smartphones into big data


A city window overlooking the street has always been a score in its own right, what with so many apartments stuck opening onto back alleys and dumpsters and fire escapes. And now, a company wants to straight up monetize the view. New York startup Placemeter is paying city residents up to $50 a month for street views captured via old smartphones. The idea is to quantify sidewalk life in the service of making the city a more efficient place.

“Measuring data about how the city moves in real time, being able to make predictions on that, is definitely a good way to help cities work better,” says founder Alex Winter. “That’s the vision of Placemeter—to build a data platform where anyone at any time can know how busy the city is, and use that.”

Here’s how it works: City residents send Placemeter a little information about where they live and what they see from their window. In turn, Placemeter sends participants a kit (complete with window suction cup) to convert their unused smartphone into a street sensor, and agrees to pay cash so long as the device stays on and collects data. The more action outside—the more shops, pedestrians, traffic, and public space—the more the view is worth.

On the back end, Placemeter converts the smartphone images into statistical data using proprietary computer vision. The company first detects moving objects (the green splotches in the video below) and classifies them either as people or as 11 types of vehicles or other common urban elements, such as food carts. A second layer of analysis connects this movement with behavioral patterns based on the location—how many cars are speeding down a street, for instance, or how many people are going into a store.

Placemeter knows your first question—Isn’t this invasion of privacy?—and insists that it’s taking all measures to ensure anonymity. The smartphone sensors don’t capture anything that goes on in a meter’s home (such as conversations), and the street images themselves are analyzed by the computer, then deleted without being stored. The only thing that ends up saved in the company’s system, says Winter, is the rough data.

Source: The Atlantic CityLab. This 7-second video is a year old, but it gives a glimpse of the power of the Placemeter traffic analysis.
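
The pipeline described in the excerpt is classic computer vision: find moving blobs via background subtraction, then classify them. A minimal Python stand-in using OpenCV (Placemeter's actual stack is proprietary; the file name and the crude size-based classifier here are my stand-ins):

    # Minimal stand-in for a Placemeter-style pipeline: background
    # subtraction to find movers, then a crude size-based classifier.
    # Placemeter's real system is proprietary; this is OpenCV boilerplate.
    import cv2

    cap = cv2.VideoCapture("street_view.mp4")  # hypothetical window clip
    subtractor = cv2.createBackgroundSubtractorMOG2()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)  # foreground = moving pixels
        mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            area = cv2.contourArea(contour)
            if area < 300:  # drop pixel noise
                continue
            # Crude stand-in for the person/vehicle classifier:
            label = "pedestrian" if area < 3000 else "vehicle"
            x, y, w, h = cv2.boundingRect(contour)
            print(f"{label} at ({x},{y}), {w}x{h}px")

    cap.release()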

The visualization NYC Taxis: A Day in the Life has gone server-crashing viral. It was constructed by NYC developer and city-data wonk Chris Whong (details here). Correction: I’m including Whong’s work here because the reference was published on the Placemeter blog. As you’ll see in the comments, Chris is admired at Placemeter but isn’t employed there.