This blog was written by John D. Wilson, former Deputy Director for Regulatory Policy at the Southern Alliance for Clean Energy.

Guest Blog | September 29, 2010
The Breakthrough Institute has midwifed another denier myth with Michael Shellenberger’s introduction to a blog post, “Why Energy Efficiency Does Not Decrease Energy Consumption.” The sensational headline caught my eye, and I found another unfortunate example of BTI’s sometimes incendiary approach to provocation.
AND … the spin advanced by BTI here completely undermines their “idea”: renew the economy, reduce global warming, and achieve energy independence by “Making Clean Energy Cheap.” So I am at a total loss as to why BTI would inject a little spin into the denialist info-cycle.
Evan Mills of Lawrence Berkeley National Laboratory first noted, on the Climate Progress blog, the potential for a new study on the “rebound effect” to become a new tool in the climate denial arsenal of tricks, which resulted in The Economist acknowledging error and retracting some of the most alarming soundbites.
In the paper that sparked the debate, Dr. Saunders and his colleagues have put forward some worthwhile points. It is correct that saving money through energy efficiency frees up income to purchase things that use more energy. As even the Heritage Foundation’s blog points out, “But if you are actually concerned with improving human welfare, it’s quite good news.”
Sensationalism is a widely acknowledged bias in media coverage, with new, counterintuitive research being one of the ways to increase reader interest in a story. But the sensational, counterintuitive idea that energy efficiency doesn’t save energy (also expressed in The Economist) prompted a sharp rebuttal from the study authors: “Your amusing but hopefully tongue-in-cheek conclusions about the ‘greenness’ of incandescent lighting would be, if serious, off-base and in our view potentially harmful.”
But have Dr. Saunders and his colleagues really proven that “Energy Efficiency Does Not Decrease Energy Consumption”? And should the study authors offer that same rebuttal to Michael Shellenberger for his sensational headline? There are two reasons that people should be extremely skeptical of this ill-advised declaration: study method and industry practice.
Econometric models – vulnerable to garbage in, garbage out issues
The value of econometric models in evaluating the effectiveness of energy efficiency investments depends on data quality. Economists who favor this approach have made several attempts to evaluate energy efficiency programs over the past couple of decades. For example, a recent discussion paper by Resources for the Future suggested that the costs of energy efficiency programs were much higher than measured in regulatory proceedings or other direct reporting methods. However, it is our understanding that the RFF paper has not made it into peer-reviewed publication because of these very data issues, which have been raised in peer review.
The basic problem with applying econometric study methods to the field of energy efficiency is that the systematic reporting of energy efficiency cost and impact data is severely deficient. In order to understand whether utilities are offering least-cost solutions in their energy efficiency programs, for example, SACE has resorted to conducting extremely tedious benchmarking with original source data.
The second problem is that the econometric findings can’t be replicated in the real world. Consider the counter-factual: if we *don’t* promote energy efficiency, will energy consumption be about the same as if we do?
How energy efficiency success is actually measured
Energy efficiency programs are operated and evaluated by assuming a “measure life” for each installed technology. Installing that LED light bulb (thanks to an EE program) saves energy up to the point at which the customer would have installed that LED bulb anyway. Typical measure lives for something like that are 5–15 years.
Longer measure lives are credited to programs that involve up-front modifications to infrastructure. For example, daylighting in a new commercial building will save energy over the entire life of the building and would be impossible to install as a retrofit. A program that assists builders with this installation will result in energy savings that are typically rated at 40 years.
So to conclude that “Energy Efficiency Does Not Decrease Energy Consumption,” one would have to find that the energy savings over the measure lives of that equipment are canceled out by additional energy use that would never have occurred if those energy efficiency measures had not been encouraged. This is very different from checking whether the reduction in energy use from switching from a 100 watt bulb to a 10 watt bulb (for the same amount of light) is exactly a 90% energy savings. If a study isn’t designed with an accurate understanding of how energy efficiency programs work, it won’t be a study of energy efficiency programs.
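To make the measure-life arithmetic concrete, here is a minimal sketch of the kind of calculation program evaluators do. The wattage, operating hours, and measure-life figures below are hypothetical illustrations, not data from any actual program.

```python
# Hypothetical measure-life savings calculation.
# All figures below are illustrative assumptions, not program data.
OLD_WATTS = 100          # incandescent bulb being replaced
NEW_WATTS = 10           # efficient replacement, same light output
HOURS_PER_YEAR = 1000    # assumed annual operating hours
MEASURE_LIFE_YEARS = 10  # years before the customer would have switched anyway

def lifetime_savings_kwh(old_w, new_w, hours_per_year, measure_life_years):
    """Energy saved over the measure life, in kilowatt-hours."""
    return (old_w - new_w) * hours_per_year * measure_life_years / 1000

print(lifetime_savings_kwh(OLD_WATTS, NEW_WATTS,
                           HOURS_PER_YEAR, MEASURE_LIFE_YEARS))  # 900.0 kWh
```

Note that the 90-watt reduction per socket only counts as program savings for the assumed measure life; after that, the evaluation assumes the customer would have made the switch anyway.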
There are three ways that these measures help save energy. The first is direct – by saving energy in the sockets where the bulbs are replaced or the buildings where the unnecessary lighting is never installed. This is the primary area where “rebound” research has focused, and as Michael Shellenberger acknowledges, the increased direct use of energy in that same building tends to result in about a 5% reduction in the technical estimate for energy savings.
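The direct rebound adjustment described above can be sketched in a few lines. The ~5% figure is the direct-rebound estimate cited in this post; the baseline savings number is an assumption carried over for illustration.

```python
# Applying a direct rebound adjustment to a technical savings estimate.
# The 5% rebound rate is the figure cited in the post; the baseline
# savings value is a hypothetical illustration.
TECHNICAL_SAVINGS_KWH = 900.0  # engineering estimate before rebound
DIRECT_REBOUND = 0.05          # ~5% of savings taken back by increased use

realized_savings_kwh = TECHNICAL_SAVINGS_KWH * (1 - DIRECT_REBOUND)
print(realized_savings_kwh)  # 855.0 kWh actually saved
```

Even after the adjustment, the overwhelming majority of the technical savings estimate survives, which is the point: direct rebound trims savings, it does not erase them.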
In their paper, Dr. Saunders and his colleagues seem to be saying that the upstream impact of the financial savings for that building owner (and other beneficiaries) will be to increase spending and thus drive up overall energy consumption. Use less energy, buy more stuff (which needs energy to be made). Lighting points out a conceptual flaw in this otherwise persuasive model: for many customers, energy savings is only a small percentage of the economic benefits of installing efficient lighting. One of the other significant benefits of efficient lighting is buying fewer lightbulbs. Use less energy, buy less stuff (which also saves energy). I’ve never seen an efficiency program that takes credit for this indirect energy savings.
The second is much smaller: reduced demand on the electric grid results in efficiency gains in transmission and generation (electricity is made and delivered a bit more cheaply for everyone). What Dr. Saunders seems to be saying is that the global reduction in energy costs will free up enough capital to invest in equipment that will use more energy. Here’s where we have to rely on econometric analysis, which can be very powerful, but only if the data and models have demonstrated an ability to reflect reality.
Market transformation is the third way that these measures save energy. It isn’t clear to me whether Dr. Saunders and his colleagues are engaging this topic directly. But the basic idea here is that by investing in energy efficiency, less wasteful technologies are adopted more quickly by the market in general.
It is really hard to see how accelerating the adoption of energy efficient technologies will be a zero-sum game. (Take it the other direction: Would we use just as much energy if we regressed to whale oil?) Isn’t rapid deployment of new clean energy technology the “big idea” that The Breakthrough Institute is trying to promote?
Make Clean Energy Cheap
Drive down the price of clean energy technologies with large-scale public investments in research, development, demonstration, and deployment.
So if making clean energy cheap just means that we use so much of it that it’s not clean, then what’s the point?
Update: Subsequent to posting this blog, The Breakthrough Institute revised its post with attribution to “Breakthrough Staff” rather than Jesse Jenkins. I am advised that Michael Shellenberger, not Jesse Jenkins, wrote the introduction with which I am mainly concerned in this commentary, and have made appropriate revisions above.