This blog was written by John D. Wilson, former Deputy Director for Regulatory Policy at the Southern Alliance for Clean Energy.

Guest Blog | July 19, 2018
In 2016, there was an unusual spike in the operation of natural gas combustion turbine plants at several Southeastern utilities. At just 14 plants, fuel expenses increased by $210 million compared to recent years.
Across the Southeast, utilities increased gas peaker generation by 80%, compared to average use in 2010-14. Most of the increase occurred at 14 of 95 peaking plants in the Southeast. Duke Energy’s three operating companies were responsible for about 40% of the spike.
Just why such a spike might occur is a bit of a mystery. I’ll suggest a few theories below, but I am hoping that readers will comment on this and suggest a more plausible theory (or combination of theories).
What’s at stake?
Burning natural gas in peaker plants affects our climate twice. Most obviously, through the carbon pollution that rises from the plant's smokestack. But as Bill McKibben writes, “… any methane that escapes unburned into the atmosphere on the way to the power plant warms the planet very effectively — so effectively that if you leak more than 2 or 3 percent, it’s worse for climate change than coal.”
Those leaks are larger than official estimates suggest, according to a recent Environmental Defense Fund (EDF) study of the natural gas supply chain. “The bottom line is that the amount of methane being lost across the supply chain is substantial, it’s higher than current estimates from the EPA, and the amount is large enough to double the climate footprint of natural gas on a 20-year basis,” Ramon Alvarez, associate chief scientist at EDF and lead author of the study, said in a telephone interview.
As EDF points out, the oil and gas industry has “significant opportunities … to dramatically and cost-effectively reduce methane emissions today.” In the Southeast, where there is relatively little oil and gas industry association with methane emissions, the biggest thing we can do to reduce methane emissions is to reduce the use of natural gas. It’s complicated, but basically the leak rate is a percentage of total consumption, so leaks increase (or decrease) when natural gas consumption goes up (or down). Until industry acts to stop the leaks, more natural gas consumption means more methane in our atmosphere, and an acceleration of global warming.
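That proportionality can be sketched in a few lines. The leak rate and warming potential below are illustrative assumptions drawn from the EDF figures quoted above, not measurements for any specific utility:

```python
# Back-of-envelope: upstream methane leakage scales with gas consumption.
# Both constants are assumptions for illustration, not measured values.
LEAK_RATE = 0.023      # ~2.3% of gas lost across the supply chain (EDF estimate)
CH4_GWP_20YR = 84      # methane's 20-year global warming potential (IPCC AR5)

def leaked_methane_co2e(gas_burned_tons: float) -> float:
    """CO2-equivalent warming from methane leaked before the gas is burned."""
    leaked_tons = gas_burned_tons * LEAK_RATE
    return leaked_tons * CH4_GWP_20YR

# Cutting consumption cuts leakage in direct proportion:
# burning 10% less gas avoids 10% of the leaked methane as well.
```

The key point is the linearity: until the industry fixes the leaks themselves, total methane lost rises and falls with total gas burned.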
Natural gas combustion turbines are both a small and a large part of the Southeast’s power grid. They are a small part in the sense that they supply only about 2-3% of the region's electricity. Typically they run at only about 5% of their maximum potential because they are relatively inefficient: their fuel costs alone are at least double those of the average power plant in the Southeast.
Even though they are costly to operate, utilities use gas peakers to supply power for relatively short periods, or for special needs such as maintaining reliability at a specific location. The high operating cost is offset by the relatively low cost to build them, which is why peaking plants often serve as the cost benchmark for meeting short-term power needs. Because utilities are committed to providing reliable electric service, there are a lot of natural gas peakers in the Southeast: the 95 peaking plants that operated in 2016 represented about 17% of the region's power generation capacity.
The overall 80% increase in natural gas peaker generation in 2016 was concentrated at just 14 power plants operated by just five of the Southeast’s power systems. As shown below, generation at these 14 power plants increased by almost 150% in 2016 relative to the 2010-14 baseline.
Natural Gas Peaker Plant Generation in the Southeast, 2010-2016 (MWh, or megawatt-hours)
Note: I excluded 2015 from the baseline because most of the increase in that year was due to a very large, one-year spike at a single plant (Santee Cooper’s John S Rainey). One-year spikes at individual plants are not unusual even though this one was very large.
Some Unsatisfying Theories
I’m a bit confounded because I can’t think of something that these five utility systems have in common that would explain the large and sudden increase in generation from gas peakers. Even if one discounts Santee Cooper and the Florida municipals – which each have just a single plant spiking – none of my theories fit Duke Energy (Carolinas and Florida), Oglethorpe Power (Georgia) and the Tennessee Valley Authority (TVA) equally well without also applying to other utilities that did not show the same spike.
Solar generation – In 2016, solar generation in the Carolinas doubled from 1.1 to 2.2 million MWh. Perhaps Duke Energy needed more peakers to accommodate variable generation from solar?
I doubt it. First, there was no similar increase in solar generation in the territories of Duke Energy Florida, TVA or Oglethorpe Power (a cooperative utility system in Georgia). Second, the increase of roughly 5 million MWh in gas peaker generation in the Carolinas dwarfs the solar generation increase for that year.
Fuel cost drop – Fuel costs did drop in 2016. In 2010-2015, gas cost $48-79 per MWh at Southeastern peaker plants; in 2016, the reported cost dropped to just $42 per MWh, an average decrease of $25 per MWh relative to the 2010-14 baseline.
However, while most of the gas peakers reported similar cost differences, there were other utilities in the Southeast that reported even greater reductions in the cost of gas. Florida Power & Light (FPL) and Alabama Power’s reported fuel costs dropped by $40 and $37 per MWh, respectively, but did not report gas peaker generation at anywhere close to the scale reported by Duke Energy.
It is quite likely that the drop in the cost of natural gas fuel is part of the reason for the spike in gas peaker generation. But there must be something specific about the plant dispatch decisions at Duke Energy, TVA and Oglethorpe that explains the distinct spike.
Lack of alternatives – Another possibility is that the ongoing transition from coal to natural gas combined cycle plants required a “bridge” as plant completions occurred. Perhaps combined cycle plants were simply being operated at maximum output and there was no good alternative besides a gas peaker? That doesn’t appear to be the case, as capacity factors for these plants averaged well below their 70-80% potential at TVA (54%), Duke Energy Florida (59%), and Oglethorpe (64%). Only Duke Energy in the Carolinas (72%) might have been constrained from using combined cycle plants more aggressively – but Georgia Power operated its plants at 67% capacity factor without triggering an increase in gas peaker use. So this might have been a factor in the Carolinas, but it’s not a particularly compelling theory.
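The capacity factors cited above come from a simple ratio of actual to maximum possible output. A sketch, using hypothetical plant numbers rather than any utility's actual data:

```python
def capacity_factor(generation_mwh: float, capacity_mw: float, hours: int = 8760) -> float:
    """Share of maximum possible annual output a plant actually produced."""
    return generation_mwh / (capacity_mw * hours)

# A hypothetical 1,000 MW combined cycle fleet generating 4.73 million MWh
# in a year ran at roughly a 54% capacity factor -- well below the
# 70-80% these plants can sustain.
cf = capacity_factor(4_730_400, 1_000)
```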
Operational efficiency – A final theory I tested was that utilities were shifting from the even more inefficient natural gas steam plants to natural gas peakers. This theory is definitely not supported by the facts. Florida Power & Light used to be the region’s leader in natural gas fueled steam plants, but has cut back on this technology dramatically. The only other significant trend in this technology is an increase in its use by Duke Energy Florida, which is likely due to transitions in generation resources that are underway.
While fuel cost and limits on existing combined cycle plants may explain part of the increase, I think it remains a mystery why generation at these particular gas peakers spiked in 2016.
Why should anyone care? Well, even though the cost of natural gas is down, these increases resulted in huge costs that are passed through directly to customers. The fuel expenses at the 14 plants listed above increased from an average of $357 million per year to $567 million in 2016, despite fuel prices dropping from $65 to $42 per MWh. So the answer to this mystery is literally a $210 million question.
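The reported spend and price figures imply how much generation volume jumped. A quick cross-check of the arithmetic (the rounding is mine):

```python
# Implied generation, from reported fuel spend divided by reported price:
baseline_mwh = 357e6 / 65    # ~5.5 million MWh/year in 2010-14, at $65/MWh
spike_mwh = 567e6 / 42       # 13.5 million MWh in 2016, at $42/MWh

# Volume grew roughly 2.5x -- consistent with the ~150% increase in
# generation at these plants -- which is why spending rose even as the
# price per MWh fell by about a third.
growth = spike_mwh / baseline_mwh
extra_cost = 567e6 - 357e6   # the $210 million question
```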
Carbon emissions are also up. Carbon dioxide emissions from gas peakers increased from an average of 9.4 to 16.7 million tons in 2016. For comparison, total carbon emissions in 2016 from Southeastern utilities were 428 million tons. The 2016 carbon total was the lowest of the modern era (carbon dioxide emissions in 2010 were 530 million tons), but if natural gas combined cycle plants had been used instead of peakers, roughly 2 million tons of carbon emissions could have been avoided.
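A rough sketch of where a figure of that size comes from: the peaker emission rate is backed out of the reported totals above, while the spike's size and the combined cycle emission rate are my assumptions for illustration.

```python
# Rough avoided-emissions estimate. The spike size and the combined cycle
# emission rate are illustrative assumptions, not reported values.
extra_gen_mwh = 8.0e6                 # approximate size of the 2016 generation spike
peaker_rate = (16.7e6 - 9.4e6) / extra_gen_mwh   # ~0.91 tons CO2/MWh, from reported totals
cc_rate = 0.65                        # assumed combined cycle rate, tons CO2/MWh

# Serving the same extra generation from combined cycle plants instead:
avoided_tons = extra_gen_mwh * (peaker_rate - cc_rate)   # ~2.1 million tons
```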
I hope a reader of this blog can solve the mystery. Entries from Duke Energy staff are warmly encouraged!