A Nobel Prize for a Behavioral Economics Pioneer: Are there Lessons for (Utility) Regulation?

Guest Blog | December 27, 2017 | Energy Efficiency, Energy Policy, Utilities

This is a guest post written by Scott Hempling, which originally ran on his law firm’s blog in November 2017 here.

Richard Thaler has won the Nobel Prize in Economics for undermining the “rational actor” assumption central to economics.  He proved that humans’ economic decisions are afflicted by systematic biases.  His discoveries have direct application to utility regulation.

Less interested in economists’ formulas than in humans’ quirks, a young Thaler began keeping “The List”:  examples of decisions—including economists’ decisions—that were economically irrational.  Then, inspired by the groundbreaking work of psychologists Amos Tversky and Daniel Kahneman, Thaler spent decades proving that people have consistent, predictable biases that distort decision-making.  Thaler describes his work superbly in Misbehaving:  The Making of Behavioral Economics (2015).  The book is both autobiography and economic history, because Thaler’s career made economic history.

Thaler also wrote, with Cass Sunstein, the great book Nudge:  Improving Decisions About Health, Wealth, and Happiness (2008), whose applications to utility regulation I discussed in a prior essay.  Kahneman himself won the Economics Nobel in 2002, a prize Tversky would have shared had he lived.  Kahneman detailed his discoveries in his masterpiece, Thinking, Fast and Slow (2011); the entire subject was recently retold for popular readership by Michael Lewis in The Undoing Project:  A Friendship That Changed Our Minds (2017).

Reading these four books causes one to ask:  Might the biases discovered by these intellectual eminences affect utility regulation?  (I use “bias” not in the conventional sense of having a closed mind, or a predisposition to favor one side of a debate, but rather in the Thalerian sense of having a propensity to make decisions based on irrelevant factors.)   Among the many bias-types discovered by Kahneman, Tversky and Thaler, consider these three.

Anchoring:  Are decisions affected by irrelevancies?  Experimenters rigged a wheel of fortune so that it always stopped on 10 or 65.  Two separate groups of students were told to watch the wheel spin, write down the resulting number, and then guess the percentage of African nations in the United Nations.  For the group whose wheel stopped at 10, the average answer was 25%.  For the group whose wheel stopped at 65, the average answer was 45%.  (Thinking, Fast and Slow at 119.)  The wheel’s number was the anchor; the anchor influenced the guess.  Real-world implication?  “If you consider how much you should pay for a house, you will be influenced by the asking price.”  Id. at 122.

When a company proposes a $75 million rate increase, that number becomes the anchor.  Then when the Commission allows an increase of $40 million, its press release says, “We saved customers $35 million.”  The anchor influenced the case, the outcome, and the explanation.  The anchored question was:  “Has the company justified a $75 million increase?”  The right question is:  “What should it cost to supply this service territory?”

To avoid anchoring—to get the estimate right—one must first detect the anchor, then reduce its influence.  Both steps require a clear head:  “People adjust less (stay closer to the anchor) when their mental resources are depleted, either because their memory is loaded with digits or because they are slightly drunk.”  Thinking, Fast and Slow at 121-122.

Halo effect:  You work for a charitable organization.  At a party you meet John, whom you find very amiable.  You like John, so you put him on your list of possible donors.  But amiability does not equal generosity.  You’ve experienced the halo effect, because you associated amiability with generosity—without a speck of data to support the association.  You viewed the relationship between amiability and generosity as “simpler and more coherent than the real thing.”  Thinking, Fast and Slow at 82.

Electricity is awesome, literally:  Flip a switch, and 500 miles away a puff of smoke appears; your light goes on.  Also awesome is utility financing.  Tasked with financing new generation and transmission, the CFO somehow coaxes $2 billion from diverse lenders and shareholders.  Does our awe of technology and finance, our respect for the CEO’s engineering expertise, cause us to think well of a utility’s rate case request?  Anyone who dismisses this tendency is likely underestimating the halo effect.  “The sequence in which we observe characteristics of a person … matters … because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.”  Id.  Rather than deny the halo effect, we should acknowledge it—then mitigate it.  We can mitigate it by seeking independent sources of information and, as in good police procedure, by preventing those sources from coordinating their testimony.  Id.

Narrow framing:  A major media holding company owned 23 publications, each run by a separate executive.  In a joint meeting, Thaler asked each one:  If you had a chance to make an investment with a 50% chance of making a profit of $2 million and a 50% chance of losing $1 million, would you do it?  Twenty of the 23 said no:  As one executive explained, success would get him “a pat on the back and a bonus equal to three months income,” while failure could get him fired.  The holding company CEO saw it differently:  If all 23 made the investments, there would be, based on the probabilities, an expected total gain of $11.5 million.  The problem was narrow framing:  The holding company’s compensation practices caused each executive to focus only on her own situation rather than on the total company situation.  See Misbehaving at 188.
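The CEO’s $11.5 million figure is simply expected value applied across all 23 investments.  A minimal sketch of that arithmetic (in Python; the variable names are mine, the figures come from Thaler’s example as quoted above) might look like this:

```python
# Expected-value arithmetic behind Thaler's narrow-framing example.
# Figures are taken from the passage above; this is illustrative only.

p_win, gain = 0.5, 2_000_000   # 50% chance of a $2 million profit
p_lose, loss = 0.5, 1_000_000  # 50% chance of a $1 million loss
executives = 23                # publications, each run by one executive

ev_per_investment = p_win * gain - p_lose * loss   # $500,000 per bet
ev_all = executives * ev_per_investment            # $11,500,000 in total

print(f"Expected gain per investment: ${ev_per_investment:,.0f}")
print(f"Expected gain if all {executives} invest: ${ev_all:,.0f}")
```

Each bet looks risky from an individual executive’s narrow frame, but in the CEO’s broad frame the portfolio’s expected gain is large—and, assuming the 23 bets are roughly independent, the chance that the portfolio loses money overall is small.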

Are there situations in utility regulation—on the seller or the buyer side—where each individual action (or inaction) makes sense to the actor, but the combination of actions or inactions does not?  If so, we suffer from narrow framing.  Narrow framing also occurs when we base a decision only on the arguments and evidence presented by opposing parties, rather than asking “What would I need to know before I formed an opinion about [this particular issue]?” (Thinking, Fast and Slow at 86), and then acquiring all that necessary information.

*  *  *

Economists initially objected to these discoveries, arguing that rational people cannot be irrational.  Tversky responded succinctly:  “A theory of vision cannot be faulted for predicting optical illusions.  Similarly, a descriptive theory of choice cannot be rejected on the grounds that it predicts ‘irrational behavior’ if the behavior in question is, in fact, observed.”  The Undoing Project at 286.  Thaler got on his colleagues’ nerves, because he used facts and logic to expose looseness in conventional assumptions.  A critic of his profession, he is making it better.

I invite readers to send me examples of how inadvertent biases—as defined here—affect regulatory decision-making.  Like Thaler and his colleagues, together we can advance our knowledge and improve our profession.
