By James Kwak
Don’t get me wrong: I like behavioral economics as much as the next guy. It’s quite clear that people are irrational in ways that the neoclassical model assumes away, and you can’t see human nature quite the same way after hearing Dan Ariely talk about his experiments on cheating. But I don’t think cognitive fallacies are the answer to everything, and I don’t think you can explain away the myriad crises of our time as the result of them, as Richard Thaler does in his recent New York Times article.
Like many people, Thaler wants to write about the parallels between the financial crisis and the BP oil leak. For Thaler, the root cause of both crises is that “people in general are not good at estimating the true chances of rare events, especially when human error may be involved” — catastrophic market seizures in the former case, catastrophic oil rig explosions in the latter.
I have no doubt that it is true that people have problems estimating the chances of certain rare events.* But to stop there is to whitewash the sins of the companies and the executives who created these crises.
First, it doesn’t do to say that ordinary people are irrational in making ordinary everyday decisions, and therefore we have to accept that companies will be irrational in making big decisions — like, say, whether to drill holes in the Earth’s crust a mile under the ocean. These people, as the saying goes, make the big bucks precisely to make these decisions, and we expect them to use a little more reasoning than the kind we evolved on the African grasslands.
The problem isn’t that people have cognitive biases in assessing unlikely events. When you’re dealing with a big company like Citigroup or BP, you have many people applying lots of clever thinking to these problems. The problem is that there is a systematic bias within these companies against certain assessments and in favor of others. That is, the guy who shouts, “Danger! Danger!” will be ignored (or fired), and the guy who says, “Everything’s fine, the model says disaster can strike only once every hundred million years” will get the promotion — because the people in charge make more money listening to the latter guy. This is why banks don’t accidentally hold too much capital. It’s why oil companies don’t accidentally take too many safety precautions. The mistakes only go one way. You have executives assessing complex situations they don’t even begin to grasp and making the decisions that maximize their corporate and personal profits. (Is BP’s CEO going to give back years of bonuses now?)
On top of that, it isn’t even true, as a matter of fact, that the companies involved failed to estimate the risk of disaster. In a recent Fresh Air interview, Abrahm Lustgarten discussed three internal BP memos, written in 2001, 2004, and 2007, each of which warned that the company’s culture of inattention to safety — “a consistent emphasis of profits and production over safety and maintenance and environmental compliance,” in Lustgarten’s words — was creating a high degree of risk. The problem wasn’t cognitive fallacies; it was that BP employees were almost certainly falsifying internal inspection reports because of pressure to let production go forward.
This isn’t inability to quantify the likelihood of unlikely events; this is willfully looking the other way.
Thaler also wants to make the point that regulators are incapable of understanding the complex technologies involved, whether in finance or in oil exploration. But while this is undoubtedly true to an extent, it also misses the main point.
The most frightening part of Lustgarten’s interview has nothing to do with BP. It’s about the use of hydraulic fracturing (or “fracking,” apparently with no intended reference to Battlestar Galactica) to drill for natural gas. In fracking, a mixture of water and chemicals is injected underground under extremely high pressure to break up rock formations and release trapped natural gas. According to Lustgarten, there is no scientific understanding of what happens to those chemicals — many of which are toxic — and whether they end up in our drinking water. Yet the Energy Policy Act of 2005 bars the EPA from regulating fracking under the Safe Drinking Water Act — by simply stipulating, without proof, that the chemicals are removed after use, and that there is therefore nothing to regulate.
If this reminds you of the Commodity Futures Modernization Act, it probably should. How could this happen? You should listen to the interview because I’m working from memory, but basically the EPA (this was under Bush and Cheney, remember) negotiated the deal with Halliburton and the other gas exploration companies. The EPA agreed to the stipulation, and hence the exemption for fracking, and in exchange the drillers agreed to stop using benzene (or diesel fuel, of which benzene is a component) as a fracking chemical. We now know, however, that the exploration companies simply continued to use benzene anyway.
This is what happens when you have a weak regulatory agency crippled by pressure from above (and political appointees who are opposed to regulation) and a private sector that simply does whatever it pleases in pursuit of profits. It’s not individual irrationality; it’s power, pure and simple. Free market economics has already whitewashed enough egregious corporate behavior. Let’s not repeat that mistake with behavioral economics.
* I have always been puzzled, however, by the fact that sometimes people say we underestimate certain unlikely risks, like financial meltdowns, but sometimes people say we overestimate certain other unlikely risks, like dying in a plane accident or a tornado.