The ability to rationalize is the greatest of all our psychological defense mechanisms. It allows us to defend ourselves from our own stupid mistakes, to cope with great psychic trauma, and to make painful decisions for the greater good, among other things.
For product developers, rationalization is often what allows us to move forward under uncertain conditions and survive incredible blunders. How can we know whether those rationalizations are well-founded decisions or simply self-delusions?
Recently, I watched a segment of the PBS NOVA program Mind Over Money that made me wonder about this delusional streak. It made me think about how we rationalize choices in the face of irrefutable evidence. Simply put, how can we still believe we are right when there is clear evidence that we will be, or have been, disastrously wrong?
In the world of product development, the question translates to dumb decisions like “New Coke,” cold-blooded decisions like the Ford Pinto, and fatally negligent decisions like the Gulf disaster.
What allows us to make decisions based on flimsy evidence? What allows us to make predictions with the full realization that catastrophe looms if we are wrong?
The segment detailed the battle between rationalist and behaviorist economic philosophies. Rationalists believe in an efficient market hypothesis, which starts with the conviction that people are rational about financial matters, and that emotions have no place in decision making. Rationalists say we evaluate risks intuitively, and behave as if we are doing the calculations when we make decisions about money. Behaviorists, on the other hand, believe that our fiscal behavior can be as capricious and whimsical as our emotions, and markets go up or down based on those emotions.
Rationalists claim that prices and markets are always correct, and that there is no need for government to control any part of a rational, efficient market. The problem with the rational model is that it does not predict catastrophes. If we were rational, we would not believe house prices could keep rising forever, far in excess of the homes' underlying worth. If we were rational, we would not panic when an overleveraged asset, like our house, cannot be sold for what we have in it. But we do panic. And in that panic, when everyone tries to unload at once, assets cannot be sold at any price.
Rational economists do not allow for such panic, because it doesn't fit their model. The extensive mathematical models of rational economics have no place for such irrationality. This is precisely what rationalists dislike about behaviorism: because human emotion cannot be quantified, its measures will never fit neatly into a mathematical model. Professor John Cochrane of the University of Chicago Booth School of Business says behaviorists cannot be right because they have no mathematical model. Really? What a fascinating piece of circular logic.
Rationalists also have no place for excessive exuberance, as such behavior cannot be predicted by mathematics. Excessive exuberance is what accounts for a speculative bubble, which is invariably followed by a correction (code for panic).
This country’s devotion to the rationalist economic model has brought forth astonishing self-delusions.
Behaviorists have substantial psychophysical evidence on their side, such as neurological studies showing that financial rewards are processed in the same brain regions that respond to drugs, food, and sex. This suggests that financial decisions are not rational but emotionally driven. To this day, rationalists either deny this evidence or counter it by arguing that even if we are irrational as individuals, the market as a collective behaves rationally.
Unfortunately, the world of new product development is overrun with opportunities for this kind of self-delusion. In our field, we often use rationalization to protect ourselves from the obviousness of our mistakes, to defend our unfounded beliefs, or to excuse all manner of stupid decision-making. How can we guard against such unwarranted self-delusion? We can learn from the failures in the financial arena.
First, we must not force what we think we know into a model. I cannot count the number of small inventors and big-company marketing types who have said that they “know” their product is going to sell, despite tepid market studies. The data, they claim, does not show the truth. When the facts don’t support the model, we must modify the model, not the facts.
We must also make sure the model is based on solid data, and we have to understand and interpret that data correctly. Think of John Nash’s early work, in which he “proved” that people will always try to extract as much from a situation as possible. Much of rationalist economics is built on his theories, which support the notion that greed is good and that benefits will trickle down. Yet Nash’s theory was based on flawed and biased research. Once he emerged from his psychotic phase (remember A Beautiful Mind?), he recanted his conclusions. Nash said that the only people who believed in his results were economists and paranoid schizophrenics.
This discussion is not about whether rationalists or behaviorists are correct in their opinions. It’s about the theorists: individuals who have made a lifetime investment in a particular position and are emotionally and irrevocably tied to a stance that is ultimately not supported by the data. They hold firm anyway, rationalizing to defend their position at any cost.
The moral of this story is that we must all do a little self-analysis. We must acknowledge our emotional involvement in the outcome of whatever we are building, promoting, or protecting. If we can clear our field of vision of that bias, perhaps we can avoid making blind rationalizations that lead us to failure. Perhaps we can better predict and prevent impending disaster.