Was Shiller really expecting banks or Wall Street firms to fail? "Get real," I thought.
That was January. This is March. The other day, American Banker reported that the watch list of banks sick enough to maybe die had about four-score names on it. Today, the NY Times tells us Bear Stearns "now faces the prospect of the end of its 85-year run as an independent investment bank."
How did we get into this mess? How did the making and reselling of mortgage loans obtained by people who couldn't pay them back – much less afford a down payment – create such toxicity that august financial institutions around the globe have turned into cowardly lions?
In a more recent op-ed piece, Professor Shiller looks at why even experts can suffer from herd behavior:
If people do not see any risk, and see only the prospect of outsized investment returns, they will pursue those returns with disregard for the risks.

A generation ago, another Yale scholar, psychologist Irving Janis, became interested in herd behavior because of the Bay of Pigs fiasco. How could JFK and his advisers not have recognized the hare-brained scheme for what it was? Janis saw the problem as "groupthink." This Yale Alumni Magazine story explores his theory.
Were all these people stupid? It can’t be. We have to consider the possibility that perfectly rational people can get caught up in a bubble. In this connection, it is helpful to refer to an important bit of economic theory about herd behavior.
Three economists, Sushil Bikhchandani, David Hirshleifer and Ivo Welch, in a classic 1992 article, defined what they call “information cascades” that can lead people into serious error. They found that these cascades can affect even perfectly rational people and cause bubblelike phenomena. Why? Ultimately, people sometimes need to rely on the judgment of others, and therein lies the problem.
Mr. Bikhchandani and his co-authors present this example: Suppose that a group of individuals must make an important decision, based on useful but incomplete information. Each one of them has received some information relevant to the decision, but the information is incomplete and “noisy” and does not always point to the right conclusion.
Let’s update the example to apply it to the recent bubble: The individuals in the group must each decide whether real estate is a terrific investment and whether to buy some property. Suppose that there is a 60 percent probability that any one person’s information will lead to the right decision.

* * *

Suppose houses are really of low investment value, but the first person to make a decision reaches the wrong conclusion (which happens, as we have assumed, 40 percent of the time). The first person, A, pays a high price for a home, thus signaling to others that houses are a good investment.
The second person, B, has no problem if his own data seem to confirm the information. . . . But B faces a quandary if his own information seems to contradict A’s judgment. In that case, B would conclude that he has no worthwhile information, and so he must make an arbitrary decision — say, by flipping a coin to decide whether to buy a house.
The result is that even if houses are of low investment value, we may now have two people who make purchasing decisions that reveal their conclusion that houses are a good investment.

* * *

Mr. Bikhchandani and his co-authors worked out this rational herding story carefully, and their results show that the probability of the cascade leading to an incorrect assumption is 37 percent. In other words, more than one-third of the time, rational individuals, each given information that is 60 percent accurate, will reach the wrong collective conclusion.
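That 37 percent figure is easy to check for yourself. The little simulation below is my own sketch of the quoted example, not code from Shiller or from the Bikhchandani-Hirshleifer-Welch paper: it assumes houses are truly a poor investment, gives each person a private signal that is right 60 percent of the time, breaks a contradicted signal with a coin flip as the quote describes, and treats a matched pair of decisions as locking in a cascade. The names (P_CORRECT, one_run, and so on) are mine.

import random

P_CORRECT = 0.6      # chance a private signal points to the truth
TRIALS = 200_000     # number of simulated decision sequences

def signal_says_buy() -> bool:
    # Houses are assumed to be a poor investment, so "buy" is the
    # wrong reading; a private signal says "buy" 40% of the time.
    return random.random() >= P_CORRECT

def one_run() -> bool:
    """Return True if the sequence ends in a mistaken buy-cascade."""
    while True:
        a_buys = signal_says_buy()        # A simply follows her own signal
        b_signal = signal_says_buy()
        if b_signal == a_buys:            # B's signal confirms A's move
            b_buys = b_signal
        else:                             # contradiction: B flips a coin
            b_buys = random.random() < 0.5
        if a_buys and b_buys:
            return True                   # wrong cascade: everyone after imitates
        if not a_buys and not b_buys:
            return False                  # correct cascade
        # A and B split, so the public record reveals nothing and the
        # next pair starts from scratch.

wrong = sum(one_run() for _ in range(TRIALS))
print(f"Share of runs ending in a wrong cascade: {wrong / TRIALS:.3f}")

Run it and the frequency lands near 0.368. The arithmetic agrees: a pair both buy with probability 0.4 × (0.4 + 0.6 × 0.5) = 0.28 and both pass with probability 0.6 × (0.6 + 0.4 × 0.5) = 0.48, so once a cascade does form, it is the wrong one 0.28 / 0.76 ≈ 37 percent of the time, just as Shiller reports.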