Wednesday, June 10, 2009

A Model Mania

Here's a quote from Tyler Cowen's new paper about what caused the crisis.

Once we liberate ourselves from applying the law of large numbers to entrepreneurial error, as Black urged us, another answer suggests itself.  Investors systematically overestimated how much they could trust the judgment of other investors. Investment banks overestimated how much they could trust the judgment of other investment banks. Purchasers of mortgage-backed securities overestimated how much they could trust the judgment of both the market and the rating agencies as to the securities' values. A commonly held view was that although financial institutions had made large bets, key decision makers had their own money on the line and thus things could not be all that bad. Proceeding on some version of that assumption, most market participants (and regulators) held positions that were increasingly vulnerable to systemic financial risk.

What he's trying to do is come up with a theory where a bubble can be seen as a rational response on the part of the participants, hence integrating it into what economists call "rational expectations" theory.  Normally, the theory is aimed at proving that the market will always revert to the mean.  Some people overestimate how much something is worth, other people underestimate it, and in aggregate these errors cancel out.  If this were true, it would generally be fine, statistically speaking, to look at what your neighbor was doing and copy it (i.e. trust him), because your neighbor would be bound to be within certain error bars of reality, even if he were doing the same type of copying.  It's true, a crowd where people copy one another could fall into a pattern of systematically misestimating the value of something if it started with just the wrong seed, but given that the crowd will never have just one seed, this is probably not a robust mechanism for introducing error.  Even if most of the people just copied their neighbors, as long as some percentage acted independently, you would still have an overall reversion to the mean.  The only difference would be in the size of the acting units, if that makes sense.  There would be 'lead steers' who each influenced a whole herd of people, but there would be several such steers, and in general their errors, and hence the errors of their followers, could be expected to cancel one another out. 
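To make that intuition concrete, here's a toy sketch (nothing from Cowen's paper; the function name, the true value of 100, and the noise level are all made up for illustration): even when the vast majority of agents are pure copiers, the crowd's mean estimate stays near the truth, because the several steers' independent errors cancel.

```python
import random

random.seed(0)
TRUE_VALUE = 100.0

def population_estimate(n_steers, followers_per_steer, noise=10.0):
    """Each 'lead steer' forms an independent, unbiased (but noisy)
    estimate of the true value; its followers copy it verbatim.
    Returns the mean estimate across the whole population."""
    estimates = []
    for _ in range(n_steers):
        steer_view = random.gauss(TRUE_VALUE, noise)
        estimates.append(steer_view)                          # the steer itself
        estimates.extend([steer_view] * followers_per_steer)  # its herd
    return sum(estimates) / len(estimates)

# ~95% of agents are pure copiers, yet the crowd's mean still lands
# in the neighborhood of the true value of 100.
print(population_estimate(n_steers=500, followers_per_steer=20))
```

The copying only coarsens the effective sample, from 10,500 individuals down to 500 independent herds; it doesn't bias the mean, which is the point of the 'lead steers' argument above.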

If you reduced the number of independent actors (increased the number of copiers) as a percentage of the population, there would presumably be a tipping point where any misestimation could spread unchecked, with no tendency to revert to the mean, and indeed with a tendency to echo around this hermetic ring of imitators and reinforce itself.  You could look at this as a sort of mathematical version (assuming you needed one) of the theory that misaligned incentives can lead to a bubble.  If your neighbor only makes money when he correctly bets on some underlying reality, then you can trust him to have independent incentives, and in general it's no problem to copy him.  If instead he just gets paid based on the number of people who trust him, then of course he's not trustworthy at all, and you should steer clear of imitating him.
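A toy version of that tipping point (again purely illustrative; the parameter names and the 50-point seed bias are inventions): start everyone on the same wrong estimate, let copiers adopt a random peer's view each round, and vary the fraction of agents who independently re-anchor to reality. Any positive fraction of independents eventually pulls the mean back; with none, the misestimation just echoes around the ring forever.

```python
import random

random.seed(1)
TRUE_VALUE = 100.0

def run_market(independent_frac, n_agents=1000, rounds=200, seed_bias=50.0):
    """Everyone starts out sharing the same misestimate. Each round,
    independents re-anchor to reality (plus noise), while copiers
    adopt a random peer's current estimate. Returns the final mean."""
    n_indep = int(n_agents * independent_frac)
    estimates = [TRUE_VALUE + seed_bias] * n_agents  # the wrong 'seed'
    for _ in range(rounds):
        for i in range(n_agents):
            if i < n_indep:
                estimates[i] = random.gauss(TRUE_VALUE, 10.0)
            else:
                estimates[i] = estimates[random.randrange(n_agents)]
    return sum(estimates) / n_agents

print(run_market(0.10))  # a few independents: mean drifts back toward 100
print(run_market(0.00))  # hermetic ring of imitators: stuck at 150 forever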

But what's going on here with this notion of "underlying reality"?  What the fuck is that supposed to mean when we know that this reality is created by the market participants themselves?  A simpler and more immanent definition of trust wouldn't make reference to any underlying reality or independence, but just to the idea that you can trust someone whose incentives are aligned with your own -- you can trust someone who only wins if you do.  But this type of trust still leaves a system vulnerable to delusional feedback loops.  Everyone's incentives might line up and we might all march off the cliff together. 

Now I'm thinking that you really get to the core of the problem when you see this in terms of zero and non-zero-sum games.  Trust comes fundamentally from alignment of incentives, that is, from playing games where we both win or lose together -- trust would be warranted in these instances.  Conversely, you cannot trust someone whose outcome is not correlated with your own, and especially not if you are competing with them, winning when they lose and vice versa -- here trust is unwarranted.  It seems to me that you can get a bubble in either of these cases.  Cowen points out that too much trust where it is not warranted can lead the copying mechanism to diverge from the underlying reality.  But lots of trust even where it is warranted can also lead to out-of-control copying.  The difference is that in this second case the divergence can be self-fulfilling, because it actually alters the reality.  I suspect that this might be a consequence of the mutuality of the one situation (warranted trust runs in both directions) versus the lack of such a requirement in the other (unwarranted trust doesn't require that we both win, and so may leave open the possibility of a global non-zero-sum game), but I can't quite see this clearly.
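Here's a toy feedback loop in the same spirit (illustrative only; the `momentum` knob and the fundamentals value are made up): when each period's consensus belief chases the last price move, and the price simply becomes that belief, a small shock either settles down or compounds without limit, depending on how hard beliefs chase the trend. The compounding case is the self-fulfilling one, where the copying really does alter the "reality" being estimated.

```python
def reflexive_price(momentum, fundamentals=100.0, rounds=50):
    """Each round, the consensus belief is the last price plus some
    trend-chasing, and the price simply becomes that belief -- the
    estimates feed back into the 'reality' they are estimating.
    Returns the price after the final round."""
    prices = [fundamentals, fundamentals + 1.0]  # a small initial shock
    for _ in range(rounds):
        trend = prices[-1] - prices[-2]
        prices.append(prices[-1] + momentum * trend)
    return prices[-1]

print(reflexive_price(momentum=0.5))  # weak trend-chasing: shock settles near 102
print(reflexive_price(momentum=1.5))  # strong trend-chasing: self-fulfilling blow-up
```

Below a momentum of 1 each echo of the shock is smaller than the last and the price converges; above 1 each echo is larger and the divergence feeds on itself, with no external anchor to ever pull it back.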

Anyhow, I'm not sure this is anything but a poor restatement of the ideas Soros talks about.  I do find it interesting to think about the light and dark sides of feedback loops, though.  We only get somewhere in the long run by working together with our incentives aligned, but our tendency towards this is so easy to abuse, either because we figure out a way to fake the alignment of incentives for a time, or because we really do jump into the same handbasket to hell.  To wax philosophical for a moment, we might conclude that "reality" is just about time scales and sustainability.  So when Cowen says:

One view of rational expectations is that investors' errors will cancel one another out in each market period. Another view of rational expectations is that investors' errors will cancel one another out over longer stretches of time but that the aggregate weight of the forecasts in any particular period can be quite biased owing to common entrepreneurial misunderstandings of observed recent history. In the latter case, entrepreneurial errors magnify one another rather than cancel one another out. That is one simple way to account for a widespread financial crisis without doing violence to the rational expectations assumption or denying the mathematical elegance of the law of large numbers.

we can only reply, with Keynes:

The long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is past the ocean is flat again. 


