JPM-Hit by the limits of statistics?

By thetrader
We are amazed by the many articles discussing (and giving advice on) JPM’s “perfectly hedged” CIO trading book, written mainly by journalists who understand neither risk, trading, nor the greeks. JPM, which claims to have one of the world’s most sophisticated risk-management operations, has apparently lost at least 2 billion USD. What actually happened will probably never hit the media. While journalists debate whether or not to drop VaR as a risk measure, we think it is appropriate to review an old article by Nassim Taleb, one of the few clever minds when it comes to understanding “real” risks. From Taleb’s Fourth Quadrant, via Edge.

Statistical and applied probabilistic knowledge is the core of knowledge; statistics is what tells you if something is true, false, or merely anecdotal; it is the “logic of science”; it is the instrument of risk-taking; it is the applied tools of epistemology; you can’t be a modern intellectual and not think probabilistically—but… let’s not be suckers. The problem is much more complicated than it seems to the casual, mechanistic user who picked it up in graduate school. Statistics can fool you. In fact it is fooling your government right now. It can even bankrupt the system (let’s face it: use of probabilistic methods for the estimation of risks did just blow up the banking system).

The current subprime crisis has been doing wonders for the reception of any ideas about probability-driven claims in science, particularly in social science, economics, and “econometrics” (quantitative economics).  Clearly, with current International Monetary Fund estimates of the costs of the 2007-2008 subprime crisis,  the banking system seems to have lost more on risk taking (from the failures of quantitative risk management) than every penny banks ever earned taking risks. But it was easy to see from the past that the pilot did not have the qualifications to fly the plane and was using the wrong navigation tools: The same happened in 1983 with money center banks losing cumulatively every penny ever made, and in 1991-1992 when the Savings and Loans industry became history.


It appears that financial institutions earn money on transactions (say fees on your mother-in-law’s checking account) and lose everything taking risks they don’t understand. I want this to stop, and stop now— the current patching by the banking establishment worldwide is akin to using the same doctor to cure the patient when the doctor has a track record of systematically killing them. And this is not limited to banking—I generalize to an entire class of random variables that do not have the structure we think they have, in which we can be suckers.

And we are beyond suckers: not only, for socio-economic and other nonlinear, complicated variables, are we riding in a bus driven by a blindfolded driver, but we refuse to acknowledge it in spite of the evidence, which to me is a pathological problem with academia. After 1998, when a “Nobel-crowned” collection of people (and the crème de la crème of the financial economics establishment) blew up Long Term Capital Management, a hedge fund, because the “scientific” methods they used misestimated the role of the rare event, such methodologies and such claims on understanding risks of rare events should have been discredited. Yet the Fed helped their bailout, and exposure to rare events (and model error) patently increased exponentially (as we can see from banks’ swelling portfolios of derivatives that we do not understand).
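Taleb’s point about misestimating rare events can be made concrete with a small simulation (a hypothetical illustration, not taken from the article): fit a Gaussian VaR model to returns that actually come from a fat-tailed distribution, and the model systematically understates the extreme-loss quantile — the model’s “certainty” is an artifact of the wrong distributional assumption.

```python
import math
import random
import statistics

random.seed(42)

def student_t(df):
    """Draw from a heavy-tailed Student-t: standard normal / sqrt(chi-square / df)."""
    z = random.gauss(0.0, 1.0)
    chi2 = random.gammavariate(df / 2.0, 2.0)  # chi-square(df) as a gamma variate
    return z / math.sqrt(chi2 / df)

# "True" world: fat-tailed daily P&L (Student-t with 3 degrees of freedom)
pnl = [student_t(3) for _ in range(100_000)]

# What a Gaussian model fitted to the same data reports as 99.9% VaR
mu = statistics.fmean(pnl)
sigma = statistics.stdev(pnl)
gaussian_var = -(mu - 3.090 * sigma)  # 3.090 = 99.9% standard-normal quantile

# What the data actually says: empirical 0.1% quantile of the loss tail
empirical_var = -sorted(pnl)[len(pnl) // 1000]

print(f"Gaussian 99.9% VaR:  {gaussian_var:.2f}")
print(f"Empirical 99.9% VaR: {empirical_var:.2f}")
```

With fat tails the empirical 99.9% loss comes out roughly twice what the Gaussian model predicts; the fitted standard deviation is dominated by the unremarkable middle of the distribution, so the model is most confident exactly where it is most wrong.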

Are we using models of uncertainty to produce certainties?

Full article here

A few Taleb videos worth reviewing here

Chart: VaR distribution, via Wikipedia.