
Thursday, February 26, 2009

Recipe for Disaster

The limits of easy oil availability clearly caused the massive economic downturn that started last year.

In the Wired article "Recipe for Disaster: The Formula That Killed Wall Street", the author tries to convince us that a single equation devised by David X. Li caused the financial crisis.

But we really can't blame David X. Li.

I went and hunted down Li's original paper and came to the conclusion that he tried his best to actually take into account the risk of dependent events. It just turned out that the math he invoked gave a simple metric that people flocked to because it boiled down to a single number. His Gaussian copula function transforms a pair of independent probabilities into something that has a dependency, thus raising the risk of default from a multiplicative probability of small numbers to something orders of magnitude higher. This is actually a proper move. Somebody would sooner or later have discovered this heuristic, and Li happened to get there first.
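For the record, the much-cited formula (as quoted in the Wired piece) couples two marginal default-time distributions through a bivariate normal:

    Pr[T_A < 1, T_B < 1] = \Phi_2( \Phi^{-1}(F_A(1)), \Phi^{-1}(F_B(1)), \gamma )

A minimal numerical sketch of the effect, assuming made-up 1% one-year default probabilities and an assumed correlation of 0.7 (scipy's bivariate normal CDF does the work):

    # Minimal sketch of the Gaussian copula effect on joint default risk.
    # The 1% marginals and the 0.7 correlation are made-up numbers.
    from scipy.stats import norm, multivariate_normal

    p_a = p_b = 0.01   # one-year default probability of each bond
    rho = 0.7          # assumed default correlation

    p_independent = p_a * p_b   # naive multiplicative joint default: 1e-4

    # Li's move: map each marginal onto a standard normal quantile, then
    # couple the two through a bivariate normal with correlation rho.
    z_a, z_b = norm.ppf(p_a), norm.ppf(p_b)
    joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    p_coupled = joint.cdf([z_a, z_b])

    print(f"independent: {p_independent:.1e}  coupled: {p_coupled:.1e}")
    # The coupled number comes out well over an order of magnitude higher.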

However, the underlying math of invoking hazard functions to deduce default risk has even more severe implications, and that concept existed before Li even tried to apply his own set of heuristics. Anyone who understands stationary processes knows that economics has a wicked time dependence, and that huge risks wait in the wings because hazard rates in the financial markets are neither static nor stationary.

So what we are seeing with the Wired article, and some other articles that preceded it, is an attempt to lay the blame on a single person, when we know that the actual blame needs to be placed on all the stooges who thought they could get away with squirreling away and deferring the consequences of ultimately risky strategies. We are seeing the anointment of Li as the "O-ring" failure mode of the financial sector. (The infamous O-ring of the first space shuttle disaster provided the convenient scapegoat for that whole affair, even though more endemic problems probably lay at the root of the explosion.)

Predicting failures in physical processes and actuarial settings is much more refined than applying the same concepts to financial instruments. Likewise, the math, probability, and statistics behind oil depletion are as sound as grade-school arithmetic compared to the assumptions that the "Wall Street" quantitative analysts have used to make money. And the soundness of the oil depletion analysis that I have done comes not from using a necessarily different kind of math or working out the physical characteristics, but from relying on limits and concrete, measurable parameters.

The blame put on Li is scapegoating of the highest order, but the wheels are in motion. All I can say is that it is probably better to put the blame on a Wall Street type than on the Community Reinvestment Act. The seemingly well-educated do indeed deserve most of the scorn, not the working stiffs or low-income people trying to make ends meet.



The idea of the CDSs behind many of the bond investments is to insure against bond defaults. Like good actuaries, the insurance analysts estimated the probabilities of failure/default and balanced those against the cost of "credit default swaps", i.e. an insurance agreement. The fact that clients could purchase an excess of these agreements means that they could actually make money should the default occur, in contrast to conventional insurance, where you can only guard against loss. So the effect of making money by the equivalent of "shorting" looms large.
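To put numbers on the "shorting" effect, here is a toy calculation; every figure is hypothetical:

    # Toy CDS arithmetic with made-up numbers: hold $10M of a bond but buy
    # protection on $50M of notional.
    notional_held = 10e6    # bond actually owned
    cds_notional  = 50e6    # protection purchased (5x over-insured)
    spread        = 0.02    # premium: 2% of notional per year
    years_paid    = 2
    recovery      = 0.40    # assumed recovery rate on default

    premiums_paid = cds_notional * spread * years_paid   # $2.0M
    bond_loss     = notional_held * (1 - recovery)       # $6.0M
    cds_payout    = cds_notional  * (1 - recovery)       # $30.0M

    net = cds_payout - bond_loss - premiums_paid         # +$22.0M if the bond defaults
    print(f"net gain on default: ${net / 1e6:.1f}M")

A conventional insurer would cap the protection at the $10M actually held; the excess $40M of notional is what turns the insurance into a short position.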

No matter, the premise that the bond actuaries could actually predict bond default times deluded everyone. Instead of assuming fixed probabilities for time-before-default (the survival time), I could have predicted (had I cared) that the modeled hazard functions would look nothing like a constant, stationary failure rate. They would all blow up as soon as resource depletion hit hard. This becomes a case of poor estimates of probabilities combined with the synchronicity of contrary economic events.

In the oil depletion analysis, I use dispersion on rates. Hazard functions act like rate functions as well, with a fixed Markovian probability of failure per unit of time defined in most cases. This produces the nominal damped exponential probability of survival for a bond (the bond dies, or euphemistically does not survive, when it defaults). Yet we dare not use dispersion on failure rates for these bonds; the failures disperse most obviously on time constants, not on rates. An economic crisis does not really behave like the random breakdown of a part with a slowly accumulating probability of failure over the part's lifetime. It can go bang at any time and the whole thing falls apart -- that makes for a much more realistic model.
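In symbols (standard reliability notation, nothing specific to Li's paper):

    S(t) = \exp( -\int_0^t \lambda(u)\, du )

which for a fixed Markovian hazard rate \lambda(u) = \lambda collapses to the damped exponential S(t) = e^{-\lambda t}. The non-stationarity complaint above amounts to saying that \lambda(u) for a bond is not a constant; it lurches upward when the economy turns.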

If we do the dispersion on time constants (a range in tau in the following figure), you can see that failures/defaults can occur much earlier than with a fixed tau (which leads to the damped exponential).
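A numerical sketch of that comparison, with an arbitrary illustrative spread of time constants (tau uniform on 1 to 19, versus a fixed tau of 10):

    # Dispersion on time constants versus a single fixed tau; the ranges
    # are illustrative only.
    import numpy as np

    t = np.linspace(0.0, 5.0, 501)          # time axis (arbitrary units)
    tau_fixed = 10.0
    taus = np.linspace(1.0, 19.0, 1000)     # dispersed taus, same mean of 10

    S_fixed     = np.exp(-t / tau_fixed)
    S_dispersed = np.exp(-t[:, None] / taus[None, :]).mean(axis=1)

    i = np.searchsorted(t, 1.0)             # cumulative failures by t = 1
    print(f"F(1), fixed tau:     {1 - S_fixed[i]:.3f}")       # about 0.10
    print(f"F(1), dispersed tau: {1 - S_dispersed[i]:.3f}")   # noticeably larger

The short-tau end of the spread dominates the early failures, which is exactly the early-default fattening the figure is meant to show.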



As Taleb noted, the amount of charlatanism in all this is incredible. I really don't know how these "quants" could even consider their analyses robust at all ... unless they wanted to snooker everyone and earn the payout from the CDSs and rolled-up CDOs when the bonds ultimately defaulted. It is entirely possible they had this in mind all along. Then it would all make sense. Otherwise, it makes economics and finance look more and more like a dismal science.



Then why don't economists write or analyze anything about very simple economic matters such as resource (e.g. oil, natural gas) depletion?

Resource depletion is simple and has nothing to do with belief systems or money-making opportunities; it is more like bean-counting. Yet they can't be bothered to confront one of the most challenging problems of our times, one with huge economic implications?

The religion of economics is based on infinite resources and a forever-expanding growth potential. Without that assumption, macroeconomic theory basically collapses on itself. And the economists (who ironically invented the sunk-cost model) do not want to see all their theoretical work go down the drain. This is indeed pseudo-science cloaked in the religious belief of maintaining tenure and subscribing to the comfort of group-think. In other words, don't veer too far off course and miss the chance for a Nobel prize. The fact that Wired even brought up the idea that David X. Li was potentially a candidate for an economics Nobel makes me ill.

Thursday, February 12, 2009

USA Field Size Distribution Update

I asked for some help in fleshing out some points on this reservoir field size distribution.

USA Field Size Distribution

A few days ago David N kindly sent me a copy of the Baker paper, and I transcribed some of the data points here.



The Baker paper collected statistics up to 1986 and consisted of data from about 14,000 fields. Robelius, working with more recent data, put the count at 34,500 fields.

From the data, it looks like many more of the smaller fields became developed between 1986 and now, and therefore got counted in the statistics. So over the last 20 years we have probably gained substantial mileage from the low-volume reservoirs, explained by either of these possibilities:
  1. Smaller fields get deferred for production due to economic reasons
  2. Smaller fields have a smaller cross-section for discovery and therefore show up later in the historical process. This second-order effect plays a smaller role in dispersive discovery than one would intuit -- i.e. it is not as if all the big fields get found first; instead the probability weighting has a slight bias toward bigger fields (the toy simulation at the end of this post illustrates the idea).

Since 1986, I eyeballed the breakpoint at around 20-million-barrel fields. But now, if we try to generate a new transition breakpoint for the next 20 years, it would appear at less than 1 million barrels (C=0.6), as suggested by the dispersive aggregation model. This gets us close to physical limits and a real point of diminishing returns (but you probably knew that already).
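To illustrate point 2 from the list above: a toy Monte Carlo with an entirely hypothetical lognormal field-size population and an assumed cross-section exponent. This is not the dispersive discovery model itself, just a demonstration that size-weighted sampling still leaves plenty of small fields to be found late:

    # Size-biased discovery order on a hypothetical field population.
    # The lognormal parameters and the exponent alpha are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    sizes = rng.lognormal(mean=1.0, sigma=2.0, size=14000)

    alpha = 0.3    # assumed size dependence of the discovery cross-section
    weights = sizes ** alpha
    order = rng.choice(len(sizes), size=len(sizes), replace=False,
                       p=weights / weights.sum())
    discovered = sizes[order]    # sizes listed in discovery order

    n = len(sizes) // 10
    print(f"median size, first 10% found: {np.median(discovered[:n]):.2f}")
    print(f"median size, last 10% found:  {np.median(discovered[-n:]):.2f}")
    # Early finds skew larger, but the bias is only a slight weighting:
    # small fields keep turning up from the very start.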
Since 1986, I eyeballed the breakpoint at around 20 million-barrel fields. But now, if we try to generate a new transition breakpoint for the next 20 years, it would appear at less than 1 MB (C=0.6) as suggested by the dispersive aggregation model. This gets us close to physical limits and a real point of diminishing returns (but you probably knew that already).