The Great Curve
This narrative describes the virtual flow of oil through the system from discovery to production, providing a problem-domain description of how an oil peak occurs. You can find the accompanying stochastic differential equations here.
Precondition: Oil sits in the ground, undiscovered.
Invariant: The amount of oil in each of its states cumulatively sums to a constant.
The states of the oil in the ground consist of fallow, construction, maturation, and extraction.
- Someone makes a discovery and provides an estimate of the quantity extractable.
- The oil sits in the fallow state until an oiler makes a decision or negotiates what to do with it; claims must be made, and so on. This time constant we call t1, and we assume it represents an average time with standard deviation equal to the average. This amounts to a maximum entropy estimate, made with the minimal knowledge available: given only a known mean, the maximum entropy wait-time distribution is the exponential, whose standard deviation equals its mean.
- A stochastic fraction of the discovery flows out of this fallow state with rate 1/t1. Once out of this state, it becomes ready for the next stage (and state).
- The oil next sits in the build state as the necessary rigs and platforms get constructed. This time constant we call t2, and it once again has an average time with standard deviation equal to the average.
- A stochastic fraction of the oil in the build state flows out with rate 1/t2. Once out of this state, it becomes ready for the next stage.
- The oil sits in the maturation state once the rigs get completed. We cannot achieve the maximum flow instantaneously as the necessary transport, pipelines, and other logistics are likely not 100% ready. This time constant we call t3.
- A stochastic fraction of the built rig's virtual oil flows out of the maturation state with rate 1/t3. Once out of this virtual state, it becomes ready for the next stage of sustained extraction.
- The oil sits in the ready-to-extract state once the oil well becomes mature.
- The oil starts getting pumped with stochastic extraction rate 1/t4. The amount extracted per unit time scales proportionally to the amount sitting in the ready-to-extract state, as sketched in the simulation after this list.
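The following is a minimal simulation sketch of this four-stage flow, assuming simple first-order (exponential) rate kinetics in each state; the time constants, time step, and unit discovery impulse are illustrative assumptions, not fitted values. It also checks the conservation invariant stated above.

```python
import numpy as np

# Hypothetical mean residence times (years) for the fallow, build,
# maturation, and extraction stages -- illustrative values only.
t1, t2, t3, t4 = 8.0, 5.0, 3.0, 10.0
dt, years = 0.1, 150.0
steps = int(years / dt)

fallow = np.zeros(steps)
build = np.zeros(steps)
mature = np.zeros(steps)
ready = np.zeros(steps)
production = np.zeros(steps)   # instantaneous extraction rate

fallow[0] = 1.0   # a single unit discovery enters the fallow state at t = 0

for i in range(steps - 1):
    # each reservoir drains at rate 1/t_k and feeds the next one
    f_out = fallow[i] / t1 * dt
    b_out = build[i] / t2 * dt
    m_out = mature[i] / t3 * dt
    r_out = ready[i] / t4 * dt          # extraction draws on the ready state
    fallow[i + 1] = fallow[i] - f_out
    build[i + 1] = build[i] + f_out - b_out
    mature[i + 1] = mature[i] + b_out - m_out
    ready[i + 1] = ready[i] + m_out - r_out
    production[i] = r_out / dt

# Invariant check: oil still sitting in the four states plus oil already
# extracted should sum back to the original unit discovery.
remaining = fallow[-1] + build[-1] + mature[-1] + ready[-1]
extracted = production[:-1].sum() * dt
assert abs(remaining + extracted - 1.0) < 1e-9
```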
We can consider each one of these states as a reservoir with a capacitive time lag associated with the time constant set for each stage. In stochastic terminology, the flow approximates a Markov process.
The extraction from the final stage gives the average production level. Since the process is linear and each transition depends only on the current state (the Markov property), we can apply an entire set of discoveries as forcing functions to this process flow, and the result reduces to a superposition of the individually forced solutions -- equivalently, a convolution of the discovery profile with the single-discovery impulse response.
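As a concrete sketch of that linearity argument: the single-discovery impulse response is the convolution of the four exponential stage responses, and the production from a whole discovery history is the convolution of that history with the impulse response. All numbers below are invented for illustration.

```python
import numpy as np

dt, steps = 0.1, 2000
t = np.arange(steps) * dt

def exp_response(tau):
    # normalized first-order (exponential) impulse response
    return np.exp(-t / tau) / tau

# chain the four stages: the single-discovery impulse response is the
# convolution of the four exponential stage responses
impulse = exp_response(8.0)                   # fallow, t1 = 8 (assumed)
for tau in (5.0, 3.0, 10.0):                  # build, maturation, extraction
    impulse = np.convolve(impulse, exp_response(tau))[:steps] * dt

# apply an entire discovery history as the forcing function; production
# is just the convolution of the two
discoveries = np.exp(-0.5 * ((t - 50.0) / 10.0) ** 2)   # invented profile
production = np.convolve(discoveries, impulse)[:steps] * dt
peak_shift = t[np.argmax(production)] - t[np.argmax(discoveries)]
```

With these assumed constants the production peak lands well after the discovery peak, which is exactly the shift discussed next.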
The final production profile over time approximates the classic Hubbert curve. This narrative explains in very basic terms how and why the peak gets shifted well away from the discovery peak. However, the derived curve shows no true symmetry, since causality rules out long tails extending back before the discovery.
Regarding the US crude oil production curves, Staniford was able to make a very good fit over a few orders of magnitude using a Gaussian. As for the temporal properties of this curve, Staniford noted that it satisfies:
dP/dt = K * (t0 - t) * P(t)

where t0 = PeakTime. This relationship reads that the production increase slows down over time linearly, but also scales by the amount in production at that time -- a linearly decreasing positive feedback turned into a linearly increasing negative feedback. At t = t0, the production increase turns into a production decrease. Unfortunately, I can't provide a problem-domain narrative for the Gaussian formulation. Like the logistic curve formulation, I get brain-freeze trying to shape it into intuitive terms.
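Integrating that relation does give P(t) = P(t0) * exp(-K * (t - t0)^2 / 2), which is exactly a Gaussian peaked at t0. A quick numerical check of this, with K and t0 chosen arbitrarily:

```python
import numpy as np

# Integrate dP/dt = K*(t0 - t)*P(t) by forward Euler and compare against
# the closed form P(t) = P(t0)*exp(-K*(t - t0)**2 / 2), a Gaussian
# peaked at t0. K and t0 are arbitrary illustrative values.
K, t0, dt = 0.02, 50.0, 0.001
t = np.arange(0.0, 100.0, dt)
gaussian = np.exp(-K * (t - t0) ** 2 / 2)   # closed form with P(t0) = 1

P = np.empty_like(t)
P[0] = gaussian[0]                          # start both curves together
for i in range(len(t) - 1):
    P[i + 1] = P[i] + K * (t0 - t[i]) * P[i] * dt

# growth flips to decline at t = t0, and the two curves agree up to
# the Euler discretization error
assert abs(t[np.argmax(P)] - t0) < 0.1
assert np.max(np.abs(P - gaussian)) < 0.05
```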
2 Comments:
Web:
I'm not sure there's much hope of us agreeing, since our scientific instincts are so different. But FWIW...
Firstly, I completely agree that we lack an adequate theoretical understanding of why the logistic/gaussian fits work, when they work. I view that as a problem to be solved, if we can.
I agree roughly with your narrative account of what happens in the development of an oil field. However, it's not clear to me that it leads to any practical model for prediction (though it can and has led to some insight). Each stage in your model requires a convolution function which there's no reason to think is stationary over time, and which could potentially have an arbitrary form. We don't have measurements of any of those convolution functions. So you make some simplifying assumptions, which may or may not be valid - we no more know that than why production curves in large regions are sometimes Gaussian. And then we are still left with a model with a large number of parameters that we can't measure. So our choices are to guess them or fit for them. Guessing is obviously useless. And in my experience, if one fits a very complex "soft-science" problem with a model with a lot of parameters, one can fit anything, but predict nothing.
I also don't agree with the Markov assumption. I note that this oil production system is coupled to a culture/economy, which gets to exert choices about all stages of the process based on the desired production at the end. That culture/economy appears to have fairly long-lived ideas about how fast the process should go. You seem to have a very strong bias to viewing the system as entirely a physical system which can be modeled by analogy to systems that are strictly physical like electronic circuits. But it's just not the case. Every one of your convolution functions is basically a measure of human decisions to do things like buy rigs, drill wells, etc. Only the decline rate of wells in production is a physical process, and even there, humans rework wells, etc.
I'm certainly not saying your work is worthless, and you shouldn't continue it. On the contrary, I think it's definitely worthwhile and interesting and I hope you do continue. But I don't think you have any basis for your frequent claims that this is infinitely superior to traditional approaches because your mathematics is somehow better. The mathematical complexity of a model has no necessary relationship to its scientific usefulness. What I would rather see you do is use this framework to try to help us figure out why the traditional approaches work, when they do work, and under what conditions it is and isn't safe to use them.
"Every one of your convolution functions is basically a measure of human decisions to do things like buy rigs, drill wells, etc."
You forget the fact that lots of little decisions lead to the macro phenomena.
Elsewhere you speak of scientific instinct. My instinct comes from years of work with statistical mechanics, which describes the result of many tiny decisions that work out on the whole to some average phenomenon. The analogies that I make with electronics are simply mathematical curiosities.
Finally, the traditional approaches do not hold up under the relentless criticism of Michael Lynch, et al.