[[ Check out my Wordpress blog Context/Earth for environmental and energy topics tied together in a semantic web framework ]]

## Thursday, July 07, 2005

### Part 1: A Macro Peak Oil Model

Last month I derived a Micro peak oil model that helped me understand the dynamics of extraction and production from a single reservoir. I noted that although it gave me a good intuitive feel for short time scales, it certainly wouldn't scale to the global level (i.e. the world Hubbert peak). Thinking about the a priori data we have available, I believe I can take a crack at such a global (or macro) level model. I make no attempt at getting all the details right; at this point, I want only to put my intuition on a mathematical footing, and then see how far I can mimic the gross features of the generic Hubbert curve.

## The Macro Model

Once again, I simplify the relationship between reserves and depletion rates by relying on a first-order approximation: the rate of extraction per time is directly proportional to the amount of oil left.
dP(t)/dt = -r * P(t)
Lacking any additional information, this becomes the naive estimator for how something depletes; it also appears in many other physical processes, including thermal conduction and particle diffusion. In general, the relationship implies a reduced extraction rate as the availability or density of a resource declines.

Of course, the first-order differential equation solves to a simple declining exponential.
P(t) = K * e^(-r*t)
Obviously that doesn't complete the story, as the exponential doesn't come close to approximating the asymmetrical bell curve of the Hubbert peak.
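As a quick sanity check (my own sketch, not from the original post; the starting reserve K = 1000 and rate r = 0.02 are arbitrary illustrative values), a few lines of Python integrate the depletion equation numerically and confirm it matches the closed-form exponential:

```python
import math

# Hypothetical parameters: initial reserve K and extraction rate r.
K = 1000.0   # arbitrary initial reserve
r = 0.02     # fraction extracted per unit time
dt = 0.001   # Euler time step

# Integrate dP/dt = -r * P forward in time with explicit Euler steps.
P = K
t = 0.0
while t < 50.0:
    P += -r * P * dt
    t += dt

# Compare against the closed-form solution P(t) = K * e^(-r*t).
closed_form = K * math.exp(-r * t)
print(P, closed_form)  # the two agree to within the Euler step error
```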

A temporal driving force applied to the exponential lets us recover a better symmetry mathematically. To achieve this, we use the a priori assumption that discoveries provide the stimulus for extraction. Historically, discoveries start at zero, reach some peak, and then decline over time. We have long since passed the peak in oil discoveries, so this gives us valid empirical data with which to model depletion.

Given that we have (1) a depletion rate model and (2) an empirical discovery model, we need to combine the two by driving the transfer function with a stimulus function. Mathematically, this key third step is handled by the convolution integral. With D(t) the discovery stimulus and r*e^(-r*t) the extraction response:

P(t) = Integral[0..t] D(tau) * r*e^(-r*(t - tau)) d(tau)

or, in discretized form:

P(j) = Sum[i=0..j] D(i) * r*e^(-r*(j - i))

The order of the functions doesn't matter; intuitively, the convolution acts as a kind of moving average applied over all points in time. I show the continuous variant of the convolution integral as well as the discretized version to illustrate how easily this can be computed.
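The discretized convolution amounts to nothing more than a double loop. A minimal Python version (my own illustration, mirroring the Convolve function in the GNAT listing further down) might look like:

```python
def convolve(a, b):
    """Discrete convolution: c[j] = sum over i of a[i] * b[j - i]."""
    n = len(a) + len(b) - 1
    c = [0.0] * n
    for j in range(n):
        for i in range(len(a)):
            if 0 <= j - i < len(b):
                c[j] += a[i] * b[j - i]
    return c

# Toy example: a box stimulus convolved with a decaying response.
print(convolve([1, 1, 1], [1.0, 0.5, 0.25]))
# → [1.0, 1.5, 1.75, 0.75, 0.25]
```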

At this point I don't want to use the empirical discovery function. Instead I use a simple triangular function, which serves as a heuristic and is easy to parameterize. The result is shown in the following graph:

Don't take the peak date too seriously. This profile is meant to show how easily the asymmetrical bell curve falls out of such a simple model. (The rising edge is actually quite difficult to distinguish from a Gaussian curve.)
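For readers without an Ada toolchain, the same computation can be sketched in Python (this is my own port of the algorithm in the GNAT listing below: a 130-point triangular discovery window convolved with a 300-point exponential extraction response at rate 0.02). The output peaks later than the discovery stimulus does, and the decline is visibly slower than the rise:

```python
import math

def triangle_window(n):
    # Triangular "discovery" stimulus: ramps up for n/2 steps, then back down.
    half = n // 2
    return [float(i) if i < half else float(n - i) for i in range(n)]

def exponential(n, alpha):
    # First-order extraction response: alpha * e^(-alpha * t).
    return [alpha * math.exp(-alpha * i) for i in range(n)]

def convolve(a, b):
    # Discrete convolution, sized to match the Ada listing (indices 0 .. len(a)+len(b)).
    total = len(a) + len(b)
    c = [0.0] * (total + 1)
    for j in range(total + 1):
        c[j] = sum(a[i] * b[j - i] for i in range(len(a)) if 0 <= j - i < len(b))
    return c

w = triangle_window(130)      # same window length as the Ada listing
e = exponential(300, 0.02)    # same alpha as the Ada listing
r = convolve(w, e)

peak = r.index(max(r))
print("discovery peak at index", w.index(max(w)), "; production peak at index", peak)
```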

Maybe somebody has done the analysis this way already. I don't know; I haven't found it in the literature yet. At the PeakOil.com message board, a few people have been playing around recently with Logistic equations, Riccati equations, and Verhulst equations, using non-linear estimators to come up with best fits to coefficients. Screw that nonsense. It bothers me that no one has developed some intuitive mathematics which clearly shows the general trend of the Hubbert peak.

I will follow up this post with a Part 2 to clear up any loose ends. In the meantime, I attach below some GNAT code suitable for compiling with the GNU GCC compiler. It generates comma-separated value output, which I pulled into the spreadsheet chart above. Otherwise, you can transcribe the algorithm from the code.
```ada
with Text_IO;
with Ada.Numerics.Elementary_Functions;

procedure Conv is

   type Flt is array (Natural range <>) of Float;

   --  Safe indexing: out-of-range reads return 0.0.
   function Get (Arr : Flt; I : Integer) return Float is
   begin
      if I < Arr'First or I > Arr'Last then
         return 0.0;
      else
         return Arr (I);
      end if;
   end;

   --  Discrete convolution of A and B.
   function Convolve (A, B : in Flt) return Flt is
      Total : constant Natural := A'Length + B'Length;
      C     : Flt (0 .. Total);
      V     : Float;
   begin
      for J in 0 .. Total loop
         V := 0.0;
         for I in 0 .. J loop
            V := V + Get (A, I) * Get (B, J - I);
         end loop;
         C (J) := V;
      end loop;
      return C;
   end;

   --  Triangular "discovery" stimulus of length L.
   function Triangle_Window (L : Natural) return Flt is
      R    : Flt (0 .. L - 1);
      Half : constant Natural := L / 2;
   begin
      for I in 0 .. L - 1 loop
         if I < Half then
            R (I) := Float (I);
         else
            R (I) := Float (L - I);
         end if;
      end loop;
      return R;
   end;

   --  First-order extraction response Alpha * e^(-Alpha * t).
   function Exponential (L : Natural; Alpha : Float) return Flt is
      use Ada.Numerics.Elementary_Functions;
      R : Flt (0 .. L - 1);
   begin
      for I in 0 .. L - 1 loop
         R (I) := Alpha * Exp (-Alpha * Float (I));
      end loop;
      return R;
   end;

   W    : constant Flt := Triangle_Window (130);
   E    : constant Flt := Exponential (300, 0.02);
   R    : constant Flt := Convolve (W, E);
   Year : constant Natural := 1900;

begin
   for I in R'Range loop
      Text_IO.Put_Line (Integer (Year + I)'Img & "," &
                        Get (W, I)'Img & "," &
                        R (I)'Img);
   end loop;
end;
```

Professor  SW said...

Very nice. It would be interesting to plot a family of curves that are generated as key variables are changed.

7:25 PM
Professor  WHT said...

Good idea. I will give it a try for Part 2.

9:34 PM
Professor  DarkSyde said...

Creationist Response: Isaac newton wus a fag!111!!!1one!! Leibniz was a Nazi!!!one!!!1!1
Calculous is jus a thery and more and more scientists are reejcting the dogma of calculos!!11!!one!!!

I pWN U!!!!11!11

5:56 AM
Professor  WHT said...

Dear creator: Don't be hatin' my formies.

9:19 AM
Professor  AtlasShrugs.com said...

Interesting stream going here..........

but you dare impugn my authenticity again and I will show no mercy.............can you dig it?

Go over to my site, I merely guest author at Trey's.

Oh yeah, you owe me an apology

Pamela aka "Atlas"
atlasshrugs.com

8:54 AM
Professor  WHT said...

I'm 99% sure that DarkSyde's comment refers to this:
http://www.pandagon.net/archives/2005/07/watch_how_quick.html

and to DS's experiences:
http://www.dailykos.com/story/2005/7/5/204739/7372

2:01 PM
Professor  monkeygrinder said...

Is that ADA code? Where is the entry point - I should be able to convert that to jscript or something that runs in a browser.

5:28 PM
Professor  WHT said...

If you have the Ada-aware GCC suite (most recent Linux distros do), copy the code into a file and name it "conv.adb" and then run "gnatmake" on it. The entry point is the procedure Conv.

Several years ago there was something called AdaJava. Good luck on converting to JavaScript. It may not be too hard if you can match the array abstraction; I just have an inordinate fear of debugging.

Legally speaking, it's GNAT/GCC specific and not 100% Ada because it uses a 'Img attribute.

7:25 PM

"Like strange bulldogs sniffing each other's butts, you could sense wariness from both sides"