
Wednesday, April 30, 2008

Scaling and the Dispersive Discovery growth function

The search growth function I use for the Dispersive Discovery model follows a T^6 time dependence. The derivation comes from a quadratic growth term applied along a single dimension of the search volume: each linear dimension grows as T^2, so when the quadratic term gets multiplied along the three dimensions of the volume, the (T^2)^3 = T^6 dependence results.

High-order growth terms such as T^6 have some similarity to exponential growth terms, since a particular order in the Taylor series expansion of the exponential dominates over any given interval. The following chart shows the cumulative dispersive discovery using T^6 plotted alongside an e^(kT) growth term inserted into the Dispersive Discovery equation. I normalized the two curves via an affine transformation so that they intersect at T=1.
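To make the comparison concrete, here is a minimal sketch of the chart's two curves. The dimensionless saturating form D(G) = G·(1 − e^(−1/G)), the rate k = ln 2 / 0.1, and the function name dispersive_discovery are illustrative assumptions for this sketch, not the exact construction behind the chart:

```python
import numpy as np

# Assumed dimensionless form of the cumulative Dispersive Discovery curve:
# search effort G sweeps a dispersed, finite volume and discovery saturates at 1.
def dispersive_discovery(G):
    return G * (1.0 - np.exp(-1.0 / G))

k = np.log(2) / 0.1             # rate giving the exponential a doubling time of 0.1
T = np.linspace(0.25, 3.0, 12)

G_poly = T**6                   # T^6 search growth
G_exp = np.exp(k * (T - 1.0))   # exponential growth, shifted so G_exp(1) = G_poly(1)

for t, d6, de in zip(T, dispersive_discovery(G_poly), dispersive_discovery(G_exp)):
    print(f"T={t:4.2f}  D_T6={d6:.4f}  D_exp={de:.4f}")
```

With this normalization both curves pass through the same point at T=1, and the printed values show the exponential variant saturating noticeably faster beyond it.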

Note that the doubling time for the exponential is about 10% of T at T=1, which roughly coincides with the doubling time for the T^6 growth.
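To check that claim: the increment dT that doubles T^6 satisfies (T + dT)^6 = 2·T^6, giving dT/T = 2^(1/6) − 1 ≈ 0.12, indeed in the same ballpark as the exponential's 10%. A quick verification of the arithmetic:

```python
# Solve (T + dT)**6 == 2 * T**6 for dT: the ratio dT/T is a constant.
ratio = 2**(1 / 6) - 1
print(f"dT/T = {ratio:.4f}")              # ~0.1225, i.e. about 12% of T
assert abs((1 + ratio)**6 - 2) < 1e-12    # sanity check
```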

For world crude oil discoveries, the T=1 time point scales to approximately 100 years (the period from 1858 to the early 1960's, when we observed a global discovery peak). This means the discovery growth doubling time equated to roughly 10 years in historical terms -- provided you accept that the Dispersive Discovery model applies. If you look closely at the two curves beyond T=1, the exponential reaches the asymptote much more quickly than the T^6 growth curve. This makes perfect sense: the higher-order polynomial terms in the Taylor expansion of the exponential take over and push toward the asymptote more quickly, thus minimizing the effects of dispersion.

Some might find the exponential growth model more understandable or intuitive, as it emulates technological advances such as those described by Moore's law (which posits a doubling of microprocessor speed every two years), or approximates population growth and the demand and acceleration in prospecting effort that this implies.

Whether the exponential growth actually provides a more realistic picture of the dynamics, I can't say; I do know for certain that it requires a much stronger growth stimulus -- implying that a doubling of search effort must occur every 10 years for the foreseeable future. On the other hand, a high-order function such as T^6, though it continues to accelerate, shows progressively longer doubling periods as T increases.
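A quick numeric illustration of that contrast, reusing the same assumed rate k = ln 2 / 0.1 that matches the two curves at T=1: the T^6 doubling period stretches linearly with T, while the exponential's stays fixed.

```python
import math

k = math.log(2) / 0.1               # assumed exponential rate from the chart match
for T in (1.0, 2.0, 4.0, 8.0):
    dt_poly = (2**(1 / 6) - 1) * T  # (T + dt)^6 = 2*T^6: doubling period grows with T
    dt_exp = math.log(2) / k        # doubling period of e^(kT): constant 0.1
    print(f"T={T:3.1f}  T^6 doubles in {dt_poly:.2f}  exp doubles in {dt_exp:.2f}")
```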

We know that Moore's Law has recently shown signs of abating. This could result from an abatement of technological progress as researchers start to give up on scaling techniques [1], which in the past guaranteed speed increases as long as the research fabs could continue to reduce circuit dimensions. Or it could stem from a hard limit on the scaling itself, due to parasitics and losses as the electrical properties run up against quantum limits. I have a feeling that something similar to a "dispersive discovery" in research growth advances will allow Moore's Law to continue to limp along, as researchers continue to find niches and corners in the ultimately constrained and finite "volume" of semiconductor combinations available to us.

So what happens to oil prospecting effort as we start hitting the walls remains unknown. We may want to pay close attention to how Moore's Law shakes out, partly out of curiosity and partly to see how a "smooth landing" plays out in that technology area [2]. In any event, it will definitely pay to use the exponential growth model in conjunction with the T^6 growth term: the two complementary cumulative dispersive discovery curves don't show a significant difference, which demonstrates that the underlying model possesses a certain robustness to parametric variation. In particular, the exponential provides a good way of calculating differential margins should we want to assume a stronger post-peak discovery search pressure.



[1] Years ago, I sat in an office adjacent to Robert Dennard, a really nice guy by the way. The scaling theory that he formulated, along with his invention of DRAM, had a lot to do with the accuracy of Gordon Moore's predictions. I would find it fascinating to get Dennard's opinion (or Moore's, for that matter) on how the Dispersive Discovery "scaling" theory could apply in a macro sense. I bet they would both admit that the endless doubling cannot continue indefinitely, in classical semiconductor scaling and likely in oil discoveries as well.

[2] The key area of research interest looks like a focus on multi-threading and concurrent functionality. Building more parallelism into microprocessors allows them to continue on an upward performance path, even though the speed improvement turns into a "virtual" or ephemeral achievement. And that assumes we can get our arms around creating algorithms that take advantage of multi-threading -- not the easiest idioms to tame with the formal techniques programmers would prefer to work with. But some researchers do have grand hopes; in an EE Times article titled "Berkeley researcher describes parallel path", one professor thinks he has discovered a route to 100% energy savings along this path:
Energy and the environmental issues are also driving work in ubiquitous computing, said S. Shankar Sastry, dean of engineering at Berkeley.

"We need to think about a building OS that handles all the heating and cooling systems and controls elevators," he said, describing work that could make these large energy consumers into generators. "We need to create buildings that not only consume zero net energy but have zero net cost," he added.
Incredible.


Next: Stay tuned for a final skewering of the Logistic production model.



"Like strange bulldogs sniffing each other's butts, you could sense wariness from both sides"