Yesterday's post was rather rushed, as I was about to get on an airplane. Once on board, it occurred to me that there's another book I've read in which the difference between log-normal and power-law distributions is key: The Misbehavior of Markets by Benoit Mandelbrot and Richard Hudson. They argue that modeling markets with normal distributions (of price changes, for example) is erroneous and leads economists to underestimate the likelihood of extreme events (like Black Monday in October 1987).
Put another way, small price changes occur very frequently and large price changes very infrequently. If you plot these on a log-log scale, you get a straight line out to some distance. Conventional finance assumes the data fit a log-normal curve, which eventually falls off. Mandelbrot argues that historical price data fit a fractal, or scale-free, model, i.e. follow a power law, meaning the likelihood of extreme events has been consistently underestimated.
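To make the tail difference concrete, here's a minimal sketch (my own illustration, not from the book) comparing the survival function P(X > x) of a log-normal distribution with that of a Pareto (power-law) distribution. The parameters (mu, sigma, alpha, x_min) are arbitrary choices for illustration; the point is just that far out in the tail, the power law assigns vastly more probability to extreme events.

```python
import math

def lognormal_tail(x, mu=0.0, sigma=1.0):
    """P(X > x) for a log-normal distribution with parameters mu, sigma."""
    return 0.5 * math.erfc((math.log(x) - mu) / (sigma * math.sqrt(2)))

def pareto_tail(x, x_min=1.0, alpha=1.5):
    """P(X > x) for a Pareto (power-law) distribution, valid for x >= x_min."""
    return (x_min / x) ** alpha

# Near the center the two look similar; far out in the tail they diverge wildly.
for x in (2, 10, 100):
    print(x, lognormal_tail(x), pareto_tail(x))
```

At x = 100 the power-law tail probability is hundreds of times larger than the log-normal one, which is exactly the "underestimated extreme events" point in a nutshell.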
Those of you in high tech in the US may have followed the accounting change that requires stock options to be expensed by the company issuing them. To do this, one must be able to estimate the present value of the options at the time they are granted. This is done using the Black-Scholes formula. While its authors received the 1997 Nobel prize in economics (Scholes and Merton; Black had died in 1995) and the Black-Scholes formula is now written into accounting practice, Mandelbrot argues it consistently underestimates infrequent cases of extreme price volatility.
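For reference, here is the standard Black-Scholes formula for a European call, sketched with illustrative numbers of my own choosing. The assumption Mandelbrot attacks is baked into sigma: the model treats (log) price changes as normally distributed with constant volatility, which is precisely the thin-tailed assumption above.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualized volatility (the normality assumption).
    """
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Example: an at-the-money option, 20% volatility, one year to expiry.
print(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2))
```

If real price changes follow a power law instead, the normal tails in norm_cdf understate the chance of the big moves the option protects against, so the formula prices them too cheaply.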
As an interesting aside, Long-Term Capital Management (LTCM), a hedge fund with Scholes and Merton on its board, collapsed due to a series of correlated repricing events that were extremely unlikely, but did occur in the late summer of 1998. LTCM lost $4.6B.
Power laws and log-normal distributions appear in many disciplines, not just networking and markets. In many cases it's hard to determine which applies. Whether or not Mandelbrot is correct about power laws and finance, at least we can agree -- $4.6B is big bucks!
Interesting thread. I don't know too much about the maths here, but don't the wild fluctuations have to hit some kind of tolerance line, and isn't that line statutory?
Posted by: PaulSweeney | August 19, 2006 at 08:59 AM
I'm trying to imagine how to apply these insights to our little world of telecom.
Maybe the defining "pricing" event in telecom operations is congestion. At the moment we pretend to the users it doesn't exist, and refuse to build pricing mechanisms that present the real opportunity cost to the user of displacing someone else's packet from the queue. So maybe the equivalent "correlated extreme" is a 9/11-Madrid-London-Mumbai event where the network collapses due to extreme congestion spikes that don't unfold at all along the patterns of everyday use. Perhaps in "emergency mode" we just make calls and messages cost 10x or 100x their normal price.
Trying to apply it to slow phenomena like network buildout leaves me scratching my head though.
Posted by: Martin Geddes | August 19, 2006 at 04:32 PM
I like Martin's analogy of "capacity" and "fluctuations". Of course with fibre the answer just seems to be "throw more at it!" because customers respond really, really well to fixed-rate, predictable pricing models. The entire conversation reminds me of my time in the automotive industry, where the term "stacked specs" was well understood. Perhaps all models have an element of "stacked assumption" about them?
Posted by: PaulSweeney | August 21, 2006 at 12:47 PM
Paul - The economist John Maynard Keynes is said to have warned investors that although markets do tend toward rational positions in the long run, "the market can stay irrational longer than you can stay solvent." LTCM was a hedge fund that used multiple trading strategies, depending on them not being correlated and thus balancing each other out. In the summer of 1998, the markets they played in all moved in the same direction for longer than their capital lasted.
Martin - There is some evidence that Internet traffic is fractal, i.e. scale-free, i.e. follows a power law. And there are other phenomena, like website linking patterns, and websites ranked by page views, that follow power laws. But I haven't thought of a way to couple this to telecom pricing or fiber in the access network. :-)
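If you wanted to check whether a data set (link counts, page views, packet sizes) is plausibly power-law, a common first step is the maximum-likelihood estimate of the exponent. Here's a minimal sketch, with synthetic Pareto samples standing in for real traffic measurements (the sample size and exponent are arbitrary choices of mine):

```python
import math
import random

def powerlaw_alpha_mle(xs, x_min):
    """Maximum-likelihood estimate of alpha for a power law p(x) ~ x^(-alpha),
    computed over the samples with x >= x_min (the Hill estimator)."""
    tail = [x for x in xs if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic "traffic" data: draw from a power law with alpha = 2.5 via
# inverse-transform sampling, then check the estimator recovers it.
random.seed(1)
true_alpha, x_min = 2.5, 1.0
samples = [x_min * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
           for _ in range(10_000)]
print(powerlaw_alpha_mle(samples, x_min))  # close to 2.5
```

On a log-log rank plot a power law looks like a straight line, but so does the body of a log-normal, which is why this kind of estimate on the tail is more convincing than eyeballing the plot.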
Paul - Never having been in the auto industry, I have to ask: does "stacked specs" refer to a build-up of tolerances, or what?
Posted by: brough | August 21, 2006 at 01:16 PM
Stacked specs. Each part in a module was tested within its own tolerances before being added to the module, where it was tested again at the module level. With more and more sub-modules, and with modules coming from different companies, a "little fault line" can emerge where, given a particular combination of circumstances, there could be a failure.
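A quick Monte Carlo sketch of the idea (toy numbers of my own, not real automotive specs): every part individually passes its own tolerance check, yet the deviations occasionally stack into an out-of-spec assembly.

```python
import random

random.seed(42)

PART_TOL = 1.0       # each part's deviation stays within +/- 1.0 (passes its own spec)
ASSEMBLY_TOL = 3.0   # the assembled module must stay within +/- 3.0
N_PARTS = 5
N_TRIALS = 100_000

failures = 0
for _ in range(N_TRIALS):
    # Every part individually passes inspection...
    deviations = [random.uniform(-PART_TOL, PART_TOL) for _ in range(N_PARTS)]
    # ...but the deviations stack, and the sum occasionally exceeds the assembly spec.
    if abs(sum(deviations)) > ASSEMBLY_TOL:
        failures += 1

print(failures / N_TRIALS)  # small, but not zero
```

Which is Paul's point about probability below: rare per assembly, but with enough units shipped it happens to somebody.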
The problem with probability is that, given an adequately large number of trials, it will happen to somebody. The modern implication of this in a digital world is that everyone hears about it. If that "failure" in the "social network" gathers enough "storm", it could close that network down. With an open "edge network", the network will be used in ways and manners that you (the designer) had not intended. That was your strategy in the first place. It might be interesting to think of this in terms of classical system dynamics (Forrester et al.); I don't think many people have brought that line of thought to this debate.
Posted by: PaulSweeney | August 23, 2006 at 09:02 AM