August 03, 2006
Hot Hot Networks
As the current summer heatwave takes its small toll on the internet, it's pushed a couple of hidden issues a little closer to the surface. One is just how fragile the internet is (and is not), and the other just how god damn hot and power hungry it is.
In Albert-László Barabási's Linked he demonstrates that the internet forms what he calls a scale-free network. One of the interesting characteristics of a scale-free network is that it is virtually indestructible as a communications medium. No matter how many nodes you take out, big ones, small ones, whatever, in his models at least you can still communicate across the network. You can slow the network down incredibly by taking out key nodes, but you cannot destroy it as a communications tool.
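You can see a toy version of Barabási's point in a few lines of Python (a rough sketch only; his actual models and graph-theoretic analysis are far more sophisticated than this). Grow a network by preferential attachment, where new nodes link to well-connected hubs more often, then knock out a random chunk of nodes and see how much of what survives can still reach each other:

```python
# Toy scale-free network experiment: grow by preferential attachment,
# remove random nodes, measure how much stays mutually reachable.
# (Illustrative sketch only, not Barabási's actual model.)
import random

random.seed(1)

def grow_scale_free(n, m=2):
    """Preferential attachment: each new node attaches m edges,
    favoring already well-connected nodes (the hubs)."""
    edges = {(0, 1)}
    # 'targets' holds one entry per edge endpoint, so a uniform pick
    # from it is a pick proportional to node degree.
    targets = [0, 1]
    for new in range(2, n):
        picked = set()
        while len(picked) < min(m, new):
            picked.add(random.choice(targets))
        for t in picked:
            edges.add((min(new, t), max(new, t)))
            targets.extend([new, t])
    return n, edges

def largest_component_fraction(n, edges, removed):
    """Fraction of surviving nodes in the largest connected piece."""
    alive = set(range(n)) - removed
    adj = {v: [] for v in alive}
    for a, b in edges:
        if a in alive and b in alive:
            adj[a].append(b)
            adj[b].append(a)
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:               # depth-first walk of one component
            v = stack.pop()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best / len(alive)

n, edges = grow_scale_free(1000)
removed = set(random.sample(range(n), 300))  # random failures: 30% of nodes
print(f"{largest_component_fraction(n, edges, removed):.0%} still connected")
```

Run it and the vast bulk of the surviving nodes stay in one connected piece, despite losing nearly a third of the network to random failures. Target the hubs instead of random nodes and the picture gets much uglier, which is the flip side of Barabási's result.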
But that is the internet as this big thing, this mass that Barabási studied and then modeled via graph theory. That is not the same thing as what we can call your "personal internet". Your personal internet is your email, your blog, your online bookmarks, your MySpace page, and the like. The precise contents vary from person to person, maybe X relies on MyYahoo and Y on YayHooray. But regardless of the specifics your personal internet is a handful of sites, and they are not nearly as resilient as some hypothetical scale-free entity. Just taking out your email for a few hours can pretty much destroy your internet communications, while having your site or main news source go down can leave you hanging half off the network. Or as someone on one of my mailing lists tossed out: "And of course if both Dreamhost and GMail go down at the same time, then that means it's time to go onto the roof and have a beer." Which of course is a great idea, but how many days do you want to stay on the roof, and how much beer do you have stocked up?
The second issue is in a way about Moore's Law, you know, the one about computer speeds doubling every 18 months or two years. I've had a minor disagreement with Art Kleiner over the past couple of years on just how seriously to take Moore's Law. Being highly skeptical of any attempt to predict the future with the accuracy of a law, I don't think it can be taken for granted. In fact, at least anecdotally, it seems to be slowing down now. I just replaced a four-year-old laptop, and I went from a single 1.9GHz chip to a dual 2GHz, which is barely double in four years. Part of this is a technological issue, and part is an issue of demand: speed just is not as important a selling point as it once was, and hence there is less drive to push it at the pace Moore's law demands. Art's point is that you need to factor in price, that Moore's law is better seen as a doubling of processing speed per dollar rather than a raw doubling, and it's a good point.
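The back-of-envelope math on that laptop anecdote is easy to check (counting the dual-core chip generously, as if two cores meant double the speed):

```python
# How much speedup should 4 years bring under an 18-month-doubling
# reading of Moore's law, versus what the laptop upgrade delivered?
# (Generous accounting: dual 2GHz counted as a flat 4GHz.)
months = 4 * 12
expected = 2 ** (months / 18)   # doublings over 48 months: ~6.3x
observed = (2 * 2.0) / 1.9      # dual 2GHz vs single 1.9GHz: ~2.1x
print(f"expected ~{expected:.1f}x, observed ~{observed:.1f}x")
```

So even with the most charitable count, the upgrade came in at about a third of what a strict 18-month doubling would predict, which is the gap the price-per-dollar reading is meant to close.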
But what neither of us had factored in is the cost of computing in terms of how much power it consumes and how much heat it generates. Faster chips run hotter and suck in more electricity. While chip designs are getting more efficient, at least at times, the overall trend still seems to be towards more power consumption and more hot hot hot servers and laptops. Unless this gets reined in, Moore's law might just run smack into a brick wall of rising power costs. It might still be possible to double the processing power per dollar of microchip every couple of years, but it might not be worth the added cost on your electric bill...
"Chips will be as hot as nuclear reactors by the end of the decade... and as hot as the surface of the sun by 2015, if they continue on their current design path, according to Pat Gelsinger, Intel's architecture chief."
"The nub of his presentation is this. Moore's law will remain true for the next ten years: but the thermal properties and power requirements of future chips conforming to Moore's rule of thumb will make them unsellable, unless design approaches are radically revised.
"'No one,' as Gelsinger puts it, 'wants to carry a nuclear reactor in their laptop onto a plane.'"

Posted by Abe at August 3, 2006 12:35 AM