It seems that I'm not the only one who has been thinking about the energy costs of digital information.
All the big computing and dot-com companies are concerned about their energy bills. Sun even hosted a two-day conference on the issue recently.
"The problem arises from the large amounts of electric power needed to cool the tens of thousands of microprocessors at today's data centers.
Indeed, the cost of electricity to cool these server farms account for about half of the power bill of these centers, said Jon Koomey, a consulting professor at Stanford University's civil engineering department"
Like the utility guy quoted in the report, I can't see the technology industry curbing its thirst for energy any time soon, though with the impending world oil shortage, its attention will be drawn more and more to rapidly rising energy bills. The laws of thermodynamics put a floor under how far efficiency can improve, so more efficient technologies ultimately won't solve the problem of increasing consumption, though they can slow the rate of deterioration. So in the short to medium term these companies need to be looking at significantly less energy-intensive computing architectures, as does the rest of the commercial world that has bought into Bill Gates' "PC on every desk" vision.