North American Network Operators Group


Re: Energy consumption vs % utilization?

  • From: Deepak Jain
  • Date: Tue Oct 26 14:42:18 2004


> Actually I think nobody does calculate "real" utilization,
> as there are a lot of soft factors to be taken into account.
Electrical usage for a datacenter is pretty consistent throughout a month, even when measured day by day. The utilization of the systems inside it is anything but consistent... even during boot-up it would be nearly impossible to determine the instantaneous power draw required.

Separately, deploying applications to clusters of machines where the cluster is dynamically resized [more machines are turned on/off] depending on load is a non-trivial exercise, and it is outside the operational experience/need of most customers.
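
As a purely illustrative sketch of what I mean by load-driven resizing (the thresholds, node limits, and function name here are all made up for illustration, not anything off the shelf):

def target_node_count(avg_utilization, active_nodes, min_nodes=2, max_nodes=16,
                      high_water=0.75, low_water=0.30):
    """Suggest how many machines to keep powered on next interval, given the
    average CPU utilization (0.0-1.0) across the currently active nodes."""
    if avg_utilization > high_water and active_nodes < max_nodes:
        return active_nodes + 1      # add a machine before we saturate
    if avg_utilization < low_water and active_nodes > min_nodes:
        return active_nodes - 1      # power one down to save energy
    return active_nodes              # stay put inside the comfort band

# Example: 8 nodes averaging 80% utilization -> suggest powering on a 9th
print(target_node_count(0.80, 8))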

But even assuming you could do that, the best approximation I could imagine for an Internet data center would be something akin to its network traffic graph [the assumption being that network load amongst a stable set of customers is proportional to the processing power required to produce it... even if an individual customer uses much more CPU power to do that in a specific time quantum]. Basically, if you use 1Mb/s at noon on Monday, and 1.2Mb/s at noon on Tuesday with the same customer set, you can probably estimate that your system's load is 20% higher than it was on Monday, assuming you aren't operating at either the very low extreme or the very high extreme. At least that would be my thought.
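
Back-of-the-envelope, that estimate works out like this (my own sketch in Python, using the same example numbers as above):

def relative_load_change(traffic_day1_mbps, traffic_day2_mbps):
    """Fractional change in inferred system load between two traffic samples
    taken at the same time of day, assuming traffic tracks processing load."""
    return (traffic_day2_mbps - traffic_day1_mbps) / traffic_day1_mbps

# 1 Mb/s at noon Monday vs 1.2 Mb/s at noon Tuesday -> 0.20, i.e. ~20% higher
print(relative_load_change(1.0, 1.2))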

If all applications were designed to be virtualized, a la mainframe style, this clustering concept might work to dynamically redeploy resources... However, the mainframes themselves do not have smooth-stepped power/CPU curves, so it's probably a dead issue in that regard.

Deepak Jain
AiNET