[74987] in North American Network Operators' Group


Re: Energy consumption vs % utilization?

daemon@ATHENA.MIT.EDU (Petri Helenius)
Tue Oct 26 16:21:28 2004

Date: Tue, 26 Oct 2004 23:19:06 +0300
From: Petri Helenius <pete@he.iki.fi>
To: Alex Rubenstein <alex@nac.net>
Cc: "Gregory (Grisha) Trubetskoy" <grisha@ispol.com>, nanog@merit.edu
In-Reply-To: <Pine.WNT.4.61.0410261429110.3340@vanadium.hq.nac.net>
Errors-To: owner-nanog-outgoing@merit.edu


Alex Rubenstein wrote:

>
>> I'm looking for information on energy consumption vs percent 
>> utilization. In other words if your datacenter consumes 720 MWh per 
>> month, yet on average your servers are 98% underutilized, you are 
>> wasting a lot of energy (a hot topic these days). Does anyone here 
>> have any real data on this?
>
>
> I've never done a study on power used vs. CPU utilization, but my 
> guess is that the heat generated from a PC remains fairly constant -- 
> in the grand scheme of things -- no matter what your utilization is.

You should be able to pick up a simple current/wattage meter from a local 
hardware store for $20 or so. It will tell you that on a modern 
dual-CPU machine the power consumption with the CPUs idle is about 60% of peak; 
the rest goes to drives, fans, RAM, etc. In wattage terms the 
difference is 100-120 W (50-60 W per CPU).
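
For a rough sense of what those numbers imply, here is a small back-of-the-envelope 
sketch (not from the thread; the 300 W peak figure and the linear interpolation 
between idle and peak are illustrative assumptions, the 60% idle fraction and 2% 
utilization are the figures quoted above):

    # Illustrative estimate of per-server energy use from the idle-vs-peak figures.
    # All constants are assumptions for the sake of the example.
    peak_watts_per_server = 300.0   # assumed peak draw of a dual-CPU server
    idle_fraction_of_peak = 0.60    # "power consumption at idle ... about 60% of peak"
    avg_cpu_utilization   = 0.02    # "98% underutilized" from the original question

    # Approximate average draw by interpolating linearly between idle and peak.
    avg_watts = peak_watts_per_server * (
        idle_fraction_of_peak + (1.0 - idle_fraction_of_peak) * avg_cpu_utilization
    )

    hours_per_month = 24 * 30
    kwh_per_server_month = avg_watts * hours_per_month / 1000.0

    print(f"average draw: {avg_watts:.0f} W")                       # ~182 W
    print(f"energy per server per month: {kwh_per_server_month:.0f} kWh")  # ~131 kWh

Under these assumptions a mostly-idle server still burns roughly 60% of its peak 
energy, so the 100-120 W idle/peak gap is about the most you could recover per box 
by powering CPUs down or consolidating load.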

All modern operating systems (the BSDs, Linux, Mac OS X, Windows XP, etc.) do a 
moderate job of saving CPU wattage when the CPU is idle.

Pete


