[74980] in North American Network Operators' Group


Re: Energy consumption vs % utilization?

daemon@ATHENA.MIT.EDU (Alex Rubenstein)
Tue Oct 26 15:01:49 2004

Date: Tue, 26 Oct 2004 14:31:20 -0400 (Eastern Daylight Time)
From: Alex Rubenstein <alex@nac.net>
To: "Gregory (Grisha) Trubetskoy" <grisha@ispol.com>
Cc: nanog@merit.edu
In-Reply-To: <20041026133008.M1223@onyx.ispol.com>
Errors-To: owner-nanog-outgoing@merit.edu



Hello,

I've done quite a bit of studying of power usage and such in datacenters 
over the last year or so.

> I'm looking for information on energy consumption vs percent utilization. In 
> other words if your datacenter consumes 720 MWh per month, yet on average 
> your servers are 98% underutilized, you are wasting a lot of energy (a hot 
> topic these days). Does anyone here have any real data on this?

I've never done a study on power used vs. CPU utilization, but my guess is 
that the heat generated from a PC remains fairly constant -- in the grand 
scheme of things -- no matter what your utilization is.

I say this because, whether a CPU is idle or 100% utilized, it is still 
grossly inefficient -- on the order of less than 10% efficient in all 
cases (i.e., 1 watt in returns at least .9 watts of heat, no matter the 
loading of the CPU).
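To make the consequence of that concrete: if a server's power draw is 
roughly flat regardless of load (the assumption above), then the energy 
cost per hour of *useful* work scales inversely with utilization. A 
back-of-the-envelope sketch, with illustrative (not measured) numbers:

```python
# Sketch of the flat-draw argument. All figures are assumptions for
# illustration; nothing here is measured data.

IDLE_WATTS = 200.0       # assumed constant draw, idle or busy
HOURS_PER_MONTH = 720    # ~30 days

def monthly_kwh(watts=IDLE_WATTS, hours=HOURS_PER_MONTH):
    """Energy drawn over a month, in kWh -- the same whether the
    box is loafing or pegged, under the flat-draw assumption."""
    return watts * hours / 1000.0

def kwh_per_useful_hour(utilization):
    """Energy cost per hour of actual work, if draw is flat."""
    return monthly_kwh() / (HOURS_PER_MONTH * utilization)

print(monthly_kwh())              # 144.0 kWh/month either way
print(kwh_per_useful_hour(1.0))   # 0.2 kWh per useful hour at 100% load
print(kwh_per_useful_hour(0.02))  # 10.0 kWh per useful hour at 2% load
```

So at 2% utilization you pay roughly 50x the energy per unit of work, 
not because the box draws more, but because the draw doesn't drop when 
the work does.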


-- Alex Rubenstein, AR97, K2AHR, alex@nac.net, latency, Al Reuben --
--    Net Access Corporation, 800-NET-ME-36, http://www.nac.net   --


