[74985] in North American Network Operators' Group


Re: Energy consumption vs % utilization?

daemon@ATHENA.MIT.EDU (Steven M. Bellovin)
Tue Oct 26 16:02:36 2004

From: "Steven M. Bellovin" <smb@research.att.com>
To: Alex Rubenstein <alex@nac.net>
Cc: "Gregory (Grisha) Trubetskoy" <grisha@ispol.com>, nanog@merit.edu
In-Reply-To: Your message of "Tue, 26 Oct 2004 14:31:20 EDT."
             <Pine.WNT.4.61.0410261429110.3340@vanadium.hq.nac.net> 
Date: Tue, 26 Oct 2004 16:01:24 -0400
Errors-To: owner-nanog-outgoing@merit.edu


In message <Pine.WNT.4.61.0410261429110.3340@vanadium.hq.nac.net>, Alex Rubenstein writes:
>
>
>Hello,
>
>I've done quite a bit of studying of power usage and such in datacenters
>over the last year or so.
>
>> I'm looking for information on energy consumption vs percent utilization. In
>> other words, if your datacenter consumes 720 MWh per month, yet on average
>> your servers are 98% underutilized, you are wasting a lot of energy (a hot
>> topic these days). Does anyone here have any real data on this?
>
>I've never done a study on power used vs. CPU utilization, but my guess is 
>that the heat generated from a PC remains fairly constant -- in the grand 
>scheme of things -- no matter what your utilization is.
>

I doubt that very much, or we wouldn't have variable-speed fans.  I've 
monitored CPU temperature when doing compilations; it goes up 
significantly.  That suggests that the CPU is drawing more power at 
such times.
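
(A minimal sketch of that kind of monitoring on a Linux box, assuming the
common sysfs thermal interface at /sys/class/thermal is present -- the zone
path and units vary by kernel and hardware, so treat the path as an
assumption:)

```python
import os
import time

# Common Linux sysfs location; an assumption -- may differ per machine.
THERMAL = "/sys/class/thermal/thermal_zone0/temp"

def millideg_to_c(raw: str) -> float:
    """sysfs reports temperature in millidegrees Celsius."""
    return int(raw.strip()) / 1000.0

def sample(path: str = THERMAL, n: int = 5, interval: float = 1.0) -> None:
    """Print n temperature samples, interval seconds apart.

    Skips gracefully if the sysfs interface is absent.
    """
    if not os.path.exists(path):
        print("no sysfs thermal zone found at", path)
        return
    for _ in range(n):
        with open(path) as f:
            print(f"{millideg_to_c(f.read()):.1f} C")
        time.sleep(interval)
```

Run sample() once idle and once during a compilation and compare the
readings.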

Of course, there's another implication -- if the CPU isn't drawing the 
power, the load on the supply line is lower, and that much less 
electricity is being used.
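
(A rough back-of-envelope for that saving, using hypothetical wattage
figures -- the 250 W load and 150 W idle numbers below are illustrative
assumptions, not measurements:)

```python
P_LOAD_W = 250.0        # assumed full-load draw per server (watts)
P_IDLE_W = 150.0        # assumed idle draw per server (watts)
HOURS_PER_MONTH = 720   # 30-day month

def monthly_kwh(watts: float, hours: float = HOURS_PER_MONTH) -> float:
    """Energy in kilowatt-hours for a constant draw over the given hours."""
    return watts * hours / 1000.0

# Difference between a server that idles and one pinned at full load.
saved = monthly_kwh(P_LOAD_W) - monthly_kwh(P_IDLE_W)
print(f"Energy saved per mostly-idle server: {saved:.0f} kWh/month")
# -> Energy saved per mostly-idle server: 72 kWh/month
```

Note the coincidence that 720 MWh/month divided by the 720 hours in a
30-day month is an average draw of exactly 1 MW.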

		--Steve Bellovin, http://www.research.att.com/~smb


