PC Electricity Consumption

With electricity quickly approaching $2 per watt per year, leaving a computer powered on around the clock is an expensive proposition. I found that by powering down my PC when idle, I could save over $100 per year.

Here's how to accurately estimate how much your computers cost to operate.

Calculating Costs

To calculate the cost, I first worked out my total cost per watt-year. With my recent electric bill in hand, I used the following formula:

$ per watt-year = (bill's $ amount) ÷ (bill's kWh) × 8.766

Plugging in my bill's numbers, my electricity rate is a staggering $1.51 per watt-year:

$30.29 ÷ 176 kWh × 8.766 = $1.51

With this watt-year value, it's easy to accurately estimate how much your computer (or clock radio or refrigerator) costs per year.
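As a sketch, the conversion and a sample annual-cost estimate look like this in Python (using the bill figures from the example above; the function name is my own):

```python
# Estimate cost per watt-year from an electric bill, then the annual
# cost of running a device continuously.

HOURS_PER_YEAR = 8766      # 24 h/day x 365.25 days/year
WATTS_PER_KILOWATT = 1000  # the bill reports kWh, not Wh

def dollars_per_watt_year(bill_dollars: float, bill_kwh: float) -> float:
    """Cost to run a 1-watt device around the clock for one year."""
    rate_per_kwh = bill_dollars / bill_kwh  # $/kWh
    return rate_per_kwh * HOURS_PER_YEAR / WATTS_PER_KILOWATT

# Example bill from above: $30.29 for 176 kWh
rate = dollars_per_watt_year(30.29, 176)
print(round(rate, 2))     # 1.51
print(round(100 * rate))  # a 100 W device running 24/7: ~$151/year
```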

Computer Cost per Year of Operation

Watts consumed [annual cost, where measured] in each operating state:

State        Mac Mini   PowerBook  iMac G4   iMac G5   eMac 700   WinPC
On           19 [$25]   14 [$18]   38 [$50]  58 [$76]  91 [$120]  108 [$142]
Sleep        –          –          –         –         –          –
DVD View     24         22         53        74        107        115
DVD Rip      37         32         64        72        127        128
Brick Only   0          0          n/a       n/a       n/a        n/a

Impact on Air Conditioning
The above chart represents only the power that the computers themselves consume. But in the warmer months, the heat given off by a computer (or any other in-home electrical device) must be removed by an air conditioner. This is simple thermodynamics.

An optimistic rule of thumb is that a new, very high-efficiency air conditioner consumes about 1/3 of a watt of electricity to remove the heat generated by a 1-watt device.

Additional AC watts = (watts consumed by device) ÷ 3

So, for example, let's say you're using a PC and a monitor that consume about 200 watts of electricity. Practically, all of those 200 watts are released as heat into the room. How many watts "harder" does your AC have to work to remove the heat released by the computer system?

67 watts ≈ (200 watt PC w/ Monitor) ÷ 3

So that 200 watt computer/monitor combo releases enough heat that your AC unit must consume roughly an additional 67 watts of electricity to keep the room cool. This is starting to add up! Leaving your 200 watt computer on actually costs you about 267 watts in the summer months!
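A minimal sketch of this summer rule of thumb (the 1/3 factor is the optimistic high-efficiency-AC estimate from above; the function name is my own):

```python
# Effective summer consumption: the device's own draw plus roughly one
# extra watt of air conditioning for every three watts of heat released.

AC_WATTS_PER_HEAT_WATT = 1 / 3  # optimistic rule of thumb

def summer_effective_watts(device_watts: float) -> float:
    """Device draw plus the AC power needed to remove its heat."""
    return device_watts * (1 + AC_WATTS_PER_HEAT_WATT)

print(round(summer_effective_watts(200)))  # the 200 W example: ~267 W
```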

Impact on Heating
There is some good news: the heat your computer generates does help heat your home in the colder months, reducing the amount of heat your heating system needs to produce. The drawback is that a computer is an inefficient heater compared to a typical home heating system; a heat pump or a fueled furnace is a much more cost-effective way to produce heat. The savings from "heating-by-computer" are highly variable, depending on the type and efficiency of your heating system and the cost of the fuel you're using. Here are some estimated adjustments based on high-efficiency heating systems of various types:

Natural Gas: adjusted watts = watts consumed × 0.65
Oil Furnace: adjusted watts = watts consumed × 0.54
Heat Pump:   adjusted watts = watts consumed × 0.50

So in those bitter cold months when your home heating system is running, and assuming you're using a high-efficiency gas furnace, our example 200 watt computer system is only "costing you in dollars" the equivalent of 130 watts of electricity, because you're getting back 70 watts' worth of home heating value. These calculations are highly dependent on commodity energy prices and the efficiency of your home heating system.
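The winter adjustment can be sketched the same way, using the factors listed above (they assume the high-efficiency systems and fuel prices noted below; the dictionary keys and function name are my own):

```python
# Effective winter cost: scale the device's draw by the heating-offset
# factor for your heating system (factors from the list above).

HEATING_FACTOR = {
    "natural_gas": 0.65,
    "oil": 0.54,
    "heat_pump": 0.50,
}

def winter_effective_watts(device_watts: float, system: str) -> float:
    """Equivalent electricity cost of a device while the heat is on."""
    return device_watts * HEATING_FACTOR[system]

print(round(winter_effective_watts(200, "natural_gas")))  # 130
```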

[The above adjustment calculations are based on the cost of generating heat in a residential heating system as follows:
  • High efficiency gas furnace (97% efficient, $1.55 per therm)
  • High efficiency oil furnace (89% efficient, $2.50 per gallon)
  • Electric Heat Pump (effective 200% efficient, 15¢ per KWh)
  • Electricity: (100% efficient, 15¢ per KWh)
Your heating system may have different efficiency properties, and fuel rates can fluctuate wildly. Although electricity isn't really 100% efficient once generation and transmission losses are considered, 100% efficiency is correct for the purposes of these cost comparisons. Similarly, there is energy expended to move gas and oil from their source, but again, we're attempting an apples-to-apples comparison of home heating costs by focusing on the almighty dollar, not source-to-consumer energy efficiency.]

Computer Details for the above chart
Mini: An Intel-based Mac Mini, 1.66 GHz dual-core CPU, wireless on, Bluetooth on.
PowerBook: A 12" G4 PowerBook, 1.5 GHz CPU, wireless on, Bluetooth on, screen fully dimmed, fully charged.
iMac G4: A 17" iMac G4, 1.0 GHz CPU, no wireless or Bluetooth, screen at normal brightness.
iMac G5: A 17" iMac G5, first generation, 1.8 GHz CPU, wireless on, screen at normal brightness.
eMac 700: A 700 MHz eMac with 640 MB RAM and 802.11b wireless on, including the built-in 17" CRT.
WinPC: An AMD Athlon XP home-built, 1.6 GHz CPU, 512 MB RAM, generic case, Windows XP. Display not measured.

I measured the computers doing a variety of tasks ... from nothing to "heavy usage". I started with just booting the computer, launching a few applications, and watching the meter. I call this the "on" state.

The above chart shows the power consumption numbers I got out of my watt meter. I used the handy and relatively inexpensive Kill A Watt power meter for all measurements. This device measures watts, volt-amps, kWh, frequency, and a bunch of other power attributes. Note that not all the numbers are "fair": the eMac, iMac G4, and iMac G5 have a built-in display that was on and measured, and the PowerBook does battery trickle charging, but its screen was fully dimmed during measurement. (For full screen brightness on the PowerBook, add 5 watts.) The Mini and WinPC's monitors were not measured.

Operating state descriptions:
On: Computer on, user logged in, apps up, CPU load low.
Sleep: Computer in "sleep" or "standby" (Windows) mode.
Off: Computer powered down, but plugged into the wall outlet.
DVD view: Watching a video DVD using OS-provided tools.
DVD rip: Ripping a video DVD to MPEG; CPU(s) fully consumed.
Brick Only: Power brick consumption while detached from its computer (Mini and PowerBook only).

Next, I hope to upgrade my Mac Mini to a Core 2 Duo CPU - stay tuned for more details!


Anonymous said...

What would also be interesting is comparing the use of a machine in comparison to a) the speed of the drives it comes with and b) the wattage of the CPU.

Anonymous said...

1) Is the Athlon one of the older (e.g. energy inefficient) models?

2) You can get fairly impressive savings just by switching to compact-fluorescent lightbulbs; although they cost more up front, they last longer and use much less power.

3) Some utilities charge a different rate for electricity used for heat generation as opposed to "normal" use; heat generation tends to have a rather level draw (or increase at night), and so it can use "base" power produced by big and cheap coal or nuclear plants. These big plants, however, can't change their output very fast. During the day, usage fluctuates a lot, and so the "top watts" have to be provided by smaller, more flexible, but more expensive plants (e.g. gas-turbine). The utilities like electric heating because it evens out the overall use of electricity, and so lets them rely on their base plants more. Therefore, it is cheaper to put 200 watts into a resistive heating system (and get 200 watts of heat) than it is to put 200 watts into your idling PS3 (and also get 200 watts of heat).

Anonymous said...

Nice article. I'm wondering what the constant 8.766 is? Also when quoting 58 watts for a 17" G5, is that with the screen dimmed, screen at the lowest intensity? Did you measure any effects of adjusting screen brightness?

I use my home computer as a web server so it's on 24/7. Interesting comparison of my iMac's electrical costs vs the cost of .Mac (although not enough storage on .Mac to be usable).

LanceJ said...

$ per Watt-Year = (bill's $amount) ÷ (bill's KWH) × 8.766

The 8.766 constant is there to convert the hours and kilowatts provided in your bill into watts and years.

24 hours per day * 365.24 days/year = 8766 hours per year

divided by

1000 watts per Kilowatt

A 1 watt device powered on for an entire (solar) year consumes 8.766 KWH.

Cuvtixo said...

"3) Some utilities charge a different rate for electricity used for heat generation as opposed to "normal" use; heat generation tends to have a rather level draw (or increase at night)"
Can someone explain to me how utilities know what electricity is used for heat generation? I can imagine night vs daytime rates, but I don't think utilities can monitor electricity usage beyond that.