One of my three housemates and I have been running servers in our house for over 5 years now. Because of the Northpoint bankruptcy in spring 2001, we went a whole month without an internet connection - the first time in 5 years that the servers were off for an extended period. I manage the PG&E bill for the house, and when that month's electric bill came in much lower than in other months, I realized for the first time just how much power a continuously running computer uses.

We'd been simply splitting the bill 4 ways, but I now felt bad for the other 2 housemates, who are not geeks and don't have servers, and thus shouldn't be paying for them. So I devised a way to compute the shares more equitably. At first I based the calculations on the electricity we used during that server-less month of April. I took that number of kilowatt-hours and split it 4 ways, and then divided the remaining amount (whatever a given month's usage exceeded that baseline by) proportionally, based on who was running how many servers (e.g. if the remainder was 600 kWh and I was running 1 server and he was running 2, then he would pay for 400 and I would pay for 200).
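
Here's a little Python sketch of that first scheme. The names and the 800 kWh baseline are made-up placeholders, just to make the 600 kWh example above concrete:

    # First scheme: the April (servers-off) usage is the shared baseline,
    # split evenly; anything above it gets divided up by server count.
    def split_bill_v1(total_kwh, baseline_kwh, servers_per_person):
        shared = baseline_kwh / len(servers_per_person)   # everyone's even share
        extra = total_kwh - baseline_kwh                  # blamed on the servers
        total_servers = sum(servers_per_person.values())
        shares = {}
        for person, n in servers_per_person.items():
            server_part = extra * n / total_servers if total_servers else 0
            shares[person] = shared + server_part
        return shares

    # The 600 kWh example from above (my 1 server vs. his 2), on top of a
    # made-up 800 kWh baseline split evenly four ways.
    print(split_bill_v1(1400, 800, {"me": 1, "geek": 2, "nongeek1": 0, "nongeek2": 0}))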

Well, this worked for several months, but then it started to seem not quite accurate - maybe because the days were shorter and we were running the lights more, or electric heaters, or whatever. So we decided to do the opposite: subtract the amount the servers use, then split the rest 4 ways, since "the rest" is the highly variable and unpredictable part, depending on things like the weather and how many housemates are in town for how long that month. But this new algorithm depended on one crucial number - how much electricity does an average server use?
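
In the same sketchy Python, the revised scheme looks something like this (per_server_kwh is the unknown; the rest of this post is about pinning it down):

    # Second scheme: charge each server owner a flat per-server amount,
    # then split whatever is left over evenly among everyone.
    def split_bill_v2(total_kwh, servers_per_person, per_server_kwh):
        server_charges = {p: n * per_server_kwh
                          for p, n in servers_per_person.items()}
        rest = total_kwh - sum(server_charges.values())
        shared = rest / len(servers_per_person)
        return {p: shared + charge for p, charge in server_charges.items()}

    # e.g. split_bill_v2(1000, {"me": 1, "geek": 2, "nongeek1": 0, "nongeek2": 0},
    #                    per_server_kwh)   # once we know per_server_kwh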

My first calculations were based on the fact that most full-sized PC-style computers have 250 Watt power supplies. However, I knew that was a maximum rating, and that the machines probably weren't eating up that entire amount. But how much of it was actually being used?

We finally asked our friend Dean Gaudet, who used to have a job doing these kinds of measurements (he is also one of the main programmers in the Apache project). Here's what he had to say:

...for modern PC hardware you can budget something like this:
  • 8 to 10W per 3.5" disk (ide or scsi ... older drives = up to 3x more, you can actually look up manufacturer specs)
  • 0.25W per memory *chip* (you have to count the number of chips total on all the DIMMs, typically 8 or 16 for non-ecc, 9 or 18 for ecc... sometimes as high as 32/36!)
  • 40W per CPU (this is too much for some, too little for others)
  • throw in 10W per expansion card... just as a guess.
  • 20W for the rest of the motherboard (although more than 2 or 3 fans and you should start adding in 2W per fan)
  • hmm, i've never measured cd-roms or floppy drives!

that'll get you your DC wattage, pre-power supply... these things are sometimes as inefficient as 50%... but as a rough guess take something like 75%, so multiply your above total by 4/3 to get your AC wattage.

or, go buy an AC Amp Clamp Meter ... they're around $50 for 0.1A accuracy. they measure power by measuring the magnetic inductance around a single hot wire. the difficulty of using this is getting at a single hot wire without the other hot wire cancelling the magnetic field. you can do this a number of ways... one is to open up the electrical panel and put the clamp around just one of the hot wires on a circuit, that'll get you the entire circuit's draw. another way is to peel off the outer insulation on a computer power cord so that you can separate the three wires inside (they're still shielded, you just need to get access to one of the live wires).

i'm guessing you probably know that power = volts * amps, but i thought i'd repeat it in case :) you have to guess that the voltage is around 115V unless you also get a volt-meter and measure it.

i've got another $1000 piece of lab gear which measures much more accurately, and you plug your device into it... i've no bloody idea why this stuff costs so much. you'd think with the energy crunch someone would have marketed a device which you just plug things into at the wall and it measures the draw and total. i know of many people who are in the same situation as you wondering what their box is consuming per month.

Thanx, Dean!
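
To make Dean's budget concrete, here's a little calculator that adds up his per-component numbers and applies the 75%-efficiency fudge factor. The component counts in the example are just a guess at a typical box of ours, not a real inventory:

    # Dean's rough per-component DC budget, in watts.
    DISK_W = 9         # 8-10W per 3.5" disk
    RAM_CHIP_W = 0.25  # per memory chip (count chips, not DIMMs)
    CPU_W = 40
    CARD_W = 10        # per expansion card, his guess
    MOBO_W = 20        # the rest of the motherboard, 2-3 fans included

    def ac_watts(disks, ram_chips, cpus, cards):
        dc = (disks * DISK_W + ram_chips * RAM_CHIP_W
              + cpus * CPU_W + cards * CARD_W + MOBO_W)
        return dc * 4 / 3   # assume a ~75% efficient power supply

    # A modest box: 2 disks, 16 RAM chips, 1 CPU, 1 expansion card.
    print(round(ac_watts(2, 16, 1, 1)), "W at the wall")   # ~123 W

    # Sanity check against a clamp meter: watts ~= 115 V * measured amps,
    # so a (hypothetical) 1.1 A reading would be about 126 W.
    print(115 * 1.1, "W")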

Using these numbers, I plugged-and-chugged for our various servers (all of which have only 1 or 2 drives and not a huge amount of RAM, etc.), and the average came out to about 90 kWh per month. So now I charge myself and my geek housemate for 90 kWh per server, and then we split the rest of the usage four ways. At current PG&E rates of about 15 cents a kWh, that means it costs about 14 bucks a month to power your average server in San Francisco. (Note that that rate includes all the stupid hidden taxes and fees that get tacked onto the bill; it's not the "raw" rate.)
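
For the curious, the plug-and-chug is just unit conversion - watts running around the clock, times hours in a month, divided by 1000 - using the ~123 W estimate from the sketch above:

    watts = 123                       # AC draw from the estimate above
    hours = 24 * 30                   # a 30-day month
    kwh_per_month = watts * hours / 1000
    cost = kwh_per_month * 0.15       # ~15 cents/kWh, taxes and fees included
    print(round(kwh_per_month), "kWh =", f"${cost:.2f}")   # ~89 kWh, about $13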

There, now all you non-geeks living with servergeeks can have an extra burrito or two each month instead of paying for your housemates' servers! heh....