How to decide if a computer upgrade will pay for itself in power savings

Last Updated on April 17, 2017 by Dave Farquhar

I occasionally read an offhand comment where someone says he or she just bought a new computer, and the new computer is so much more power-efficient than the old one that it’s going to pay for itself.

I wonder if they did the math, or if that’s just what the salesperson told them. I can see circumstances where that assertion would be true, but it typically would involve extremes, like replacing an aged Pentium 4 computer with, well, a netbook. They probably didn’t do that.

Part of the reason I got into computers professionally was because I was tired of hearing lies from salespeople and technicians. So let’s just take a look at this claim.

The first key is understanding the math. Math isn’t my strong point, but hopefully I can explain it. Computer power consumption is measured in watts. Electricity is billed by the KWH (kilowatt-hour). Multiply the computer’s wattage by the number of hours it’s powered on, divide that number by 1,000, then multiply the result by your cost per KWH. Check your electric bill to find your cost per KWH, or if you can’t do that, do a Google search to find out the average cost per KWH in your state.
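
If it helps to see that formula spelled out, here’s a quick sketch in Python. The function name and the example numbers (a 100-watt machine at Missouri’s average rate) are mine for illustration only; plug in your own wattage, hours, and rate.

```python
# A quick sketch of the formula above: watts times hours powered on, divided
# by 1,000 to get KWH, times your rate per KWH. Names and example figures
# are illustrative, not taken from any particular machine.

def annual_cost(watts, hours_per_year, cost_per_kwh):
    """Yearly cost in dollars of running a device at a steady wattage."""
    kwh = watts * hours_per_year / 1000   # watt-hours -> kilowatt-hours
    return kwh * cost_per_kwh

# Example: a 100-watt computer running 24/7 at 8.32 cents per KWH
print(round(annual_cost(100, 8760, 0.0832), 2))   # about $72.88 per year
```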

Unfortunately, the only way to determine your computer’s power consumption is to measure it with a meter, such as a Kill-a-Watt, while the computer is in use. And to figure out whether a new computer will pay for itself, as alleged, you have to measure both the old one and the new one. You can’t measure something you haven’t bought yet.

So here’s how I suggest going about taking a guess.

To get a crude estimate on the low end, you’ll have to compare the CPUs. The Thermal Design Power (TDP) of pretty much any CPU made in the last couple of decades is easy to find with a Google search. There are problems with this approach, which we’ll explore in a minute, but it at least gives a reality check.

Let’s take my web server as an example. It has an AMD Athlon XP 2200+ CPU in it. A Google search indicates it has a TDP of about 63 watts.

The server is powered on all the time, 8760 hours per year. So multiply 63 by 8760, which is 551880. Divide that by 1000, and we get 551.88 KWH.

In Missouri, electricity costs 8.32 cents per KWH on average. So multiply 551.88 by 0.0832, and I find that CPU costs $45.91 per year to operate.

If I were going to replace that CPU, I might as well go all out and replace it with an AMD Fusion E350, which has a TDP of 18 watts. 18 watts * 8760 hours is 157680. Divide that by 1000, and we get 157.68 KWH. Multiply that by .0832, and the Fusion costs $13.12 per year to operate.

So that new motherboard would save me $32.79 per year. The motherboard would cost me $135, so at that rate, it would pay for itself in a little over four years.

That’s a little long. I keep my servers a long time, but I’d like to see a faster payoff than that.

How about an Intel Atom? We’re looking at a TDP of 13 watts, which translates into a yearly operating cost of $9.48, a yearly savings of $36.43. I can get an Atom board for $70, so it would pay for itself in less than two years.
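
Here’s that comparison as a short script, reusing the same formula. The TDP figures and board prices are the ones quoted above, and the payback is simply price divided by yearly savings, so treat it as a rough sketch rather than a guarantee.

```python
# Rough payback comparison using the TDP figures and board prices above.
COST_PER_KWH = 0.0832    # Missouri average quoted in this post
HOURS_PER_YEAR = 8760    # the server runs 24/7

def annual_cost(watts):
    return watts * HOURS_PER_YEAR / 1000 * COST_PER_KWH

old_cost = annual_cost(63)   # Athlon XP 2200+, roughly 63 W TDP

for name, tdp, price in [("AMD Fusion E-350 board", 18, 135),
                         ("Intel Atom board", 13, 70)]:
    savings = old_cost - annual_cost(tdp)
    years = price / savings
    # Prints roughly the yearly savings discussed above, and payback of
    # a little over four years for the Fusion vs. under two for the Atom.
    print(f"{name}: saves ${savings:.2f}/year, pays for itself in {years:.1f} years")
```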

This is a crude measurement. Although the Atom CPU itself uses less power, a complete Fusion motherboard uses less power than an Atom board, because the Fusion’s supporting chips are a bit more efficient than the Atom’s. But this number is pretty easy to arrive at, and it provides something of a reality check. If there are big savings to be had, this method will give you an indication of it.

To get a measurement on the higher end, you have to be willing and able to take a system inventory, then plug your system’s components into a power supply calculator. Then plug the components you’re considering for the replacement system into it, and compare the difference.

In my case, the calculator recommends a 224W power supply for the Athlon XP-based system and a 112W power supply for an Atom-based system. That’s a difference of 112 watts. Stunning. Doing the math, at Missouri prices, 112 watts equates to a savings of $81.63 per year. I can’t run the numbers on Fusion because it’s not in the calculator yet.
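
The arithmetic for the whole-system estimate is the same formula applied to the wattage difference. A minimal sketch with the two calculator figures above:

```python
# Same formula applied to the power supply calculator's whole-system
# estimates: 224 W for the Athlon XP build, 112 W for the Atom build.
watts_saved = 224 - 112
annual_savings = watts_saved * 8760 / 1000 * 0.0832
print(f"${annual_savings:.2f} per year")   # about $81.63
```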

This is a crude measurement too, but at least it involves crude measurements of all the components in a system. Some generations of motherboard chipsets and video cards are more efficient than others. But look at the difference between the two crude measurements: $36.43 if I consider just the CPU, versus $81.63 if I consider the whole system.

So based on that math, yes, I should be ordering a new motherboard for my server.

And if I do, then I can plug a meter into each computer, measure what each system really consumes under the loads I throw at it, and then I’ll know if I made a mistake. Because this method of calculation assumes the system is under a heavier load than it probably really is. The video card in my server, for example, isn’t working hard at all, since it spends most of its life displaying a login prompt. In text mode.

But the difference isn’t always that dramatic. Micro Center is running a special right now on an AMD Phenom II X2 560 CPU bundled with a motherboard for $89. Running a minimal configuration with onboard video through the power supply calculator, I get 200 watts. The 24-watt difference is nice, and the actual savings (measured with a meter) may be more than that, since the newer CPU won’t have to work as hard to accomplish the same work. But by this crude arithmetic, I’d save $17.49 per year. I wouldn’t be able to reuse my old memory, so by the time I paid for new memory, I’d have to keep the system six years for it to pay for itself.
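
For a deal like that, the extra parts you can’t reuse belong in the payback math too. Here’s the same calculation with a hypothetical memory price added; I didn’t quote one above, so the $20 figure below is purely an assumption for illustration.

```python
# Payback for the Phenom II X2 560 bundle, counting memory that can't be
# reused. The $20 memory price is an assumption, not a quoted figure.
annual_savings = (224 - 200) * 8760 / 1000 * 0.0832    # roughly $17.49/year
total_cost = 89 + 20                                   # bundle plus assumed memory
print(f"Pays for itself in {total_cost / annual_savings:.1f} years")   # about 6.2
```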

And we’re just talking motherboard swaps here, not buying a whole new $400 system.

So whether a new computer pays for itself just in power savings really depends. You can’t just make a blanket statement that if you replace a Pentium 4-based PC with a Core i-based PC that it’ll pay for itself. It depends on the CPU you’re replacing, what you’re replacing it with, how many hours the system is powered on, how hard you make the system work, and what part of the country you live in. In New England and California, where electricity costs almost twice what it costs in Missouri, it’s an easier sell. In Hawaii, where electricity costs an eye-popping 31 cents per KWH, it’s an even easier sell. It’s a tougher sell in states like Idaho and North Dakota.
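
To put numbers on that, here’s the Atom-board payback recalculated at a couple of the rates mentioned above. Both rates are the rough figures quoted in this post, so treat them as dated numbers rather than current utility data.

```python
# The same Atom-board payback at different electricity rates. Rates are
# the rough figures quoted in this post, not current utility data.
watts_saved = 63 - 13    # Athlon XP TDP minus Atom TDP
board_price = 70

for state, rate in [("Missouri", 0.0832), ("Hawaii", 0.31)]:
    savings = watts_saved * 8760 / 1000 * rate
    # Missouri comes out to roughly a two-year payback; Hawaii, about half a year.
    print(f"{state}: ${savings:.2f}/year, payback in {board_price / savings:.1f} years")
```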

Do I believe that some people can buy a new computer and have it pay for itself in energy savings? I do now. Do I believe that everyone who makes that claim has actually done the math? No.
