I’ve never been a big fan of overclocking. I overclocked for a couple of weeks back in my Pentium-75 days but quit when my system started acting goofy. I did it again five years ago when I was writing my book, because, well, everyone expected me to talk about overclocking in it. So I overclocked again, and tried to use that overclocked machine in the process of writing a book. That foray didn’t last much longer.
I explicitly recommended against overclocking in my book, based on my experience with it. Now, some five years later, we have an analysis from a Microsoft engineer, based on what he found when analyzing crash dumps people had sent in when they pushed the “send error report” button. There’s a lot of technical jargon in his analysis. I know enough about assembly language to make a Commodore 64 flash lots of colors on the screen, so I know just enough to translate his code examples into English.
Basically, what the examples indicate is that the overclocked processor in question knew that two values were set to the same number, was told to do something if the two values were equal and something else if they weren’t equal, and the CPU did… the something else. Suddenly five wasn’t equal to five.
The second example he cites is just a sneaky way to set something to zero. These overclocked processors weren’t able to do that reliably.
Let me put this a different way. In 1989 or 1990, I read a magazine article in the late, great Compute magazine about CPUs. At the time, the Motorola 68030 was one of the fastest CPUs on the market. The author asked a Motorola engineer how Motorola made a 50 MHz CPU (which at the time was mind-blowing). Know what he said? He said they started out by taking a 33 MHz CPU, running it at 50 MHz until it broke, and then they looked at what broke and tried to find a way to make those parts stronger.
A lot of people encourage overclocking because they say it’s harmless. That quote from a Motorola engineer notwithstanding, I think it really depends on what it is that you’re doing. Some would argue that if all you do is play games, go ahead and overclock, and if you toast your CPU in a year, well, next year’s hottest game will need a newer CPU anyway. Having had a computer crash in the middle of a game where I was doing well, I’m not so keen even on that idea.
I’m certainly not going to overclock anything that’s going to have my financial information on it. When I’m doing my monthly budget, I need my computer to know that five equals five.
I didn’t recommend overclocking in 1999, and with what CPU prices have done in the past six years, if anything, it makes less sense now.