Overclocking didn’t start in the 90s, and it wasn’t limited to PCs either. Here’s a history of overclocking from a guy who did it some, and talked to guys who did it a lot in the 80s.
I don’t recommend overclocking, and today Microsoft can prove it’s a bad idea. But overclocking has a long and colorful history.
Overclocking certainly got a lot easier in the 90s, but we’ll get to that in a while.
Overclocking didn’t even start in the 80s. I’ve heard stories that in the 1970s, chip designer Chuck Peddle experimented with defective 6502 CPUs to see how fast they could run. Since the chips didn’t work well enough to sell, he had nothing to lose. I can’t find it documented anywhere else, but I understand he got some chips to run at 16 MHz for a while before they blew up. They were supposed to be 1 MHz chips.
It was a common practice in the 1980s, especially with PC clones. Clone shops would put slower chips into faster motherboards, put a heat sink on them, and sell them cheaply. You could get a 12 MHz computer for the price of an 8 MHz one, until it broke. Some shops explained they were doing this. Others just did it.
I worked with a guy who worked on IBM PC/ATs, which ran at 8 MHz from the factory. He said he had them running at 10-12 MHz reliably. To do it, you had to desolder the clock crystal and replace it with a faster one. It was only four solder points, but it was also a $3,000 computer, so it wasn’t a modification for the faint of heart.
I first learned of overclocking in the early 1990s. I was a big-time Amigaholic, and we Amigans loved our accelerator boards. In the back pages of magazines, you could find cheaper boards that would claim to be “tested and clocked at” a given speed. What that really meant was that the board had an overclocked CPU on it. The advertisers at the front of the magazine sold boards with CPUs running at their official speeds, at a higher price, of course.
Enter the 486
Overclocking got a lot easier in the days of the 486. By the early 1990s, motherboard makers started putting jumpers on the boards for clock speed. This let them sell one board and a reseller could put whatever CPU they wanted on it. People started figuring out they could buy a 25 MHz system, take it home, change the jumper to 33 MHz, and have a 33 MHz system. Later 386 boards also had this ability. But there wasn’t much point in overclocking a 386 because AMD sold 40 MHz 386s for less than Intel charged for 33 MHz.
At this point overclocking was still a bit of a secret. There were circles of people who knew how to do it. But if you didn’t know someone who’d done it, you had to notice that jumper on your motherboard and change it to see what happened.
Overclocking goes mainstream
Then along came the Internet. And along came a site called Tom’s Hardware Guide.
Tom Pabst was a German doctor living and practicing in England who liked to unwind by playing video games. He decided in 1996 to share his experience with overclocking.
He discovered that certain Asus motherboards overclocked more reliably than other boards, and one board in particular, the P55T2P4, had some jumper settings for nonstandard speeds. It had 75 and 83 MHz buses before AMD and Cyrix decided to try to use those speeds. Pabst found that a Pentium-166 running at 83 MHz x 2 was faster than a standard Pentium 200 running at 66 MHz x 3. And if you really wanted to push it, you could run a Pentium at 83 MHz x 2.5 to reach 208 MHz.
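The arithmetic behind those jumper settings is simple: the CPU’s core clock is the front-side bus speed multiplied by the CPU’s clock multiplier. A minimal sketch of the configurations mentioned above (using the nominal 66 and 83 MHz bus figures; the real Pentium bus was 66.67 MHz, which is why 66 x 3 marketed as 200 MHz):

```python
# Core clock = front-side bus speed x CPU multiplier.
def core_clock(bus_mhz, multiplier):
    return bus_mhz * multiplier

# Stock Pentium 200: nominal 66 MHz bus, 3x multiplier.
print(core_clock(66, 3))    # 198 (marketed as 200 MHz; the bus was really 66.67)

# Overclocked Pentium-166: same 166 MHz core, but a much faster 83 MHz bus,
# which sped up memory and cache traffic enough to beat the stock 200.
print(core_clock(83, 2))    # 166

# The aggressive setting: 83 MHz bus with a 2.5x multiplier.
print(core_clock(83, 2.5))  # 207.5 (the "208 MHz" configuration)
```

The point of the example is that the bus speed, not just the core clock, determined real-world performance, which is why a 166 MHz chip on an 83 MHz bus could outrun a 200 MHz chip on a 66 MHz bus.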
With the right motherboard, high-quality memory, and a tolerant video card, Pabst’s tweaks would give you the fastest computer on the block.
Pabst became a polarizing figure and he eventually grew tired of the PC hardware enthusiast community. But there’s little doubt that his early articles were the first most people heard of overclocking.
Intel fights back
Of course Intel wasn’t too keen on the phenomenon. The Pentium-200 was their highest margin product. So they didn’t like people buying 120 MHz CPUs and overclocking them to get the same performance for hundreds of dollars less. They didn’t like people buying the cut-rate Pentium 75 and turning it into a midrange CPU either.
So as subsequent CPU generations wore on, Intel limited the options for overclocking by doing things like locking CPU multipliers.
I understand if you overlook the Celeron, but that chip illustrates the overclocker attitude better than any other. In 1998, Intel released the Celeron to compete with cheap CPUs from AMD and Cyrix. To make it, they took all of the Level 2 cache off a Pentium II, clocked it at 266 or 300 MHz, and sold it at an AMD-like price. But due to the lack of L2 cache, it was slower than a previous-generation Pentium running at 233 MHz, let alone an AMD or Cyrix chip running at 266 or 300 MHz.
But the L2 cache was the Pentium II’s limiting factor in overclocking. So a 266 MHz Celeron ran happily at 400 MHz, or potentially even 450 MHz, the same speed as the fastest Pentium II at the time. It was a lot slower than a Pentium II at that speed, but it worked well for 3D gaming.
But the only people who bought those Celerons were overclockers and people who didn’t know anything about computers. So Intel released a new chip six months later with L2 cache built in. It was still slower than a P2, but faster than a comparable AMD or Cyrix chip. It also overclocked pretty well. So buying cheap Celerons and making them run like Pentium IIs and Pentium IIIs was a thing for a while.
The high end
But over time the need for overclocking decreased. Computers got faster but the software didn’t get much more demanding. So as time went on, people stopped overclocking cheap CPUs. Instead, they bought middle- and high-end CPUs and overclocked them to get maximum performance for high-end gaming.
In theory, overclocking should have gotten harder over time. Intel and even AMD tried to make it that way. But it didn’t, really. Back in the Socket 7 days, Abit one-upped Asus by creating the IT5H, a motherboard that could overclock without changing any jumpers. That started a bit of a game of leapfrog that continued into subsequent CPU generations. Intel kept limiting the options. But companies like Abit, Asus, and Gigabyte did what they could to make whatever remained possible as easy as possible. I have boards with an option in the BIOS to overclock the CPU automatically. It’s just as easy as choosing what hard drive to boot from.
These days overclocking is almost expected. It’s usually not possible to do the same kind of crazy overclocks of the past, running a chip at 16 times its rated speed like Chuck Peddle may have. But it’s very possible to get a bump to the next level of CPU performance if you buy the right motherboard. I don’t recommend it, but it’s so easy to do that I expect a lot of people do it.