History of overclocking

Last Updated on September 17, 2023 by Dave Farquhar

Overclocking didn’t start in the 90s, and it wasn’t limited to PCs either. Here’s a history of overclocking from a guy who did it some, and talked to guys who did it a lot in the 80s.

I don’t recommend overclocking, and today Microsoft can prove it’s a bad idea. But overclocking has a long and colorful history. It’s less common than it used to be, perhaps. But it’s not completely extinct.

Overclocking certainly got a lot easier in the 90s, but we’ll get to that in a while.

History of overclocking: The 1970s

Overclocking didn’t even start in the 80s. I’ve heard stories that in the 1970s, chip designer Chuck Peddle experimented with defective 6502 CPUs to see how fast they could run. Since the chips didn’t work well enough to sell, he had nothing to lose. I can’t find it documented anywhere else, but I understand he got some chips to run at 16 MHz for a while before they blew up. They were supposed to be 1 MHz chips.

But overclocking 6502-based systems wasn’t exactly common among end users. The rest of the chips in the system also had to be able to handle the higher speed, and frequently they couldn’t. Overclocking only became practical when systems came onto the market that assumed the CPU might not always run at one speed.

History of overclocking: The 1980s

Overclocking happened in the 1980s for fun and profit. Arcade operators would buy speed-up kits to make games harder after players got good at them and started playing longer than before. Sometimes these speed-up kits were just a replacement clock crystal. Swap out the crystal, overclock the board, hope it held up, and profit! No one thought of it as overclocking at the time, but that’s exactly what it was.

On the consumer side, overclocking early home computers wasn’t terribly practical, but it became a bit more practical with PC clones. For that matter, you could overclock a true-blue IBM PC or XT too. There was a project called the PC Sprint, a cheap DIY accelerator board. The instructions explicitly warned you to replace the CPU with a faster one; an NEC V20 was ideal. But if you didn’t, the original 4.77 MHz chip would run at 7.16 MHz, at least for a while. If you swapped the chip, it wasn’t really overclocking; if you didn’t, it was.

Some clone shops would put slower chips into faster motherboards, put a heat sink on them, and sell them cheaply. You could get a 10, 12, or 16 MHz computer for the price of a slower one, until it broke. Some shops explained they were doing this. Others just did it.

Overclocking even happened on true-blue IBM PC/ATs, at least sometimes. I once worked with a guy who serviced IBM PC/ATs, which came from the factory at 6 or 8 MHz. He said he had them running reliably at 10 or even 12 MHz. To do it, you had to pry out the clock crystal and replace it with a faster one. Ironically, overclocking was harder on clones, because their clock crystals were soldered in. Early overclocking often required soldering skills, unless you bought IBM, I guess. IBM later modified the BIOS to keep you from running the machines much faster than 8 MHz, but overclockers would replace the BIOS chips to get around that.

Amiga overclocking

This ad from the back pages of a 1992 issue of Amiga World advertised overclocked CPUs on Amiga upgrade boards. The cheapest board carries an overclocked 20 MHz CPU, priced $100 less than a comparable board with a real 25 MHz CPU advertised elsewhere in that issue. The other two boards run at 36 or 38 MHz. Motorola never officially released a 68030 CPU at either of those speeds.

I first learned of overclocking in the early 1990s. I was a big-time Amigaholic, and we Amigans loved our accelerator boards. In the back pages of magazines, you could find cheaper boards that claimed to be “tested and clocked at” a given speed. What that really meant was that the board had an overclocked CPU on it. The advertisers at the front of the magazine sold boards with CPUs running at their official speeds, at a higher price, of course. Motorola 68000-series CPUs lent themselves well to overclocking, so some people did it, even though the editorial content of the magazines usually warned against it.

What about Halt and Catch Fire?

In season one of the AMC television drama Halt and Catch Fire, overclocking is a subplot. The drama is set in 1983, and the characters’ goal is to produce a faster, cheaper clone of the IBM PC. In one episode, they discuss overclocking the delivered product. In another, they overclock an 8 MHz chip to 12 MHz to see what the performance of the 12 MHz version will be when it comes out.

Realistically, no large company wanted the liability involved with shipping an overclocked mass-market PC. It’s unlikely the scenario presented in the show would have ever been said out loud.

Overclocking to test the performance of a future part is arguably less far-fetched, but also seems to conflict with the show’s storyline. It introduces a dependency on a part that wasn’t out yet when the goal was to get the system to market as rapidly as possible. But I don’t want to get into spoilers.

Suffice it to say the show makes overclocking seem more common in 1983 than it was.

History of overclocking in the 486 era

Overclocking got a lot easier in the days of the 486. By the early 1990s, motherboard makers started putting jumpers on their boards to set the clock speed. That let them sell one board that a reseller could pair with whatever CPU they wanted. It increased the cost of each board marginally, but it reduced the number of SKUs they had to offer and made it easier to sell upgrade CPUs.

Of course, this had an unintended side effect. People figured out they could buy a 25 MHz system, take it home, change the jumper to 33 MHz, and have a 33 MHz system. Later 386 boards had this ability too, but there wasn’t much point in overclocking a 386 because AMD sold 40 MHz 386s for less than Intel charged for 33 MHz parts. Overclocking 486s was more common, since buyers could get a 25 MHz chip, clock it at 33 MHz, and save $100.

At this point overclocking was still a bit of a secret. There were circles of people who knew how to do it. They’d share their knowledge in person, on bulletin boards, or on services like CompuServe, if you knew to ask. But if you didn’t know someone who’d done it, you had to notice the jumper on your motherboard and change it to see what happened.

Overclocking goes mainstream

This is what Tom’s Hardware Guide looked like in 1996. Its front page featured articles on overclocking the system bus and CPUs, plus two benchmarks.

Then along came the Internet. And along came a site called Tom’s Hardware Guide. With it, overclocking became a rite of passage. The history of overclocking didn’t start here, but this was when it came out of the underground.

Tom Pabst was a German doctor living and practicing in England who liked to unwind by playing video games. In 1996, he decided to share his overclocking experience with like-minded DIY computer enthusiasts.

Before he came along, the only way to get that kind of information was from hard-to-find Usenet newsgroups. Even there, you didn’t necessarily find the kind of information he shared on Tom’s Hardware Guide. Pabst is a key figure in the history of overclocking.

This Asus P55T2P4 motherboard from the mid 1990s, intentionally or not, was very good for overclocking. It also had some nonstandard bus speeds. This made it one of the first popular enthusiast boards.

He discovered that certain Asus motherboards overclocked more reliably than other boards, and one board in particular, the P55T2P4, had jumper settings for nonstandard speeds. It offered 75 and 83 MHz buses before AMD and Cyrix adopted those speeds. Pabst found that a Pentium-166 running at 83 MHz x 2 was faster than a standard Pentium 200 running at 66 MHz x 3. And if you really wanted to push it, you could run a Pentium at 83 MHz x 2.5 to reach 208 MHz.
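
If you want to see the arithmetic behind those numbers, here’s a minimal sketch in Python. It’s purely illustrative, not anything Pabst published: a Socket 7 CPU’s core clock was simply the front-side bus speed multiplied by the multiplier setting on the board.

```python
# Illustrative only: Socket 7 core clock = front-side bus speed x multiplier.
# The bus speeds and multipliers below are the ones discussed above; real
# parts used a 66.6 MHz bus, and the P55T2P4 added 75 and 83.3 MHz settings.
configs = [
    ("Pentium 166, stock (66 x 2.5)",      66.6, 2.5),
    ("Pentium 200, stock (66 x 3)",        66.6, 3.0),
    ("Pentium 166 overclocked (83 x 2)",   83.3, 2.0),
    ("Pentium 166 overclocked (83 x 2.5)", 83.3, 2.5),
]

for name, bus_mhz, multiplier in configs:
    core_mhz = bus_mhz * multiplier
    print(f"{name:38s} -> {core_mhz:6.1f} MHz core on a {bus_mhz:4.1f} MHz bus")
```

Note that the 83 x 2 setting lands at roughly the same core clock as a stock Pentium 166 and well below a Pentium 200, so the speed advantage Pabst measured presumably came from memory and cache running off the faster 83 MHz bus rather than from the core clock itself.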

With the right motherboard, high-quality memory, and a tolerant video card, Pabst’s tweaks would give you the fastest computer on the block. He even offered some advice on cooling, beyond the standard heatsink and fan that came with your CPU. Air cooling ruled the day in this era, although some experimented with Peltier coolers. Liquid cooling wasn’t on the radar yet.

CPU yields: Intel’s dirty secret

Pabst also explained processor yields: Intel tested chips and marked them at given speeds, but once it worked out the initial kinks in production, most chips marked 75 MHz could run just fine at 100 or even 120 MHz, and most 120 MHz chips could run at higher speeds still. He expressed disappointment that the 120 MHz chip he bought for his own gaming rig would only run at 166 MHz and wasn’t reliable at 200 MHz.

Pabst became a polarizing figure and he eventually grew tired of the PC hardware enthusiast community. But there’s little doubt that his early articles were the first most people heard of overclocking. His site soon spawned dozens of imitators. Many have come and gone, but all owe their roots to Tom’s Hardware Guide.

Intel fights back

Of course Intel wasn’t too keen on the phenomenon. The Pentium-200 was their highest margin product. So they didn’t like people buying 120 MHz CPUs and overclocking them to get the same performance for hundreds of dollars less. They didn’t like people buying the cut-rate Pentium 75 and turning it into a midrange CPU either.

So as subsequent CPU generations wore on, Intel limited the options for overclocking by doing things like locking CPU multipliers.

The Celeron

I understand if you overlook the Celeron, but that chip illustrates the overclocker attitude better than any other. In 1998, Intel released the Celeron to compete with cheap CPUs from AMD and Cyrix. To make it, they took all of the Level 2 cache off a Pentium II, clocked it at 266 or 300 MHz, and sold it at an AMD-like price. But due to the lack of L2 cache, it was slower than a previous-generation Pentium running at 233 MHz, let alone an AMD or Cyrix chip running at 266 or 300 MHz.

But the L2 cache was the Pentium II’s limiting factor in overclocking. So a 266 MHz Celeron ran happily at 400 MHz, or potentially even 450 MHz, the same speed as the fastest Pentium II at the time. It was slower than a Pentium II at the same speed, but it worked well for 3D gaming.

But the only people who bought those Celerons were overclockers and people who didn’t know anything about computers. So Intel released a new chip six months later with L2 cache built in. It was still slower than a P2, but faster, at least in terms of clock speed, than a comparable AMD or Cyrix chip. It also overclocked pretty well, though not to the levels the early Celerons did, since the cache was the limiting factor. So buying cheap Celerons and making them run like Pentium IIs and Pentium IIIs was a thing for a while.

AMD followed Intel’s lead and locked its multipliers, but people found they could defeat the lock by reconnecting bridges on the chip package with conductive ink or even a graphite pencil.

The high end

But over time the need for overclocking decreased. Computers got faster but the software didn’t get much more demanding. So as time went on, people stopped overclocking cheap CPUs. Instead, they bought middle- and high-end CPUs and overclocked them to get maximum performance for high-end gaming.

Intel and AMD probably had a bit of a hand in this, by leaving certain performance-enhancing features out of their low-end CPUs entirely, driving enthusiasts upmarket. Maybe they’d buy the cheap i7 and make it run like the faster i7, but they had to meet Intel halfway to do it. You can buy a $50 CPU today and overclock it, but you won’t get the performance of the $400 CPUs by doing it, no matter what you do.

As overclocking grew more difficult, enthusiasts adopted more and more extreme cooling measures, such as liquid cooling. As games became more graphics intensive, overclocking the chips on the video card also became a thing. Overclocking the GPU on the video card often gives a better return on investment than overclocking the CPU, and GPU makers traditionally have been less hostile to overclocking than CPU makers.

Overclocking today

In theory, overclocking should have gotten harder over time. Intel and even AMD tried to make it that way, but it didn’t really. Back in the Socket 7 days, Abit one-upped Asus by creating the IT5H, a motherboard that could overclock without changing any jumpers. That started a game of leapfrog that continued into subsequent CPU generations. Intel kept limiting the options, but companies like Abit, Asus, and Gigabyte did what they could to make whatever remained possible as easy as possible. I have boards with a BIOS option to overclock the CPU automatically. It’s as easy as choosing which hard drive to boot from.

These days overclocking is almost expected. It’s usually not possible to do the same kind of ridiculous overclocks of the past, running a chip at 16 times its rated speed like Chuck Peddle may have. But it’s very possible to get a bump to the next level of CPU performance if you buy the right motherboard. I don’t recommend it, but it’s so easy to do that some enthusiasts expect you to do it.

So the history of overclocking isn’t over, by a long shot. People always want to get something for nothing, or something for less, at least. So the practice will be difficult to end entirely.

