Why Intel stopped making motherboards

Last Updated on September 13, 2023 by Dave Farquhar

For nearly two decades, Intel was a go-to brand not just for CPUs but also for motherboards. Then, in 2013, Intel pulled out of the market, ending an era. Here’s why Intel stopped making motherboards.

Intel saw motherboard production as a way to protect its brand identity more than as a profit center. Once the industry had several other companies producing motherboards that met acceptable quality standards, Intel had little reason to stay. The key to understanding Intel’s motherboard business is understanding Intel’s mindset. Intel will introduce products just to sell or protect another product, then leave that market when the product no longer needs that support.

The Intel Inside campaign was about more than CPUs

This Intel Aladdin motherboard appeared in computers from Gateway and other manufacturers. Selling quality motherboards helped Intel protect its brand reputation. But by 2013, the problem Intel motherboards solved had pretty much gone away.

Intel started life as a maker of computer memory, as in RAM. Not CPUs. CPUs literally didn’t exist yet. But after two different companies asked Intel to produce one, Intel agreed to give CPUs a try, mostly because it knew a CPU would make it easier to sell memory. The first desktop computer wasn’t marketed as such, and it didn’t end up using Intel’s CPU, but it wasn’t long before a young computer scientist named Gary Kildall got his hands on the chip. Kildall created the CP/M operating system, which helped popularize Intel’s early 8-bit CPUs.

Intel quickly moved on to a 16-bit CPU. One of the chips in that new line ended up powering the IBM PC. IBM wanted to ensure a reliable, steady supply of CPUs, so it required Intel to license the 8088 to other companies. That way, IBM could buy identical chips from someone else, or make them itself, if Intel couldn’t supply enough for whatever reason. Intel lined up several alternative sources, including AMD. Yes, AMD started out making Intel CPUs under license, not as a company that cloned Intel’s CPUs.

Early Intel clones

While AMD was making chips as an official licensee of Intel, other companies were cloning Intel chips. Zilog produced a chip called the Z-80, which was compatible, software-wise, with the Intel 8080, and it also contained some enhancements. CP/M was created on Intel CPUs, but the majority of CP/M computers ended up running on a Z-80 instead.

NEC made its own clones of the Intel 8086 and 8088 CPUs, called the V30 and the V20, respectively. These chips were completely pin-compatible with their Intel counterparts and were slightly more efficient, so they ran 15-20 percent faster when running at the same clock speed. The NEC V20 and V30 were a cheap upgrade for XT-class PCs, and some clone makers used NEC chips instead of Intel 8088s.

This may have been a bit of an annoyance to Intel, but Intel didn’t see itself as a CPU maker, even in the early 1980s. Intel was selling CPUs so it could sell memory. Intel even sold memory expansion boards for IBM PCs that plugged into an ISA expansion slot and provided up to 8 megabytes of memory. They were populated with Intel memory chips, of course.

If companies wanted to buy a non-Intel CPU, Intel was still happy to sell the memory. But things started to change in the late 1980s. Two things happened.

Japanese memory and the Intel 386 CPU

Prior to the mid 1980s, almost every US company that made computer chips of any type also made memory. Intel made RAM chips. AMD made RAM chips. So did Texas Instruments, Motorola, and a scrappy company in Idaho named Micron, among others.

As the 1980s wore on, Japanese chipmakers started entering the memory market. And one by one, the US chipmakers independently concluded that the Japanese were better at making memory than they were. Intel was neither the first nor the last to come to this conclusion. By 1989, Micron was the only US company still making memory chips on its own. The rest either partnered with Japanese companies or left the market entirely.

Meanwhile, the Intel 386 provided an opportunity. IBM refused to use the chip when Intel released it in 1985. But Compaq did, shipping its Deskpro 386 in 1986. IBM’s rejection probably seemed like a huge blow to Intel at first, but it ended up transforming both companies in ways few expected at the time. Intel rose to new heights, and it was the start of IBM’s fall.

Since IBM didn’t want to use the 386, Intel had no reason to license it to anyone else, and Compaq wasn’t big enough in 1986 to make that kind of demand. Intel had the 386 market all to itself. Andy Grove recommended that Intel exit the memory business and focus on CPUs, which was a controversial decision at the time, but it transformed Intel from just another chipmaker into a Fortune 50 company.

But in the meantime, Intel had a problem. Two problems.

The Intel Inside campaign

Intel didn’t license the 386 design to anyone else. But AMD wasn’t willing to quietly fade away, selling 8088 and 80286 CPUs until the market for those chips disappeared. AMD decided to produce 386 CPUs without Intel’s blessing. Intel sued, of course, but AMD bet it could win. It took some time, but for a couple of years, Intel had to share the lucrative 386 market with AMD. And by then, Intel no longer had memory chips to sell you if you bought your CPU from AMD.

But the clone market presented both a problem and an opportunity for Intel. Starting in the mid 1980s, lots of companies sprang up producing motherboards to supply small PC clone makers like Dell. Dell isn’t small anymore, of course, but it started out as one of countless companies building PCs from off-the-shelf parts that were knockoffs of the parts IBM used. Michael Dell was hardly unique. Every major city had dozens of Michael Dells buying parts to build knockoff PCs. They just didn’t sell them over the phone to a national audience like Dell did, or at least not at the scale Dell achieved.

For Intel, this was good and bad. Of course there was tremendous opportunity, with thousands of storefront-type dealers selling cut-rate PCs. But if they all used AMD CPUs (or a later competitor like Cyrix), that was going to be a problem. And if they were all the quality of the white-label canned peas you find on the bottom shelf of the grocery store, that was going to be a problem.

Dealing with AMD

Intel sued AMD, of course, but didn’t win many concessions in court, especially regarding the 386 CPUs. And the relief wasn’t immediate. So Intel started advertising heavily. It devised a logo, “Intel Inside,” and supplied stickers bearing the logo to PC makers that used its chips. Intel’s advertising strongly suggested that if you wanted a great PC at a great price, all you had to do was look for a PC with an Intel microprocessor in it.

Intel didn’t want to follow the Harbor Freight business model, but they were happy to push AMD and Cyrix into that segment of the market.

IBM and Compaq hated it. Both IBM and Compaq sold expensive, high-quality PCs, and suddenly, their key supplier was telling millions of people that the differences between them and Packard Bell were insignificant, since they all used Intel CPUs. Intel’s message was that you could buy the cheapest PC you could find with an Intel CPU, and everything would be OK. Well, at least that’s what everyone remembered Intel saying, even if that’s not quite what Intel said.

And that wasn’t true.

The quality issue

The trouble was, while Michael Dell actually made a reasonable effort to sell PCs of acceptable quality, those local storefront clone shops didn’t necessarily. In St. Louis, we had computer stores with poor ratings from the Better Business Bureau because they sold computers that made Packard Bell quality look top-drawer. It was just a race to the bottom on price, and whoever found the cheapest suppliers won. Some even cheated by overclocking the CPU. Some stores told you they were doing it and pitched it as a great way to save you money. The less honest stores didn’t tell you, and still charged a premium for a 33 MHz CPU even though they were overclocking a slower chip.

The result was predictable. These PCs had high failure rates. That was a big inconvenience while the computer was still in warranty. But once the warranty expired, you were facing an expensive repair to get the computer running again. And if the replacement was just another cheap, low quality component, the computer wouldn’t last very long before it needed another repair.

In the early and mid 1990s, I had several people ask me to repair computers they bought from a store called Better Business World. They all had the cheapest components imaginable and the only real way to fix them was to gut the machine and replace virtually everything inside the case except the floppy drives. But every city had at least one store like that.

How Intel motherboards solved the quality issue

The problem for Intel was that its advertising was encouraging people to go into any computer store and ask for a computer with an Intel CPU in it, and everything would be OK. Depending on the computer store you went to, that might or might not be the case.

That’s why Intel started producing motherboards. This gave Intel control over one of the most critical components in the system. It was far more important to the overall quality and longevity of the system than who made the CPU, in spite of what Intel’s marketing said. Intel motherboards were more expensive, but not outrageously so. If you bought a system with an Intel motherboard in it, along with a reasonable-quality hard drive and power supply, it might not quite match IBM or Compaq quality, but it would be close, and probably a fair bit cheaper.

It also let Intel combat overclocking. Intel motherboards weren’t good for overclocking, and soon gained a reputation for that. This was intentional. It kept shops that used Intel boards from selling low-spec CPUs as the higher speed versions. Tom’s Hardware Guide hated this, but Intel wasn’t going after the hardware enthusiast community. They were chasing the mass market. Enthusiasts overclocking Intel CPUs so Quake would run faster usually weren’t buying AMD CPUs. And they weren’t buying cut-rate motherboards either, because the cut-rate boards weren’t as good for overclocking as boards from companies like Asus and Abit.

Some large companies, including Micron, Gateway, and even Dell, used Intel motherboards at least some of the time.

Why Intel left the motherboard market

Over the course of building motherboards for two decades, Intel built up a portfolio of supporting products. One of the keys to driving PC prices down was integrating the chips on the board into a smaller number of chips. Intel started buying up companies that made those kinds of chips. That increased Intel’s control, and it also increased its profit margins.

For about 20 years, Intel provided a safe choice. If you didn’t know anything else, you could ask for an Intel motherboard and an Intel CPU when you bought a computer, and you’d get reasonable quality and value. Sure, you could do better if you knew what to ask for. But you could also do a lot worse. And Intel boards provided those small computer stores with a good upsell. If you asked for an Intel CPU, they could say the CPU didn’t make a big difference if you didn’t also buy an Intel motherboard, and talk you into buying a more expensive computer there instead of a cheaper computer at another clone shop, or a Packard Bell from a big-box store.

Over time, the rest of the industry improved. Thanks to hardware enthusiast sites, people started paying attention to brands and quality. Companies like Asus and Gigabyte, which made quality boards, started putting their names on the product and the packaging so people would ask for them.

And while low-tier parts certainly still exist, even the cheapest boards are better than they used to be. I still recommend Asus or Gigabyte over, say, ECS, the successor company to Amptron and PC Chips, the makers of so many of the motherboards that drove Intel to intervene. But the no-name parts don’t fail as quickly as they used to.

By 2013, the original reasons Intel started producing motherboards didn’t really exist anymore. Intel could exit the market, keep selling chipsets as it always had, and treat companies like Asus and Gigabyte as partners rather than partner/competitors.

One other big thing changed. In 1998, if someone bought a cheap PC and had a bad experience with it, there was very real danger of that person going to Apple. In 1998, that Apple computer had a Motorola CPU in it. But in 2013, that Apple computer had an Intel CPU and chipset in it. And an expensive, high-margin Intel CPU and chipset at that.

Intel coming full circle

Intel is a tough competitor, and has certainly been accused of anticompetitive practices. But Intel’s use of new products is different from that of, say, Microsoft. It’s illegal to use your monopoly to try to get another monopoly. Intel has certainly released products to try to grow the market for its existing products. That was the case with Intel getting into CPUs so it could sell more memory chips, and into motherboards so it could sell more CPUs.

But once its motherboards weren’t really helping Intel sell CPUs anymore, it made sense to leave. Intel probably wasn’t losing money on motherboards, but they weren’t a huge profit center either. Like memory in the 1980s, it was a market Intel decided to cede to other companies.

If you found this post informative or helpful, please share it!