Video card vs graphics card

Last Updated on November 23, 2018 by Dave Farquhar

Occasionally someone comes onto a hardware forum asking for help with the terms video card vs graphics card, and apologizes for possibly asking a n00b question. The thing is, it’s not a bad question at all. While there’s technically no difference between the two today, in retro computing there was a difference. And even in modern times, the two words certainly imply different things. So let’s dig in.

Video card vs graphics card, today

The phrase “graphics card” implies something big and beastly like this ATI-based card that takes up two slots due to all the hardware it takes to cool it. People who buy these kinds of cards make a distinction between video card vs graphics card.

We’ll talk about today first, since the video card vs graphics card difference today is like the difference between a violin and a fiddle. When playing classical music, it’s a violin. When playing country music, it’s a fiddle. Same instrument, different setting.

When the integrated graphics on your motherboard prove inadequate and you buy a plug-in card so YouTube videos play back without stuttering, web pages render faster, and other 2D computing tasks go more smoothly, and you pay $50 or less for it, all of that implies a video card.

When you buy an expensive card with a high-end GPU in it for 3D tasks, whether it’s business work like CAD or hobbyist use like gaming, and you pay $100 or more for the card, you’re more likely to call it a graphics card.

That $100 card will play YouTube videos just fine, and it will probably spend some time doing that kind of work too. And you can play games on a $35 video card if you turn the detail down enough. It’s just a matter of whether you’d be happy with the results.

These cards tend to go obsolete before they break, especially if you don’t overclock them.

The future of inexpensive video cards

A simple video card like this one has some 3D graphics capability, but people who buy these types of cards to watch YouTube videos generally call them video cards.

AMD’s integrated video started to improve around 2011, which was a boon for budget PCs: you could get decent video and a cheap CPU together and build a nice budget machine.

Intel realized it was ceding a market to AMD, so it followed suit with better integrated graphics in its Sandy Bridge series of CPUs in early 2011. This lessened the market for sub-$50 video cards, since both CPU makers now offered integrated graphics that were at least as good as, and sometimes better than, those cards.

But as long as there’s a gap in 2D performance between integrated video and entry-level graphics cards, there will be a market for something in between. At the very least, there’s a market to outfit the millions of pre-Sandy Bridge PCs in the used pipeline with something better to build cheap gaming PCs.

Video card vs graphics card in the 1990s

Things were different in the 90s. In the mid 1990s, 3dfx Interactive produced a chipset to accelerate 3D graphics for arcade machines. By 1996, memory prices had dropped to the point where they could release an affordable product for PCs based on the chipset. They produced a PCI expansion card with the Voodoo chipset on it, 4 megabytes of RAM, and the other necessary support hardware. Like a modern graphics card, these cards accelerated 3D processing by offloading those tasks from the CPU. This emerging market was a popular topic in the early days of Tom’s Hardware Guide.

This 3dfx-based Diamond Monster 3D graphics card from 1997 required a separate 2D video card for non-3D work.

The difference between a 3dfx Voodoo graphics card and a modern graphics card was that the Voodoo card only handled 3D processing, through a proprietary API called Glide. It plugged in next to a regular video card, and you connected the two cards with a pass-through cable. Your computer continued to use the regular video card for most tasks. It switched to the 3dfx Voodoo card when you loaded a game that could use, or be made to use, the Glide API for 3D rendering.
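
For a flavor of what that looked like from the programmer’s side, here is a minimal sketch of bringing up a Voodoo board through Glide, assuming the 3dfx Glide 2.x SDK and its glide.h header; the exact constants and function signatures varied between Glide releases, so treat it as illustrative rather than code from any shipping game.

/* Minimal Glide 2.x-style sketch (assumes the 3dfx Glide SDK's glide.h). */
#include <glide.h>

int main(void)
{
    grGlideInit();      /* initialize the Glide library */
    grSstSelect(0);     /* select the first (usually only) Voodoo board */

    /* Open a full-screen 640x480 context on the Voodoo. Its output replaces
       the 2D card's signal via the pass-through cable. */
    if (!grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                      GR_COLORFORMAT_ABGR, GR_ORIGIN_LOWER_LEFT, 2, 1))
        return 1;

    grBufferClear(0x000000, 0, 0xFFFF); /* clear color, alpha, and depth */
    grBufferSwap(1);                    /* present the frame */

    grGlideShutdown();                  /* hand the display back to the 2D card */
    return 0;
}

A Glide-aware game took roughly this path when it detected a Voodoo board; anything else kept rendering through the regular video card.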

The beginning of a new genre

The killer app for 3dfx products was the popular mid-90s game Quake. Other games quickly followed suit, of course, to take advantage of the growing installed base.

In 1998, 3dfx followed up with the Voodoo 2, which was faster and let you pair two cards via SLI (Scan-Line Interleave). There were some competing combined 3D/2D cards, notably ones based on technology from Nvidia and ATI. But for a while, enthusiasts preferred getting a Voodoo 2 and having their choice of 2D card. SLI turned out not to be cost-effective, but the concept returned in later years.

3D games existed before Quake, of course. The genre started several years earlier with Wolfenstein 3D, which would run on 286 and 386 machines. Its successor, Doom, was a huge success. But the availability of faster CPUs and dedicated 3D hardware allowed greater levels of detail, higher resolutions, and more frames per second, giving smoother video.

Integrated cards and the end of an era

3dfx’s competitors went the combined 3D/2D route, and in time started to catch up with 3dfx in terms of performance. By 1999, with the release of the Voodoo 3, 3dfx was going the combined 2D/3D route too, so the era of separate video and graphics cards was essentially over. 3dfx disappeared soon afterward, partly because of a poorly executed merger with STB, a maker of graphics cards. That’s why you don’t see new 3dfx hardware today; what’s left is obsolete but collectible, and it turns up on eBay with some regularity.

But for a few years during the Pentium II era, you could make a real distinction between video cards and graphics cards, and that distinction helped define the late 1990s.
