Super VGA vs XGA

Last Updated on June 2, 2022 by Dave Farquhar

The distinction between Super VGA and XGA matters more when you’re talking projectors than when you’re talking monitors. In the 90s, we didn’t talk about XGA much. XGA was a semi-proprietary IBM thing, available for high-end PS/2s, while SVGA was the open standard.

On projectors, XGA means a 1024×768 display, while Super VGA or SVGA means an 800×600 display. Vintage CRT monitors didn’t make that distinction since CRTs don’t operate at fixed resolutions. LCD monitors do care about the difference, since they have a fixed native resolution, but they negotiate the resolution with your computer automatically, so they hide the distinction.

Super VGA vs XGA today

If you’re using a video projector, XGA is better than SVGA. But if I’d said that in the 90s, people would have given me a strange look.

In the 90s, we didn’t need names for specific resolutions. A 14-inch CRT could display a 1280×1024 resolution; it just wouldn’t be very easy to read at that size. It was more usable on a 17-inch display, but your 14-inch monitor could display it.

Projectors operate at specific fixed resolutions. So we backed into names for those resolutions. The 1024×768 resolution got the XGA name, and 800×600 got the SVGA name, since XGA didn’t initially define that resolution.

Today, XGA is better than SVGA because it’s higher resolution. In the 90s, the opposite was true, because SVGA was an open standard and more versatile. They both used the same connector.

If an old-timer ever argues with you about SVGA being better than XGA, there’s a historical reason for it. If you care about the history, either because you’re a retro computing enthusiast or someone is arguing with you, here’s the history.

Super VGA vs XGA in the 90s

CRT monitors didn’t care much about SVGA vs XGA. If the monitor had a 15-pin connector, it would work with either. You could plug it into an old VGA system too, because VGA used the same connector.

In the 90s, not a lot of people talked about XGA. A quarter century later, I still recall the hardest question from my job interview selling computers at Best Buy, because it was a monitor question. But it was only a hard question because I made it hard.

“What’s the highest resolution monitor available on the market today?” my interviewer, Ron, asked.

I mulled over the question for a good 30-45 seconds, trying to decide whether it was a trick question. Finally I said, “I think you’re looking for SVGA.” Then I flat out asked Ron whether he considered IBM XGA and NEC MultiSync monitors SVGA, or something else.

Thus began my tenure as the Best Buy know-it-all, because Ron gave me a look that told me he had no idea, and no one had ever asked him that question before.

In the 90s, anything that was higher than 640×480 resolution was Super VGA. I called Super VGA an open standard, but that’s kind of a misnomer. It was open but it wasn’t standard. Super VGA was complete and total anarchy. Anyone could extend it however they wanted as long as they provided a driver for it. SVGA was better, because it was more versatile, and cheaper. Sometimes a lot cheaper.

XGA as the “legitimate” successor to VGA

In the 90s, pretty much anything better than VGA was considered SVGA. XGA meant IBM’s standard for PS/2s (pictured above), if you even cared.

In some ways, the legitimate successor to VGA was XGA, not Super VGA. You see, when it came to IBM video standards, you had MDA, CGA, EGA, VGA, and then XGA. Those cards and their clones were all compatible with each other. You didn’t have to worry about who made your EGA card; from a programming perspective, every EGA card was 100% compatible with every other. A few companies tried extending CGA, but only the PCjr/Tandy 1000 extension caught on.

Yes, I’m aware of MCGA, but it didn’t fit neatly into that hierarchy.

After VGA, IBM went on to produce XGA and a monitor to go with it called the 8514. But most of the industry went a different direction, extending VGA in various ways, and they called it Super VGA. An organization called the Video Electronics Standards Association (VESA) defined standard video modes and a standard way for software to enable them on any card. But if you knew the hardware, everyone’s Super VGA card had capabilities beyond the VESA standards, or at the very least, a faster way to enable them. That’s why Windows looks better and runs faster when you load a driver for your specific card, rather than picking the generic SVGA driver, which is just a lowest common denominator.
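To give a flavor of what the VESA route looked like from a programmer’s chair, here’s a minimal sketch of setting a Super VGA mode through the VESA BIOS Extensions (INT 10h, function 4F02h). It assumes a DOS-era compiler with a Borland/Turbo C-style dos.h; mode 103h is the VESA-defined number for 800×600 in 256 colors.

/* Minimal sketch: ask the VESA BIOS to switch into a standard
   Super VGA mode instead of poking card-specific registers.
   Assumes a DOS compiler with a Borland/Turbo C-style <dos.h>. */
#include <dos.h>
#include <stdio.h>

int main(void)
{
    union REGS regs;

    regs.x.ax = 0x4F02;            /* VBE function 02h: set video mode     */
    regs.x.bx = 0x0103;            /* VESA mode 103h: 800x600, 256 colors  */
    int86(0x10, &regs, &regs);     /* call the video BIOS (INT 10h)        */

    if (regs.x.ax != 0x004F) {     /* 004Fh means supported and successful */
        printf("VESA mode set failed.\n");
        return 1;
    }

    /* The mode is set, but drawing into it still varied by card (bank
       switching and so on), which is why a card-specific driver beat
       the generic SVGA driver on speed. */
    return 0;
}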

XGA, by contrast, was a well-defined standard like VGA, EGA, CGA and MDA. From a software perspective, XGA was XGA, whether you bought the card from IBM or ATI, one of the few makers of XGA clone cards.

XGA vs SVGA from a practical perspective

In the 90s, as long as you loaded the right driver, you didn’t have to worry much about SVGA vs XGA. At my first real IT job, which was an IBM shop, we had a bunch of IBM 8514 monitors. Not only was the 8514 an XGA monitor, it was the XGA monitor. It worked fine plugged into SVGA systems.

To us, XGA was just a subset of SVGA. And by the mid 90s, IBM realized that trying to corner the market with proprietary standards was a mistake and started using SVGA chips like everyone else.

If you found this post informative or helpful, please share it!