VGA and MCGA were two new video standards introduced in April 1987 with the IBM PS/2 line. They were meant to make the new IBM computers more competitive with the Amiga, Atari ST, and color Macintosh, all of which had the ability to display higher-resolution graphics with more colors than IBM's first generation of PCs. MCGA vs VGA confused people in 1987, and it can confuse people today.
Of the two standards, VGA proved far more enduring. The GPU in your desktop or laptop PC is still fully backward compatible with VGA, and whatever graphics mode you're using right now is an extension of VGA. The 15-pin VGA plug remains common 35 years after the PS/2's introduction, making VGA the most enduring element of the PS/2 line.
What was MCGA?
You don’t hear nearly as much about MCGA as you do about other video standards. MCGA was an acronym for Multi-Color Graphics Array. Contrary to popular belief, it does not stand for “Monochrome Color Graphics Adapter.”
You can think of MCGA as a cut-down VGA, the literal opposite of Super VGA, and that is frequently where the explanation ends. It used the same 15-pin connector and normally used the same monitors. From an end-user perspective, the most notable difference was that it couldn’t do high-resolution, high-color displays. It could do 320×200 graphics with 256 colors from a palette of 262,144 (mode 0x13, or mode 13h), but not 640×480 with 16 colors (mode 0x12, or mode 12h). The only 640×480 mode it supported was mode 0x11 (11h), with two colors. There’s more detail about the video modes and their nuances at Ardent Tool of Capitalism.
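Those mode numbers fall out of simple arithmetic, which also hints at why 320×200 with 256 colors was so popular with programmers: the whole screen fits in a single 64 KB segment at one byte per pixel. A quick sketch (the 6-bits-per-channel DAC is standard VGA/MCGA behavior):

```python
# Mode 13h (the 256-color mode VGA and MCGA share): 320x200, 1 byte per pixel.
WIDTH, HEIGHT = 320, 200
framebuffer_bytes = WIDTH * HEIGHT
print(framebuffer_bytes)        # 64000 bytes -- fits in one 64 KB segment

# The VGA/MCGA DAC uses 6 bits per red, green, and blue channel,
# which is where the 262,144-color palette comes from.
palette_size = (2 ** 6) ** 3
print(palette_size)             # 262144

# Mode 12h (VGA only): 640x480 with 16 colors = 4 bits per pixel.
mode12_bytes = 640 * 480 * 4 // 8
print(mode12_bytes)             # 153600 bytes -- too big for one segment
```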
MCGA wasn’t present on as many of the new machines. Only the entry-level Model 30 had MCGA initially. Later, the all-in-one Model 25 also received MCGA, as did the rare and obscure 7496 Executive Workstation. But the Model 30-286 had VGA, not MCGA. So IBM only used MCGA in three machines.
VGA caught on quickly. It took less than six months for third-party VGA cards to hit the market, with the promise that you didn’t need to run out and buy a PS/2. You could buy a VGA card and monitor to upgrade an existing PC and carry on as if it were a PS/2. The PS/2 wasn’t the immediate flop that people remember it as today, but the ability to upgrade existing machines certainly limited its success.
What about MCGA? Almost nobody bothered. Epson built MCGA into its Equity 1E and PSE-30 PCs, but those are the only known third-party implementations of MCGA. No one ever produced a standalone ISA card for MCGA. There just wasn’t enough difference in cost for it to be worthwhile.
MCGA ended up much like XGA in that regard. Third parties did their own thing and left IBM isolated in its own ecosystem.
The other two differences between MCGA and VGA
There were two notable technical differences besides the limitations in resolution and color depth. The first was limited backward compatibility. VGA was backward compatible with EGA; MCGA was not. MCGA lacked modes 0Dh and 0Eh that EGA used. Both VGA and MCGA were backward compatible with CGA, but if you wanted to run EGA software, MCGA couldn’t do it.
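The compatibility gap is easiest to see laid out as a table. Here's a sketch covering just the BIOS mode numbers discussed in this article (modes 0Dh and 0Eh are EGA's 320×200 and 640×200 16-color modes; a full mode list would be longer):

```python
# Which adapters support which INT 10h video modes (subset discussed here).
modes = {
    0x0D: ("320x200, 16 colors (EGA)", {"EGA", "VGA"}),
    0x0E: ("640x200, 16 colors (EGA)", {"EGA", "VGA"}),
    0x11: ("640x480, 2 colors",        {"VGA", "MCGA"}),
    0x12: ("640x480, 16 colors",       {"VGA"}),
    0x13: ("320x200, 256 colors",      {"VGA", "MCGA"}),
}

for number, (description, adapters) in sorted(modes.items()):
    print(f"mode {number:02X}h: {description:26s} {sorted(adapters)}")

# MCGA's gap in one line: it shares no modes with EGA,
# so EGA software simply can't run on it.
ega_modes  = {m for m, (_, a) in modes.items() if "EGA" in a}
mcga_modes = {m for m, (_, a) in modes.items() if "MCGA" in a}
print(ega_modes & mcga_modes)   # set() -- no overlap at all
```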
The second difference is more of a curiosity today, but one that retro computing enthusiasts might find interesting, and maybe even useful.
Both VGA and MCGA normally run at a horizontal scan rate of 31 kHz. This makes VGA incompatible with some other computers of the time like the Amiga, Apple IIGS, and Atari ST, even though they also used analog video. Amiga computers ran at 15 kHz, unless you outfitted them with a scan doubler.
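Those two frequencies follow from the timing math. Assuming the standard VGA 640×480 timing of 525 total scan lines (480 visible plus blanking) at roughly 60 Hz, and the NTSC-derived 262.5 lines per field on the 15 kHz machines, a quick check:

```python
# Horizontal scan rate = total scan lines per frame x refresh rate.
# Standard VGA 640x480 timing: 525 total lines (480 visible + blanking)
# at an NTSC-style 59.94 Hz refresh.
vga_hfreq = 525 * 59.94
print(round(vga_hfreq / 1000, 2))   # ~31.47 kHz

# NTSC-derived machines (Amiga, Atari ST, Apple IIGS): 262.5 lines per field.
ntsc_hfreq = 262.5 * 59.94
print(round(ntsc_hfreq / 1000, 2))  # ~15.73 kHz
```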
But if you don’t ground any of the mode-sense pins on the VGA connector, MCGA runs at 15 kHz. This means if you wire the red, green, blue, horizontal sync, vertical sync, and ground pins straight through from the PC’s 15-pin connector to their counterparts on an Amiga monitor and leave out the other 9 pins completely, you can use an Amiga monitor, such as an Amiga 1080 or Commodore 1084, with a Model 30. Specifically, you need to make sure pins 4, 11, 12, and 15 aren’t grounded to anything and remain completely unconnected.
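As a sketch, here is that cable expressed against the standard VGA DE-15 pinout. The pin numbers are the standard VGA assignments, not something unique to this trick; my choice of pin 5 as the single ground wire is an assumption, since any of VGA's ground pins (5, 6, 7, 8, or 10) would serve:

```python
# The six wires to pass straight through, keyed by standard VGA DE-15 pin.
connected = {
    1: "red",
    2: "green",
    3: "blue",
    5: "ground",            # assumption: any VGA ground pin (5-8, 10) works
    13: "horizontal sync",
    14: "vertical sync",
}

# Everything else on the 15-pin connector stays unconnected.
unconnected = set(range(1, 16)) - connected.keys()
print(sorted(unconnected))      # the other 9 pins

# The critical part: the mode-sense pins must float so MCGA
# falls back to 15 kHz output.
assert {4, 11, 12, 15} <= unconnected
```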
But you can also use any VGA monitor, and VGA monitors are much easier to come by. So I don’t know how many people actually use this trick.
Why did IBM make MCGA?
It’s a bit curious why IBM would make a video standard that was only incrementally less expensive, but had a severe compatibility limitation. There was a booming market for software that used EGA graphics at the time, and the MCGA-equipped Model 30 couldn’t run any of it.
IBM’s mindset at the time gives a clue. IBM wanted the Model 30 to wipe out the market for XT-class clones. And the Model 30 was compelling. It had an 8 MHz 8086, which was fast enough to outperform an 8088 at the usual 7.16 and 9.54 MHz turbo speeds. Yes, an 8086 could outperform an 8088 running at a slightly higher clock speed, because its full 16-bit data bus made it more efficient. And MCGA was a compelling upgrade over the usual graphics standards that XT clones shipped with.
I guess it didn’t occur to them that aftermarket VGA cards would make the point moot.
But IBM wanted a reason for people to buy the more expensive computers in the line. A faster CPU, they reasoned, might not be enough. So they built a deliberately mediocre video standard to use in entry-level machines. IBM had a history of this. In 1986, IBM released its PC/XT-286, which had a nice trick, zero wait state memory, to improve performance, but they paired it with a 6 MHz processor. They wouldn’t give you that speed trick and an 8 MHz processor.
I don’t know that anyone ever said IBM was an acronym for Intentionally Bland Mediocrity at the time, but that sums up many of IBM’s decisions in the 1980s, including MCGA. And it shows why they were more vulnerable than they appeared at the time.