Emulation is the black art of running software designed for one computer or game system on another system that normally wouldn’t be compatible. Emulation has existed since the 1970s, but is much more practical today. Here’s how emulators work.
In essence, an emulator works like a translator, sitting between hardware and software and translating between the two so the software doesn’t realize it’s running on something else. Just like a human translator, there’s overhead involved, but modern computers are fast enough that emulation is far more practical today than it was in the 1980s, even though plenty of emulators existed back then.
Emulation’s long tradition
Emulation existed even at the dawn of the microcomputer era. Bill Gates and Paul Allen wanted to develop software for the Altair 8800 microcomputer, but they had a problem: they didn’t have an Altair, and there was a waiting list to buy one. Paul Allen worked around the problem by writing an Altair emulator that ran on a DEC minicomputer they did have access to. This allowed them to write, test, and debug their software even though they didn’t have an Altair to use. That’s why the software worked on the first try when they demonstrated it. It wasn’t dumb luck, and it wasn’t that Gates and Allen were gods. They had the tools that let them do the job.
Emulation enabled a great deal of software development, and it remained common among software developers well into the 1980s. Larger, more powerful computers often had better development tools than the 1 MHz microcomputers of the day. Even if the minicomputer couldn’t emulate the target machine at full speed, having better tools made the overall development process go much faster. And sometimes observing the software running at less than full speed made it easier to spot problems. It was great for development, even if it was impractical for ordinary use.
How commercial emulators worked in the 1980s
In the 1980s, commercial products existed that let consumers run one machine’s software on another. But emulation in those products really meant bolting one device onto another. The Atari 2600 compatibility modules for the ColecoVision and Intellivision game consoles contained a 6507 CPU, a 6532 I/O chip, and an Atari TIA clone chip of questionable legality. Each was a full Atari 2600 clone in a cartridge. Coleco took it a step further and created an all-out standalone 2600 clone.
The Mimic Systems Spartan, an Apple II+ emulator for the Commodore 64, took the same approach. So did the CP/M add-ons for the Apple II and C-64. If the host system had hardware these emulators could use, they used it; if not, they added the hardware they needed. The more hardware you had to add, the more the emulator cost. The Atari 2600 add-ons were moderately successful because they cost around $50, and the Atari 2600 had a library of around 400 cartridges, compared to 133-145 for competing consoles.
The Spartan flopped because it cost $299, could easily end up costing over $700 to implement fully, and by the time it hit the market, almost everything a C-64 owner would have wanted to run was available natively on the 64, or had a reasonable equivalent. It was too expensive. It was too late to market. And the return on investment was too low.
It wasn’t really emulation because you were pretty much bolting one computer onto another. But when the computer doing the emulation wasn’t much faster than the emulated machine, it was what they had to do.
How true emulation works
True emulation works by implementing the emulated machine’s hardware in software, so that software on the emulated machine can’t tell it’s not running on the real thing. You can think of it like translating the software on the fly so it can run on an alien computer. Or you can think of the emulator like a huge collection of device drivers and a hardware abstraction layer, sitting between the software and whatever hardware you’re using, so the software doesn’t know the difference.
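The heart of that approach is an interpreter loop: fetch an instruction from emulated memory, decode it, execute it, and repeat. Here’s a minimal sketch in Python, implementing just three 6502-family opcodes; a real emulator would implement every opcode, flag, and addressing mode, plus the rest of the machine’s chips, but the pattern is the same.

```python
# Minimal fetch-decode-execute sketch for a tiny 6502-style CPU.
# Only three opcodes are implemented; this shows the pattern, not
# a complete machine.

def run(memory, max_steps):
    """Execute up to max_steps instructions from memory, starting at address 0."""
    pc = 0  # program counter
    a = 0   # accumulator
    for _ in range(max_steps):
        opcode = memory[pc]            # fetch
        if opcode == 0xA9:             # LDA #imm: load immediate value into A
            a = memory[pc + 1]
            pc += 2
        elif opcode == 0x69:           # ADC #imm: add immediate (carry ignored here)
            a = (a + memory[pc + 1]) & 0xFF
            pc += 2
        elif opcode == 0x00:           # BRK: stop
            break
        else:
            raise ValueError(f"unimplemented opcode {opcode:#04x}")
    return a

# LDA #$05; ADC #$03; BRK
program = [0xA9, 0x05, 0x69, 0x03, 0x00]
print(run(program, 10))  # 8
```

Every emulated instruction costs many host instructions here, which is exactly the overhead the next paragraph describes.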
Doing this introduces an enormous amount of overhead. That’s why software-only emulation didn’t become common until the release of the Amiga computer in the mid 1980s, and even then the results were mixed. Previous computers just weren’t fast enough to deal with the overhead. Even the Amiga struggled, depending on how similar it was to the computer it was emulating. An Amiga did a much better job of emulating a 68K-based Mac than a Commodore 64 or IBM PC, because the Amiga and Mac used the same CPU. A stock Amiga running at 7 MHz ran very much like a 7 MHz Mac would. But it couldn’t emulate a C-64 or IBM PC at anywhere near full speed. It wasn’t until Amiga CPU speeds reached 25 MHz or so that it could emulate a C-64 or IBM PC at full speed in software alone.
When emulators became practical
At speeds like 25 MHz, emulating computers that ran at slower speeds became practical on other machines too. So we started seeing practical emulators of all kinds of different platforms on PC hardware by the mid 1990s. Activision’s Atari 2600 Action Pack series from the mid 1990s wasn’t a translation of Atari 2600 games to Windows — it was an Atari 2600 emulator bundled with ROM dumps of the actual vintage Activision cartridges from the early 1980s. It was around this time that we started seeing C-64 emulators on PC hardware that could emulate the 64 at full speed, too.
Modern emulators still work the same way. As hardware becomes more powerful, systems that were impractical to emulate in the past become practical. Emulation of very old hardware can also become more true to life, such as adding filters to the graphics output to make an LCD screen look more like an old CRT did.
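A simple version of one such filter is a scanline effect. Here’s a toy sketch, assuming the framebuffer is a list of rows of 8-bit grayscale pixels; real emulator filters run as GPU shaders and also model phosphor glow, curvature, and color masks, but the idea of scaling up the image and darkening alternate rows is the same.

```python
# Toy CRT-style scanline filter. Assumes `frame` is a list of rows of
# 8-bit grayscale pixel values (a simplification for illustration).

def scanline_filter(frame, scale=2, darken=0.5):
    """Repeat each row `scale` times vertically, darkening the last copy."""
    out = []
    for row in frame:
        for i in range(scale):
            if i == scale - 1:
                out.append([int(p * darken) for p in row])  # the dark scanline
            else:
                out.append(list(row))                       # unmodified copy
    return out

frame = [[255, 128], [64, 0]]
for row in scanline_filter(frame):
    print(row)
```

The 2x2 input becomes a 4x2 output with every second row at half brightness, which is roughly what the gaps between scanlines on a CRT looked like.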
Emulators on Mac OS
Emulators became very popular on Mac hardware in the late 1990s as well, for practical reasons. There were content-creation apps that only existed on the Mac, or at least had better versions on the Mac than on other platforms. But the opposite was true of productivity apps and games. A company called Connectix tried to step into that void. It released a product called Virtual PC that could run Windows on Mac hardware. While a 200 MHz Mac running Virtual PC couldn’t keep pace with a real PC at the same clock speed, the emulated PC was at least usable.
More controversially, Connectix released a product called Virtual Game Station that emulated a Sony Playstation on the Mac. Sony sued for copyright infringement and won an early injunction, though Connectix ultimately prevailed on appeal. Many people saw Sony’s fight as self-defeating at the time, and I hold that view. While Virtual Game Station did have the potential to cut into Playstation hardware sales, Sony usually sold the game console itself at a break-even cost, or even at a loss, because the games were profitable, especially first-party titles. The Connectix product increased the install base without Sony having to go to the expense of manufacturing and selling the hardware. That’s probably why Connectix took the gamble in the first place. Connectix wasn’t able to convince Sony, though, and Virtual Game Station was ultimately withdrawn from the market.
Open-source PS1 emulators are available today of course. Emulation itself is usually legal, but software piracy isn’t.
Macs emulating PCs today
Today, since Macs run on Intel CPUs just like PCs and can even run Windows directly, running Windows side by side with OS X is very practical using products like VMware. There’s a bit of emulation involved in virtualization, particularly with regard to the network adapter. But there’s no translation involved when it comes to the CPU, so it’s much faster than Virtual PC was in the 90s.
This is why Mac fans sometimes say their Mac runs Windows better than a real PC. Depending on the PC they’re comparing it to, they’re probably right. But it’s not really fair to compare a $2,500 Mac to a $500 PC. A $2,500 PC runs Windows better than a $500 PC too.
PCs emulating PCs
Sometimes PCs even emulate older PCs. This is sometimes necessary these days because 64-bit Windows removed support for running 16-bit applications. But you can still use something like DOSBox to run older 16-bit DOS applications. It can even run Windows 3.1 along with 16-bit Windows applications if you install Windows 3.1 inside it.
This works well because the translation involved on the CPU side is minimal, and the graphics and sound hardware in older DOS PCs wasn’t very complex. A modern CPU can easily do the necessary translation between the vintage hardware and the modern device drivers. Some PC software is sensitive to timing, and emulation helps here too, because an emulator can easily slow itself down to match specific grades of hardware. You may be able to find a title that objects to running on a slowed-down 386, but it will be rare.
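The slowing-down trick works by counting emulated cycles and sleeping whenever the emulator gets ahead of where the real machine would be. Here’s a rough sketch, where `step` stands in for a hypothetical function that executes one emulated instruction and returns the number of cycles it consumed; real emulators like DOSBox use more sophisticated cycle accounting, but the principle is this.

```python
# Sketch of throttling an emulator to a target clock speed. `step` is a
# stand-in (hypothetical) for "execute one emulated instruction and
# return its cycle cost".
import time

TARGET_HZ = 4_772_727  # e.g. the original IBM PC's 4.77 MHz clock

def run_throttled(step, duration_s=0.05):
    """Run the emulated CPU, sleeping whenever emulation outpaces real time."""
    start = time.perf_counter()
    cycles = 0
    while time.perf_counter() - start < duration_s:
        cycles += step()                        # run one emulated instruction
        ideal = cycles / TARGET_HZ              # real time this should have taken
        ahead = ideal - (time.perf_counter() - start)
        if ahead > 0:
            time.sleep(ahead)                   # too fast: wait for real time
    return cycles

# A fake CPU step that pretends every instruction costs 4 cycles.
total = run_throttled(lambda: 4)
```

Because the loop never lets emulated time run ahead of wall-clock time, `total` comes out close to what a real 4.77 MHz machine would execute in the same interval, no matter how fast the host is.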
It can be very difficult to build one PC that runs all DOS software equally well, from early 1980s apps designed for the IBM PC to mid-90s apps designed for a Pentium with a high-end graphics card. DOSBox is more convenient since it can scale across that range and can emulate various graphics and sound cards. Even if you have several vintage PCs with representative graphics and sound hardware of each period, it’s still possible to find the occasional title that doesn’t quite work right. In those cases, DOSBox can probably emulate the optimal combination of hardware and software to get the title working.