Intel had a hit on its hands with the 8080 CPU and its successor, the 8085. But then Zilog came along and stole its thunder with the compatible but superior Z-80. Intel needed to follow up with something a lot better, and the 8086 was what it came up with. The advantages of the 8086 over the 8085 were numerous, and that’s why everyone knows Intel, while few people outside of retro computing enthusiasts and embedded systems engineers have ever heard of Zilog.
Here are the major advantages of the 8086 over the 8085.
More memory
The 8085, with its 16-bit address bus, couldn’t access more than 2^16 bytes, or 64K, of RAM. In the 1970s, 64K was a lot of memory, but everyone knew that wouldn’t last forever. Today, 64 gigabytes seems like a lot of RAM, but I think we all know there will come a time when it won’t.
Intel, anticipating a day when memory would be cheaper and more plentiful, designed the 8086 to access up to 1 megabyte of RAM. It did this with segmented addressing: the chip multiplies a 16-bit segment value by 16 and adds a 16-bit offset to form a 20-bit physical address. In its most common use case, the IBM PC and compatibles, you only got to use 640K of that, but in 1981, that still seemed like a lot of RAM.
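The segment-times-16-plus-offset scheme is simple enough to sketch in a few lines. Here it is in Python; the function name `physical_address` is my own, not Intel terminology:

```python
def physical_address(segment: int, offset: int) -> int:
    """Compute an 8086 physical address from a segment:offset pair.

    The 8086 shifts the 16-bit segment left by 4 bits (multiplies by 16)
    and adds the 16-bit offset, yielding a 20-bit physical address.
    The mask models the chip's 20 address lines: a sum past 0xFFFFF
    wraps around to the bottom of memory.
    """
    return ((segment << 4) + offset) & 0xFFFFF

# The CGA text-mode video buffer familiar to PC programmers, at B800:0000
print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000
```

One quirk of the scheme: different segment:offset pairs can name the same physical byte (B000:8000 also works out to 0xB8000), which kept x86 programmers on their toes for years.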
Faster clock speeds
The fastest 8085 variants ran at a maximum clock rate of 6 MHz. The fastest 8086 ran at 10 MHz, a 67 percent increase. That sounds modest by today’s standards, but in the 1970s and 1980s, even speed increases of 20-40 percent commanded a large premium.
Hardware math
The 8086 included instructions for multiplication and division (MUL, IMUL, DIV, and IDIV), which the 8085 lacked. On the 8085, anything beyond addition and subtraction meant writing your own routines to do the math. On the 8086, the chip had the built-in capability, and doing the math in hardware is also faster.
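To see what the 8086 saved programmers from, here is the kind of shift-and-add multiply routine 8-bit programmers had to carry around, sketched in Python for readability rather than 8085 assembly:

```python
def shift_add_multiply(a: int, b: int) -> int:
    """Multiply two unsigned integers using only shifts and adds,
    mirroring the software routines 8-bit CPUs like the 8085 required.
    """
    result = 0
    while b:
        if b & 1:      # low bit of the multiplier set: add the multiplicand
            result += a
        a <<= 1        # shift the multiplicand left each round
        b >>= 1        # consume one bit of the multiplier
    return result

print(shift_add_multiply(13, 11))  # 143
```

On the 8086, that entire loop collapses into a single MUL instruction.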
Backward compatibility, sort of
The 8086 wasn’t binary compatible with the 8085, but its design made it rather easy to adapt existing code to the new chip. The 8086’s registers and instruction set were laid out so that 8080/8085 assembly source could be translated mechanically, and Intel even supplied a converter, CONV86, to do it. Existing code could move to the new chip with little modification beyond reassembly. That ease of migration from a popular 8-bit family gave the 8086 a big advantage over the other 16-bit CPUs on the horizon in 1978, and that’s why the 8086 is the 16-bit CPU you’ve heard of today.