Many vintage computers have RS-232 ports, and some current equipment does too, especially network switches. But what is RS-232? And why did it only partly fall out of favor? And what about RS-232C? I’ll try to clear up the confusion.
What is an RS-232 port?
Long ago and far away, we called RS-232 “serial.” So when you see a 9-pin or 25-pin serial port on a vintage computer or other vintage equipment, that’s probably RS-232. For decades, RS-232 was the industry standard for serial connections. It’s far from the only serial standard. USB and Ethernet are also serial. Inside your computer, the SATA connection is also serial. Serial just means you’re sending one bit of information at a time, rather than sending eight or more bits at a time, in parallel.
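That one-bit-at-a-time idea is simple enough to sketch. Here's a minimal illustration in Python of how a single byte gets framed for an asynchronous serial line, assuming the common 8N1 format (8 data bits, no parity, 1 stop bit); the function name is mine, not from any standard:

```python
def frame_byte_8n1(byte):
    """Return the bits placed on the line for one byte, in the order sent."""
    bits = [0]                                   # start bit (space)
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, least significant first
    bits.append(1)                               # stop bit (mark)
    return bits

# The letter 'A' (0x41) becomes a 10-bit frame on the wire:
print(frame_byte_8n1(ord("A")))  # → [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

Ten bits on the wire for eight bits of data is the overhead serial pays for not needing eight parallel wires, which is why a "9600 baud" modem moved roughly 960 characters per second.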
The “RS” in RS-232 stood for “recommended standard.” It was formalized in 1962 and quickly became the de facto standard for serial connections. It eventually became a formal industry standard, under the name EIA-232. The name EIA-232 is probably more proper, but the earlier RS-232 name was so widely in use by the time it became a formal standard that almost nobody bothered to change. For all intents and purposes, there’s no real difference between RS-232 and EIA-232.
Like USB, RS-232 used several different connectors over the years. The original standard called for a DB-25 connector. In 1984, IBM adopted a more compact 9-pin connector, the DE-9 (commonly called DB-9). It omitted some rarely used and redundant pins to save space, since it was difficult to fit a DB-25 connector on the same slot bracket with much of anything else. Switching to a 9-pin connector gave IBM more options when it came to arranging connectors. Apple used an even smaller 5-pin DIN connector that omitted still more pins on its Apple IIc.
Cables to connect the various connectors were simple to wire and widely available. When I worked at Best Buy in the mid 1990s, we always had dozens of RS-232 cables and adapters in stock.
What is RS-232 used for?
In the 1980s and 1990s, we mostly used RS-232 to connect a modem for telecommunications, such as to dial into a BBS, a service like Compuserve or Prodigy, or connect to the Internet. And for quite a while, we also used RS-232 to connect a mouse, until PS/2 mouse ports became ubiquitous. Almost every system had PS/2 mouse ports by the late 1990s, but prior to the Pentium era, it was very common for generic PCs to use serial mouse connections and brand-name PCs to use PS/2.
Many systems used a Centronics parallel port to connect printers, but serial printers existed too. On the Apple II, serial printers were standard, and using the parallel port was an option. In 1977, printers were slow and there was a fairly significant price difference between serial and parallel, so using serial for printers made sense.
Today, we still use RS-232, usually for a null modem or console connection to a switch or embedded device to upgrade firmware or change configurations. It’s really cheap to implement, which is why we still see it in embedded devices. And it provides a bit of security by obscurity, since not everyone has the interface needed to connect to RS-232 today.
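For reference, a null modem connection simply crosses the transmit and receive lines (plus the handshaking lines, on a fully wired cable) so two devices can talk directly, with no modem in between. Here’s a sketch of the common 9-pin crossover as a Python mapping, using pin numbers from the usual DE-9 serial pinout; simpler three-wire cables cross only pins 2, 3, and ground:

```python
# Common DE-9 null modem crossover: each local pin is wired to the
# listed pin(s) on the far connector. Simpler cables cross only
# TX/RX (pins 3 and 2) and run ground (pin 5) straight through.
NULL_MODEM_DE9 = {
    3: [2],     # TX  -> far RX
    2: [3],     # RX  <- far TX
    7: [8],     # RTS -> far CTS
    8: [7],     # CTS <- far RTS
    4: [6, 1],  # DTR -> far DSR and DCD
    5: [5],     # signal ground, straight through
}

def crossed(wiring, local_pin, far_pin):
    """True if the local pin is wired to the given pin on the far end."""
    return far_pin in wiring.get(local_pin, [])

print(crossed(NULL_MODEM_DE9, 3, 2))  # transmit lands on the far receive pin
```

The asymmetric DTR wiring is what lets each side see the other as “ready” without a modem present to raise those signals.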
What is RS-232C?
Sometimes you see the designation RS-232C, rather than just plain RS-232. The difference is voltage. The original RS-232 standard ran at 25 volts. RS-232B lowered this to 12 volts, and RS-232C lowered it to 5 volts. This could ring alarm bells in some people’s minds, as plugging something expecting 5 volts into a connection that uses 25 volts could be dangerous. But RS-232C came into use in 1969. So unless you’re dealing with equipment built in the 1960s, you don’t really need to worry about RS-232 vs RS-232C.
What is an RS-232 interface?
The meaning of an RS-232 interface depends on who you ask. For some people, it’s the same as the port. For others, it refers to the whole card that implements RS-232. But I think the most common usage is for a device that converts another standard to RS-232. On a modern computer, that would be a USB-to-RS-232 adapter. But we saw them in the 1980s too. The Commodore 64 had a port that was very similar to RS-232, but omitted the line driver chips that convert TTL logic levels to RS-232 voltages. You needed an RS-232 interface to connect a standard Hayes or Hayes-compatible modem to Commodore computers other than the Amiga. The Atari 8-bit computers also lacked a standard RS-232 connector, using something similar in many ways to modern USB. You could also get an RS-232 interface to let you use a Hayes modem with an Atari.
Why some companies didn’t follow the RS-232 standard in the 1980s
Why didn’t Commodore and Atari follow the industry standard? Cost, mostly. Commodore implemented something very close, but left out two standard chips that would have added $10 to the cost of their machines. Virtually all of Commodore’s machines were designed to meet a price point, and that $10 would have meant sacrificing something else. It made more sense to sacrifice something not everyone was going to use. Going their own way made their modems a bit cheaper too, which was important because Commodore really wanted to release a sub-$100 modem.
Atari was in a similar boat. The SIO port it designed was more versatile. Making it RS-232 compatible would have added cost for a feature that much of its audience was unlikely to use in 1979.
Why is RS-232 recommended?
The “R” in RS-232 stood for “recommended.” But all that means is that it wasn’t a formal industry standard. It was a standard that a coalition of computer companies jointly developed and recommended. RS-232 is probably the most famous and recognizable of these recommended standards, but the formal, standardized versions are EIA-232 from the Electronic Industries Association, TIA-232 from the Telecommunications Industry Association, and the international standard, ITU V.24.
On a less formal basis, using RS-232 was a good idea when it was possible, because it meant your product worked with a variety of machines. A Hayes modem worked with either an Apple II or an IBM PC, for example, right out of the box. And even in the case of companies like Commodore that went their own way, following the standard meant a fairly inexpensive interface made your product work with their machines too.