How did computers change during the 1990s?

Last Updated on May 18, 2023 by Dave Farquhar

How did computers change during the 1990s? Sounds like a test question, but I’ll bite. I’m willing to say it was the most pivotal decade for computing, rivaled only by the 1980s. Computers went through multiple important changes during the 1990s, so let’s talk about them.

The Internet

How did computers change during the 1990s? This was a big and expensive PC in 1990, and its boxy appearance is pretty representative of what an early 90s PC looked like. By 1999, it was junk. Image credit: Procolotor/Wikimedia Commons

Any discussion of how computers changed during the 1990s has to start with the Internet. It might as well end with the Internet too. In 1990, the Internet was something you got to use while you were in college, or if you happened to work in the right industry. To have it at home, you had to be more than a little weird, and rich. By 1999, it was unusual if you didn’t have the Internet at home. Today it’s the main reason some people own computers. We weren’t quite there yet in 1999, but we were on our way.

It was also in the 1990s that the World Wide Web was invented. Tim Berners-Lee created the Web around 1990 as a way to link documents together, and prior to the spring of 1993, when the graphical Mosaic browser arrived, using the Internet meant text-based tools. Putting a graphical face on the Internet was the key to making it commercially viable.

The Internet of 1999 still had a lot of growing up to do. Social networking was still several years away. But many of the big players were active. eBay was practically a national craze, as people emptied their closets and basements to eager bidders vying for things they hadn’t seen in years or even decades. You could buy books and music on Amazon. Video streaming was still years off, but you could rent movies from Netflix and mail them back instead of dealing with video rental stores and their late fees. And while not all restaurants had online ordering, you could order pizza.

So in a span of 10 years in the 1990s, the Internet went from being something only a select few people had heard of to being one of the main reasons people wanted to own a computer.

Telecommunications speed

Telecommunications got much faster during the 90s, which was something else the Internet needed in order to be viable. In 1990, most people who had modems had modems that transmitted 1,200 or 2,400 bits per second. By 1999, my Internet connection transmitted at 256 kilobits per second. When I went online in 1999, my connection was roughly 100 to 200 times faster than it had been in 1990, depending on which of those old modems you compare it to.

By today’s standards, that 256-kilobit speed is anemic, but it was fast enough to stream audio. Commercializing streaming audio came later, and connections still had to speed up a bit for streaming video to be practical. But the jump from 56-kilobit dialup to always-on broadband made these things possible. We ended the 1990s with what could best be described as minimum-viable-product broadband, but it was the decade that computers started getting broadband.
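
For a sense of scale, here’s the arithmetic behind those claims as a quick Python sketch. The 128-kilobit figure is a typical MP3 stream bitrate of the era, an assumption on my part rather than a number from above:

    # Back-of-the-envelope bandwidth math for 1990 vs. 1999 connections.
    modem_1990_bps = 2_400        # a typical consumer modem in 1990
    dialup_1999_bps = 56_000      # the late-90s dialup ceiling
    broadband_1999_bps = 256_000  # the 256-kilobit connection described above

    print(broadband_1999_bps / modem_1990_bps)   # ~106x faster than 1990
    print(broadband_1999_bps / dialup_1999_bps)  # ~4.6x faster than 56k dialup

    # Why 256 kbps was enough to stream audio: a typical MP3 stream of the
    # era ran at 128 kbps, which fits in half the available bandwidth.
    mp3_stream_bps = 128_000
    print(mp3_stream_bps <= broadband_1999_bps)  # True, with headroom to spare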

How computer operating systems changed in the 1990s

The first 32-bit computers and operating systems appeared in the 1980s, but 32-bit computing didn’t go mainstream until 1995. That brought numerous benefits, including increased speed and access to up to 4 gigabytes of memory. The increase in power made it possible to greatly improve reliability and, later, security. It also made full pre-emptive multitasking practical, where the operating system decides when each program gets the CPU. Prior to 32-bit operating systems, multitasking was rare or nonexistent. Typically you could only run one program at a time, so you had to exit one program and load another. Some operating systems let you pause one program to switch to another, or they offered so-called “cooperative” multitasking, where programs would tell the computer when they weren’t busy so another program could have the computer’s attention. In some cases this meant a program in the background hogged the computer’s attention while you tried to work in a different one.
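
Here’s a minimal sketch of the cooperative model in Python, using generators to stand in for programs (the task names and the tiny round-robin scheduler are illustrative inventions, not any real 1990s API). Each task runs until it voluntarily yields; a task that never yielded would freeze everything else, which is exactly the failure mode described above.

    # A minimal cooperative multitasking sketch. Each "program" is a
    # generator that runs until it voluntarily yields the CPU.
    def word_processor():
        for i in range(3):
            print(f"word processor: keystroke {i}")
            yield  # politely give up the machine

    def print_spooler():
        for i in range(3):
            print(f"print spooler: page {i}")
            yield  # a task that skipped this would hog the computer

    def run(tasks):
        # Round-robin scheduler: run each task to its next yield,
        # and drop a task once it finishes.
        while tasks:
            task = tasks.pop(0)
            try:
                next(task)
                tasks.append(task)
            except StopIteration:
                pass

    run([word_processor(), print_spooler()])

Pre-emptive multitasking flips this around: the operating system interrupts tasks on a timer, so one badly behaved program can no longer monopolize the machine.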

Today we have 64-bit operating systems, so 32-bit operating systems seem quaint. But 32-bit operating systems going mainstream is what really tied all of the other ways computers changed in the 1990s together.

Standardization

We can also point to the 1990s as the decade when computers standardized much more. At the start of the decade, the Apple-Microsoft duopoly was in play, but there were other players. Several 8-bit computer lines from the 1980s lingered into the early years of the 1990s as starter computers. Commodore and Atari had rival 68000-based computers, the Amiga and the ST, vying with the Macintosh. They didn’t survive past mid-decade, but it wasn’t completely clear at the beginning of the 1990s that they were doomed. And across the Atlantic, a company called Acorn had its own 32-bit computer. While that computer standard faded away too, the CPU that powered it, ARM, lives on in your smartphone or tablet today.

In retrospect, it was pretty clear even before 1990 that the computer industry was going to consolidate on a much smaller number of standards. The 1990s were the decade when we ended up with the Microsoft-Apple computer duopoly we have today.

How computer graphics changed in the 1990s

Hardware and software improved dramatically in the 1990s, making 3D graphics much more practical. While 3D graphics existed in the 1980s, it was generally wireframe graphics that required you to use a lot of imagination to fill in the gaps. Then in 1992, id Software released Wolfenstein 3D, a 3D remake of Castle Wolfenstein, a popular 2D maze shoot-’em-up from 1981. The timing was good. In 1992 there were a lot of aging PCs still hanging around that wouldn’t run Wolfenstein 3D, but Microsoft Windows 3.1 came out that year, and if your PC could run Windows 3.1, it was powerful enough to play Wolfenstein 3D.

A year later, id Software followed up with the smash hit Doom, which was even more ambitious graphically. And as the decade wore on, ever more ambitious 3D first-person shooter games came out, fueling demand for faster CPUs and graphics cards to make them more realistic.

Today we take 3D gaming from a first-person perspective for granted. But in the 1990s it was still very new, and it advanced very quickly through the decade. The first-person shooter as we know it today first appeared on computers during the 1990s. The graphics from the early 3D first-person shooters look quaint today, but less quaint than the wireframe 3D we got in the 1980s.

How computers got faster in the 1990s

Mainstream CPU speed accelerated dramatically throughout the 1990s, driven by newer and more demanding software. Although their best days were behind them, in 1990, there was still a market for 8-bit computers running at 1 MHz with 64K of memory, such as the Commodore 64. A top-of-the-line PC in 1990 sported a 486 processor running at 33 MHz and had between 2 and 4 megabytes of RAM, expandable to 16 megabytes.

By 1999, a top-of-the-line PC sported a Pentium III CPU running at 600 MHz and 128 megabytes of RAM, expandable to 256 or 384 megabytes. This was a dramatic improvement in nine years. Today, a top-of-the-line PC from nine years ago feels a bit sluggish at times, but it’s still very useful. In 1999, that nine-year-old 486 that had cost thousands of dollars when it was new was essentially useless. A mainstream PC in 1999 typically sported an Intel or compatible CPU running at 300-400 MHz and 32-64 megabytes of RAM.

Computer speeds accelerated very rapidly throughout the 1990s, and that meant much shorter upgrade cycles than we have today.

How music on computers changed in the 1990s

In 1990, digital music meant music on CD instead of analog tape or vinyl. And sound on a computer was a mixed bag. Some computers had pretty good music capability, but many had a beeper that only produced rudimentary sound. Computers gained the ability to play back high-quality music during the 1990s, and with it came the idea of using a computer to consume digital music.

As a result, by 1999, the phrase digital music meant MP3 files, often of highly questionable legality. Apple hadn’t legitimized the MP3 player yet, and when Diamond Multimedia released its Rio MP3 player in 1998, the music industry saw it as a tool for pirates and sued to keep it off the market, although Diamond ultimately won in court. Legal digital music distribution was still a few years away, but by 1999, digital music meant something different than it had in 1990. Like the Internet itself, though, commercializing it took a number of years.

And the MP3 file format didn’t turn up overnight. It was invented in 1991 and grew in usage over the course of the decade. It wasn’t very practical in 1991, but as CPU speeds, Internet speeds, and disk capacity increased, it gained more acceptance. The first time I saw the file format was probably sometime in 1995, but at that point it was more of a curiosity.
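
Some rough numbers show why MP3 had to wait for faster hardware and connections. Here’s a back-of-the-envelope Python sketch, using standard CD-audio parameters and a common MP3 bitrate, with download times assuming an ideal line:

    # Why MP3 mattered: a 3-minute song, uncompressed vs. compressed.
    seconds = 3 * 60
    cd_audio_bps = 44_100 * 16 * 2  # 44.1 kHz, 16-bit, stereo: 1,411,200 bps
    mp3_bps = 128_000               # a common MP3 bitrate

    cd_bytes = cd_audio_bps * seconds // 8   # about 31.8 MB raw
    mp3_bytes = mp3_bps * seconds // 8       # about 2.9 MB as MP3
    print(cd_bytes, mp3_bytes)

    # Download time for that one MP3, in minutes, at an ideal line speed:
    for name, bps in [("2,400 bps modem", 2_400),
                      ("56k dialup", 56_000),
                      ("256k broadband", 256_000)]:
        print(name, round(mp3_bytes * 8 / bps / 60, 1), "minutes")

One song took over two and a half hours on a 1990 modem, about seven minutes on late-90s dialup, and about 90 seconds on early broadband, which tracks with when MP3 went from curiosity to craze.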

We tend to think of digital music as more of a 2000s phenomenon than a 1990s phenomenon, but the roots of digital music on computers absolutely date to the 1990s.

How computer style changed in the 1990s

For most of the 1990s, computers were what we now call boring beige boxes. The shade varied but they were invariably a very neutral off-white color, and they tended to be big boxes with lots of drive bays and slots for expansion. Bold design was the exclusive territory of expensive workstations that ran Unix, from companies like Steve Jobs’s NeXT or Silicon Graphics.

In 1998, Apple, back under Steve Jobs, released the iMac, with a streamlined design and translucent plastic. It was bold, and you either loved it or thought it looked like a cheap toy. But it pushed the limits of what people thought a computer could look like.

It took a few years for black to overtake beige as the standard color for computers. But the way computers looked definitely started changing fast in the late 1990s.

Changes in portable computers in the 1990s

Laptop computers existed well before 1990, notably from Radio Shack. But laptops grew up a lot in the 1990s. By the end of the decade, laptops were still expensive, but they had color displays and CD-ROM drives and speeds that approached those of desktop PCs. Portable computing didn’t become mainstream for a few more years, but it certainly became a lot more practical in the 1990s. In the 1980s, you couldn’t dream of a laptop replacing a desktop, no matter how much money you were willing to spend. By the end of the 1990s, if you had the money, you could do it.

Y2K

Finally, we have to end any talk of computers in the 1990s with the Y2K problem. The 1990s ended with a crisis. The mainstream view today is that it was overblown and we wasted a lot of money fixing it. My view, as someone who actually did a lot of Y2K work, is that the people who thought the world was going to end didn’t understand the problem. So yes, it was overblown, but it could have been bad, and because lots of people put in lots of overtime, it wasn’t.
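
The core of the problem is easy to demonstrate. Here’s a minimal Python sketch of the two-digit-year bug and the “windowing” fix a lot of remediation work relied on; the pivot year of 50 is a typical choice, not a universal standard:

    # The Y2K bug in miniature: years stored as two digits.
    def age_buggy(birth_yy, current_yy):
        return current_yy - birth_yy

    print(age_buggy(65, 99))  # 34: correct in 1999
    print(age_buggy(65, 0))   # -65: nonsense on January 1, 2000

    # One common fix, "windowing": interpret 00-49 as 2000-2049
    # and 50-99 as 1950-1999, using a pivot year.
    def expand(yy, pivot=50):
        return 1900 + yy if yy >= pivot else 2000 + yy

    def age_fixed(birth_yy, current_yy):
        return expand(current_yy) - expand(birth_yy)

    print(age_fixed(65, 0))   # 35: correct in 2000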

One side effect of the Y2K problem was that a lot of older systems got decommissioned. Some of the systems truly couldn’t be made Y2K compliant. Some wouldn’t have been worth the effort. Many were broken in other ways and Y2K gave us an excuse to clean house. So there was one last wave of change late in 1999, thanks to Y2K. If 1980s computers are rare today, the Y2K problem bears some of the blame for it.

How did computers change during the 1990s: In conclusion

So how did computers change during the 1990s? Quite a lot, as it turns out. A fully comprehensive list would probably fill volumes, but the answer you’re looking for is probably in here somewhere.

If you found this post informative or helpful, please share it!

One thought on “How did computers change during the 1990s?”

  • June 27, 2018 at 8:17 am

    The 1940s were the decade when computing changed the most. Leaving aside Charles Babbage, it’s when computing went from nothing to something. That’s an infinite change, which is more than any other decade will ever be able to claim!

    The biggest things in each decade:
    40s: the beginning
    50s: commercialization of computers
    60s: computers become mainstream in large business, minicomputers
    70s: birth of personal and small business computing
    80s: growth of personal computing and online services, graphical user interfaces
    90s: see the article!
    00s: birth of smartphones, explosion of e-commerce
    10s: ubiquitous computing: mobile, wearables, cars, smart speakers, cloud services, internet of things

    Now that we have pervasive computing, I’ll venture a prediction and say that the 20s will be about INvasive computing: implanted computers and possibly direct brain interface.
