How did computers change during the 1990s?

How did computers change during the 1990s? Sounds like a test question, but I’ll bite. I’m willing to say it was the most pivotal decade for computing, rivaled only by the 1980s. Computers changed in several important ways during the 1990s, so let’s talk about them.

The Internet

A big and expensive PC from 1990. Its boxy appearance is pretty representative of what an early-90s PC looked like. By 1999, it was junk. Image credit: Procolotor/Wikimedia Commons

Any discussion of how computers changed during the 1990s has to start with the Internet. It might as well end with the Internet too. In 1990, the Internet was something you got to use while you were in college, or if you happened to work in the right industry. To have it at home, you had to be more than a little weird, and rich. By 1999, it was unusual if you didn’t have the Internet at home. Today it’s the main reason some people own computers. We weren’t quite there yet in 1999, but we were on our way.

It was also in the 1990s that Tim Berners-Lee invented the World Wide Web, which put a graphical, point-and-click face on the Internet. Prior to the spring of 1993, when the NCSA Mosaic browser appeared, the Internet was almost entirely text-based for most users. That graphical interface was the key to making the Internet commercially viable.

The Internet of 1999 still had a lot of growing up to do. Social networking was still several years away. But many of the big players were active. eBay was practically a national craze, as people emptied their closets and basements to eager bidders vying for things they hadn’t seen in years or even decades. You could buy books and music on Amazon. Video streaming was still years off, but you could rent movies from Netflix and mail them back instead of dealing with video rental stores and their late fees. And while not all restaurants had online ordering, you could order pizza.

Telecommunications speed

Telecommunications got much faster during the 90s, which was something else the Internet needed in order to be viable. In 1990, most people who had modems had ones that transmitted at 1,200 or 2,400 bits per second. By 1999, my Internet connection ran at 256 kilobits per second. When I went online in 1999, my connection was somewhere between 100 and 200 times faster than it had been in 1990.
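To put those numbers in perspective, here’s a quick back-of-the-envelope calculation. This is just a sketch in Python; the 3.5-megabyte song size is my own illustrative assumption, not a figure from this article.

```python
# Back-of-the-envelope download times for a single song at 1990s connection speeds.
# The 3.5 MB file size is an illustrative assumption, roughly one MP3-encoded song.

FILE_SIZE_BITS = 3.5 * 1024 * 1024 * 8

speeds = {
    "2,400 bps modem (1990)": 2_400,
    "56 kbps dialup (late 1990s)": 56_000,
    "256 kbps broadband (1999)": 256_000,
}

for label, bits_per_second in speeds.items():
    minutes = FILE_SIZE_BITS / bits_per_second / 60
    print(f"{label}: about {minutes:.1f} minutes")
```

At 2,400 bits per second, that one song would take more than three hours to download. At 256 kilobits per second, it takes about two minutes, which is why streaming audio suddenly became plausible.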

By today’s standards, that 256-kilobit speed is anemic, but it was fast enough to stream audio. Commercializing streaming audio came later, and connections still had to speed up a bit for streaming video to be practical. But the jump from 56-kilobit dialup to always-on broadband made these things possible.

32-bit computers and operating systems

The first 32-bit computers and operating systems appeared in the 1980s, but 32-bit computing didn’t go mainstream until Windows 95 arrived in 1995. That brought numerous benefits, including increased speed and access to up to 4 gigabytes of memory. The increase in power made it possible to greatly improve reliability and, later, security. It also made full pre-emptive multitasking practical.

Before 32-bit operating systems, multitasking on personal computers was rare, or didn’t exist at all. Typically you could only run one program at a time, so you had to exit one program and load another. Some operating systems let you pause one program to switch to another, or they offered so-called “cooperative” multitasking, where each program had to tell the computer when it wasn’t busy so another program could have a turn. In practice, that sometimes meant a program in the background hogged the computer’s attention while you tried to work in a different one.
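To make that difference concrete, here’s a minimal sketch in Python of a cooperative scheduler. It isn’t code from any real 1990s operating system; it just illustrates why cooperative multitasking was fragile and why pre-emptive multitasking, where the operating system interrupts programs on a timer, was such an improvement.

```python
# A minimal sketch of cooperative multitasking: every task must volunteer control
# back with yield. If one task never yields, the others starve -- the
# "one program hogs the machine" problem described above.

def well_behaved(name, steps):
    for i in range(steps):
        print(f"{name}: doing step {i}")
        yield                      # politely hands control back to the scheduler

def hog(name):
    while True:
        pass                       # busy-loops forever and never reaches yield
    yield                          # unreachable, but makes this a generator

def cooperative_scheduler(tasks):
    # Round-robin: run each task until it yields, then move on to the next one.
    while tasks:
        task = tasks.pop(0)
        try:
            next(task)
            tasks.append(task)
        except StopIteration:
            pass                   # task finished, drop it

# Runs fine: both tasks take turns.
cooperative_scheduler([well_behaved("editor", 3), well_behaved("printer", 3)])

# Uncommenting the next line would hang forever, because the hog never yields:
# cooperative_scheduler([well_behaved("editor", 3), hog("runaway")])
```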

3D graphics

Hardware and software improved dramatically in the 1990s, making 3D graphics much more practical. While 3D graphics existed in the 1980s, they were generally wireframe images that required you to use a lot of imagination to fill in the gaps. Then in 1992, id Software released Wolfenstein 3D, a remake of Castle Wolfenstein, a popular 2D maze shoot-’em-up from 1981. The timing was good. In 1992 there were still a lot of aging PCs hanging around that wouldn’t run Wolfenstein 3D, but Microsoft Windows 3.1 came out that year, and if your PC could run Windows 3.1, it was powerful enough to play Wolfenstein 3D. Even some PCs that were marginal for Windows could still run Wolfenstein 3D.

A year later, id Software followed up with the smash hit Doom, which was even more ambitious graphically. And as the decade wore on, ever more ambitious 3D first-person shooter games came out, fueling demand for faster CPUs and graphics cards to make them more realistic.

Today we take 3D gaming from a first-person perspective for granted. But in the 1990s, it was still very new and it advanced very quickly through the decade.

Ever faster CPUs

Mainstream CPU speed accelerated dramatically throughout the 1990s, driven by newer and more demanding software. Although their best days were behind them, in 1990 there was still a market for 8-bit computers like the Commodore 64, running at 1 MHz with 64K of memory. A top-of-the-line PC in 1990 sported a 486 processor running at 33 MHz and had between 2 and 4 megabytes of RAM, expandable to 16 megabytes.

By 1999, a top-of-the-line PC sported a Pentium III CPU running at 600 MHz and 128 megabytes of RAM, expandable to 256 or 384 megabytes. That works out to roughly 18 times the clock speed and 32 times the memory of that 1990 machine in just nine years. Today, a top-of-the-line PC from nine years ago feels a bit sluggish at times, but it’s still very useful. In 1999, that nine-year-old 486 that had cost thousands of dollars when it was new was essentially useless. A mainstream PC in 1999 typically sported an Intel or compatible CPU running at 300-400 MHz and 32-64 megabytes of RAM.

Digital music

In 1990, digital music meant music on CD instead of analog tape or vinyl. By 1999, it meant MP3 files, often of highly questionable legality. Apple hadn’t legitimized the MP3 player yet, and when Diamond Multimedia released its Rio MP3 player in 1998, the music industry saw it as a tool for pirates and nearly sued the company out of business. Legal digital music distribution was still a few years away, but the meaning of digital music had already shifted. Like the Internet itself, it just took a few more years to commercialize.

Style

For most of the 1990s, computers were what we now call boring beige boxes. The exact shade varied, but they were invariably a neutral off-white color, and they tended to be big boxes with lots of drive bays and slots for expansion. Bold design was the exclusive territory of expensive Unix workstations from companies like Steve Jobs’ NeXT or Silicon Graphics.

In 1998, Steve Jobs’ Apple released the iMac, with its streamlined design and translucent plastic. It was bold, and you either loved it or thought it looked like a cheap toy. But it pushed the limits of what people thought a computer could look like.

It took a few years for black to overtake beige as the standard color for computers.

Portability

Laptop computers existed well before 1990, notably from Radio Shack. But laptops grew up a lot in the 1990s. By the end of the decade, laptops were expensive, but they had color displays and CD-ROMs and speeds that approached those of desktop PCs. Portable computing didn’t become mainstream for a few more years, but it certainly became a lot more practical in the 1990s. In the 1980s, you couldn’t dream of a laptop computer replacing a desktop, no matter how much money you were willing to spend. By the end of the 1990s, if you had money, you could do it.

Y2K

Finally, we have to end any talk of the 1990s with the Y2K problem. The 1990s ended with a crisis. The mainstream view today is that it was overblown and we wasted a lot of money fixing it. My view, as someone who actually did a lot of Y2K work, was that the people who thought the world was going to end didn’t understand the problem. So yes, it was overblown, but it could have been bad, and because lots of people put in lots of overtime, it wasn’t.
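To show what all that remediation work was actually about, here’s the bug in miniature. This is a Python sketch for illustration, not code from any real affected system, and the pivot value of 50 is just an example.

```python
# The classic Y2K bug: storing only the last two digits of the year.

def years_of_service(hired_yy, current_yy):
    # Pre-Y2K logic: assume both two-digit years are in the 1900s.
    return current_yy - hired_yy

# Works fine in 1999: an employee hired in 1985 has 14 years of service.
print(years_of_service(85, 99))   # 14

# Breaks in 2000: "00" minus "85" says they have -85 years of service.
print(years_of_service(85, 0))    # -85

# One common fix besides widening the field to four digits was "windowing":
def expand_year(yy, pivot=50):
    # Two-digit years below the pivot are treated as 20xx, the rest as 19xx.
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(99) - expand_year(85))  # 14
print(expand_year(0) - expand_year(85))   # 15
```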

One side effect of the Y2K problem was that a lot of older systems got decommissioned. Some of the systems truly couldn’t be made Y2K compliant. Some wouldn’t have been worth the effort. Many were broken in other ways and Y2K gave us an excuse to clean house. So there was one last wave of change late in 1999, thanks to Y2K. If 1980s computers are rare today, the Y2K problem bears some of the blame for it.

How did computers change during the 1990s: In conclusion

So how did computers change during the 1990s? Quite a lot, as it turns out. A fully comprehensive list would probably fill volumes, but this should give you enough of an overview to answer a test question.

One thought on “How did computers change during the 1990s?”

June 27, 2018 at 8:17 am

    The 1940s were the decade when computing changed the most. Leaving aside Charles Babbage, it’s when computing went from nothing to something. That’s an infinite change, which is more than any other decade will ever be able to claim!

    The biggest things in each decade:
    40s: the beginning
    50s: commercialization of computers
    60s: computers become mainstream in large business, minicomputers
    70s: birth of personal and small business computing
    80s: growth of personal computing and online services, graphic user interfaces
    90s: see the article!
    00s: birth of smartphones, explosion of e-commerce
    10s: ubiquitous computing: mobile, wearables, cars, smart speakers, cloud services, internet of things

    Now that we have pervasive computing, I’ll venture a prediction and say that the 20s will be about INvasive computing: implanted computers and possibly direct brain interface.
