How the IBM PC became the de facto standard for desktop computers

I saw a question on a vintage computing forum this week: How did the IBM PC become the de facto standard for PCs, and the only desktop computer architecture from the 1980s to survive until today?

It’s a very good question, and I think there were several reasons for it. I also think that without all of those reasons, the IBM PC wouldn’t necessarily have won. In some regards, of course, it was a hollow victory. IBM has been out of the PC business for a decade now. Its partners Intel and Microsoft, however, reaped the benefits time and again.

Read more

Asus gets into the sub-$150 tablet fray

Now Asus is jumping into the sub-$150 tablet range too, but with a device that’s much more subdued than what Polaroid and Archos are offering.

It appears to me that Asus is trying to remain mid-tier, hoping that name recognition and reliability advantages (whether perceived or real) keep its tablet in the game.
Their $149 Memo Pad has a 7-inch 1024×600 display and a single-core VIA WM8950 CPU, running at 1 GHz. It will be running Android 4.1 Jelly Bean, and has the precious microSD card slot, which accepts up to a 32 GB card. Read more

RIP, Jack Tramiel, founder of Commodore

Commodore founder Jack Tramiel, the orchestrator of the first line of affordable personal computers, died this weekend at the age of 83.

I don’t know exactly what to think about it, and I’m probably not alone, though it didn’t take long for tributes to pour in. Read more

Google drops a bombshell, buys Motorola

I was shocked to read today that Google went out and plunked down $12.5 billion for Motorola. I’m sure that other Android phone makers aren’t exactly happy about it–it means Google is going to be competing with them, unless Google just bought Motorola for patents–but I don’t really see how Google had much choice.

Google risks alienating its partners, but more on that in a minute.
Read more

Apple’s first CEO speaks

Business Insider has an interview with Apple’s first CEO, Michael Scott. (Not the guy from the TV sitcom.) It’s interesting reading from a historical standpoint.
Read more

Dinosaur hunting

Today I slipped over to Laclede Computer Trading Company for the first time in many years. I was in search of an ISA parallel card. They’re not easy to find anymore, mostly because they aren’t particularly useful to most people these days, but I figured if anyone would have one, it would be them.

No dice. But man, what memories.

Laclede has been around forever–at least 20 years, and probably a whole lot longer than that. I remember taking spare 286 and 386 stuff there in the early 1990s and they actually gave me money for it. Math coprocessors, Packard Bell power supplies, other oddball stuff like that. I’d salvage stuff from upgrade projects and get a little extra money that way.

Most of the stuff in the store now is Pentium 4-level. Recent enough to be useful, old enough to be really cheap. There wasn’t a single ISA board in sight. It was a little sad, but honestly, Clinton was probably still president the last time someone came in looking for something like that. No point in keeping that kind of stuff around.

I lingered around a while though. I saw lots of old SGI and Sun workstations. I remember in 1995, when I was taking a C programming class in college, we used to have to get on waiting lists to use one of the limited number of SGI workstations. They compiled code instantly, and unless you did something incredibly stupid, you weren’t going to crash them. They were a lot nicer than the NeXT workstations we usually ended up having to use when we got tired of waiting in line.

Those systems cost more than a decent car in those days. Each. And now, depending on configuration, you can get one for $30, $60, or $80. Incredible. They’re a lot more useful than the Pentium 75 I had back then, but PCs eventually overtook those weird and wonderful proprietary Unix architectures.

I left, wistfully, but as I got in the car, I spied something. I wasn’t sure that distinctive shape sitting on a distant shelf was what I thought it was, but what else could it be? So I went back in. The clerk gave me a knowing look.

Yep, it was what I thought it was. There, on a tall shelf, on top. 1977 called. They want their computer back.

There it was. The Commodore PET 2001. The early one, with the built-in cassette recorder and the calculator-style chiclet keypad that was even worse than the IBM PCjr.

I’m pretty sure it wasn’t for sale. I didn’t ask, because I couldn’t afford it, and don’t have room for it. I stood there for a minute, studying it, then looked around some more. They also had a TI-99/4A, a contender from the early 1980s that couldn’t compete with Commodore, but some of its technology ended up in the Colecovision and, if I’m not mistaken, the IBM PCjr and Tandy 1000. It wasn’t a bad system, but it was horrendously overpriced. It cost more than a Commodore 64 but its capabilities were somewhere between a C-64 and a cheap VIC-20.

They also had a Commodore PC-10-III, which was one of Commodore’s PC/XT clones. And, next to the PC-10, there was a Radio Shack TRS-80 Model 1, the other forgotten personal computer from 1977.

Neat stuff. I don’t really have the interest to collect these old machines myself, but I’ll stop to admire someone else’s every chance I get.

Why I generally buy AMD

I was talking to a new coworker today, and of course the topic of our first PCs came up. His was Cyrix-based. I didn’t mention my first PC (it seems I’m about four years older–mine was an Am486SX2/66).

With only a couple of exceptions, I’ve always bought non-Intel PCs. Most of the Intel PCs I have bought have been used. One boss once went so far as to call me anti-corporate.

I’m not so much anti-corporate as I am pro-competition. I was a second-generation AMD fanboy, not a first. When the Am386DX/40 hit town, I was aware it was the best value in the industry, giving better performance than an Intel 486SX/25 for the price of an Intel 386DX/33, but I didn’t really care, because I was still an Amiga guy at that point.

But that’s the reason I’m an AMD guy today. One of the reasons the computer market is so stagnant today is that it’s dominated by Microsoft. There’s nothing exciting going on there, and there hasn’t been since the mid 1990s, when Microsoft had to compete with OS/2.

OS/2 never captured a huge amount of market share, but it promised to be a better DOS than DOS and a better Windows than Windows, and to a large extent it delivered. Say what you want about OS/2, but I could load Tony La Russa Baseball 2 in a DOS window under OS/2 and it would run faster than it ran under DOS, even though the game didn’t have the machine’s full attention. OS/2 2.1 (and later 3.0) had Microsoft running scared, because it ran all of the software that was available in the early 1990s, and it ran it quickly, in a fully pre-emptive multitasking environment. Microsoft responded with Windows 95 and Windows NT 4.0 because it had to–Windows 3.1 just couldn’t compete with OS/2’s stability, and booting into DOS with a custom boot disk for every game wasn’t something the public was going to put up with forever.

And what’s happened since then? Windows 98 was basically a service pack, improving the stability of Windows 95 but not offering anything revolutionary. Windows 2000 was a lot better than NT 4.0 but not the kind of jump that Windows 95 was over 3.1. Windows XP did a lot to improve backward compatibility with old DOS and Windows 9x games, and while it was a big leap from Windows 98 or ME, it wasn’t a tremendous improvement over Windows 2000. I haven’t heard anyone say anything good about Vista. At least the Windows 95 box was pretty, but Vista doesn’t really even have that going for it.

Microsoft’s primary competition today is illegal copies of its own operating system, so its main concern with Vista is keeping people from making copies of it. And it shows.

Apple is trying to compete, but its market share is around 10 percent. We don’t exactly have a duopoly.

When I got interested in computers in the 1980s, there was all sorts of interesting stuff going on. IBM and DOS were things you used at work to do accounting. At home we used all these weird and wonderful 8-bit computers that were technically obsolete, but engineers kept figuring out how to squeeze more capability out of them.

Revisionist historians talk about Apple dominating the 8-bit era, but that wasn’t true. At its peak, Commodore sold as many C-64s in a single year as Apple sold Apple IIs in that line’s entire lifetime. Although Commodore was the king of sales, Atari arguably had the best 8-bit computer (the 800/XE/XL family). Tandy had its Color Computer line, and while it couldn’t match the graphics and sound capability that Commodore and Atari had, it had a far more powerful CPU.

Coleco’s Adam is little more than the butt of a joke today, but on paper it should have done well. Coleco took several chips that Texas Instruments had used in its failed TI-99/4A, paired them up with a more conventional CPU (the popular Zilog Z-80), and made a competitive computer with it. Its biggest problem was that it was late to market and plagued with reliability problems at first. Kind of like Windows Vista.

And that’s the beauty of competition. In the 1980s, if you delivered a product like the Coleco Adam, you went out of business. But if your name is Microsoft and you have 85% of the marketplace, you can deliver something like the Coleco Adam and keep on chugging.

The really exciting stuff in the 1980s wasn’t in the 8-bit arena though. The Motorola 68000-based computers were where the action was. The most famous of these, of course, was the first-generation Macintosh. But the Atari ST and Commodore Amiga used the 68000 too, and unlike the first Macs, they paired the powerful CPU with color and powerful sound. The three companies threw bricks at each other a lot, but they kept each other honest. Apple ended up having to add color and sound and expansion slots to its Macs in order to compete. Commodore designed a low-cost Amiga to compete with Atari, and a higher-priced model with lots of drive bays and expansion slots to compete with Apple.

These three companies, ironically, built the kind of machine Bill Gates tried to get IBM to build in 1981. Gates wanted IBM to use a Motorola 68000, running Xenix, Microsoft’s version of Unix, as the operating system.

The result was Microsoft trying to play catch-up. Windows was in development before these machines hit the market, but Microsoft knew the Mac was coming long before it happened. Microsoft had a prototype Mac and was one of the first Mac developers. In typical Microsoft fashion, Microsoft was talking about Windows in 1983, but didn’t deliver anything until 1985, and what they delivered wasn’t useful for very much. It wasn’t until Windows 3.0 came out in 1990 that it hit prime time. By then the code was stable enough that you could use it for a few hours at a time, and PC CPUs were powerful enough that Windows could keep up with an Amiga or ST or Mac without embarrassing itself.

And that was the beginning of the end. It was one thing for Commodore and Atari to compete with 286 clones that could barely run Windows. But within a year or two, they were competing with Tandy 386s that sold for $1,200 at every Radio Shack in the country. Whether you lived in New York City or Buffalo, Missouri, you could walk into Radio Shack, see a computer running Windows, and buy it on the spot. Neither Commodore nor Atari had a dealer network anything like that. And if you lived in a big enough city, you could walk into one of the “superstores” that were sweeping the nation at the time–Best Buy, Circuit City, Silo–and buy a Packard Bell for even less.

By 1993, Commodore and Atari were non-factors in the marketplace. It was down to Apple and the PC clones running Windows.

So what does any of this have to do with AMD and Intel?

AMD and Intel keep each other honest. When Intel released the Itanium, AMD countered with its AMD64 architecture. While the Itanium gives better 64-bit performance, AMD64 does a much better job of running the 32-bit applications we all run today. Itanium takes a better long-term approach, but it’s designed for a future that will never come on its own, because people still want to play their old copy of The Sims and have it run well on their new computer. AMD64’s success forced Intel to clone it.

The CPU isn’t as important today because Intel and AMD are making CPUs that have more power than today’s software knows what to do with. But that’s not Intel’s fault, and it’s not AMD’s fault. Microsoft can’t think of a good use for all the power or a way to harness it, and the industry doesn’t have the scrappy underdog companies like Commodore or Atari anymore to figure out a use for them and drive the industry.

But if one company had a total monopoly on CPUs, I’m afraid of what I’d see. Intel would probably become more like Microsoft, delivering products that run slower and cost more than each previous generation. It’s unnatural, but it’s the norm for a monopoly.

I’ve heard myself saying several times over the last three or four years that I don’t like computers anymore. But that’s not exactly true. Either I don’t like modern computers, or what HP and Dell sell today aren’t computers.

If either Intel or AMD were to succeed in squeezing the other company out of business, the modern computer would become even more underachieving and uninteresting than it is now.

Time for a core dump

I’ve been keeping a low profile lately. That’s for a lot of reasons. I’ve been doing mostly routine sysadmin work, which is mind-numbingly boring to write about, and possibly just a little bit less mind-numbingly boring to read about. While a numb mind might not necessarily be a bad thing, there are other reasons not to write about it.

During my college career, I felt like I had less of a private life than most of my classmates because of my weekly newspaper column. I wrote some pretty intensely personal stuff in there, and frankly, it seemed like a lot of the people I hung out with learned more about me from those columns than they did from hanging out with me.

Plus, with my picture being attached, I’d get recognized when I went places. I remember many a Friday night, going to Rally’s for a hamburger and having people roll down their windows at stoplights and talk to me. That was pretty cool. But it also made me self-conscious. College towns have some seedy places, you know, and I worried sometimes about whether I’d be seen in the vicinity of some of those places and what people might think.

Looking back now, I should have wondered what they would be doing in the vicinity of those places and why it was OK for them to be nearby and not me. But that’s the difference between how I think now and how I thought when I was 20.

Plus, I know now a lot fewer people read that newspaper than its circulation and advertising departments wanted anyone to think. So I could have had a lot more fun in college and no one would have known.

I’m kidding, of course. And I’m going off on tangent after tangent here.

In the fall of 1999, I willingly gave up having a private life. The upside to that is that writing about things helps me to understand them a lot better. And sometimes I get stunningly brilliant advice. The downside? Well, not everyone knows how to handle being involved in a relationship with a writer. Things are going to come up in writing that you wish wouldn’t have. I know now that’s something you have to talk about, fairly early. Writing about past girlfriends didn’t in and of itself cost me those relationships but I can think of one case where it certainly didn’t help anything. The advice I got might have been able to save that relationship; now it’s going to improve some as-yet-to-be-determined relationship.

There’s another downside too. When you meet a girl and then she punches your name into a search engine, if you’re a guy like me who has four years’ worth of introspective revelations out on the Web, it kind of puts you at a disadvantage in the relationship. She knows a whole lot more about you than you do about her. It kind of throws off the getting-to-know-you process. I’d really rather not say how many times that’s happened in the past year. Maybe those relationships/prospective relationships were doomed anyway. I don’t have any way of knowing. One of them really hurt a lot and I really don’t want to go through it again.

So I’ve been trying to figure out for the past few weeks what to do about all this. Closing up shop isn’t an option. Writing strictly about the newest Linux trick I’ve discovered and nothing else isn’t an option. Writing blather about the same things everyone else is blathering about is a waste of time and worthless. Yes, I’ve been saying since March that much, if not all, of the SCO Unix code duplicated in Linux is probably BSD code that both of them ripped off at different points in time. And now it’s pretty much been proven that I was right. So what? How many hundreds of other people speculated the same thing? How could some of us be more right than others?

I’m going to write what I want, but I’m having a hard time deciding what I want to write. I know I have to learn how to hold something back. Dave Farquhar needs a private life again.

For a while, this may just turn into a log of Wikipedia entries I made that day. Yes, I’m back over there again, toiling in obscurity this time. For a while I was specializing in entries about 1980s home computing. For some reason when I get to thinking about that stuff I remember a lot, and I still have a pile of old books and magazines so I can check my facts. Plus a lot of those old texts are showing up online now. So now the Wikipedia has entries on things like the Coleco Adam and the Texas Instruments TI-99/4A. Hey, I find it interesting to go back and look at why these products were failures, OK? TI should have owned the market. It didn’t. Coleco should have owned the market, and they didn’t. Atari really should have owned the market and they crashed almost as hard as Worldcom. So how did a Canadian typewriter company end up owning the home computer market? And why is it that probably four people reading this know who on earth I’m talking about now, in 2003? Call me weird, but I think that’s interesting.

And baseball, well, Darrell Porter and Dick Howser didn’t have entries. They were good men who died way too young, long before they’d given everything they had to offer to this world. Roger Maris didn’t have an entry. There was more to Roger Maris than his 61 home runs.

The entries are chronicled here, if you’re interested in what I’ve been writing lately while I’ve been ignoring this place.

Much ado about nothing and other stuff

Much ado about nothing. The most recent report I read indicates that AOL/Time Warner and Red Hat are talking, but not about an acquisition. Sanity has entered the building…

Good thing User Friendly got a chance to get its two cents’ worth in. I got a couple bucks’ worth of laughter from it.

Much ado about something. On Sunday, Gentoo Linux developer Daniel Robbins announced that an obscure AMD Athlon bug slipped past Linux kernel developers, resulting in serious problems with Athlon- and Duron-based systems with AGP cards. This confirms some suspicions I’ve heard–one of the Linux mailing lists I subscribe to occasionally has rumblings about obscure and difficult-to-track-down Athlon problems.

The result was that Gentoo’s site was slashdotted into oblivion for a while, but hopefully it also resulted in some extra exposure for the distribution. Gentoo is another source-based distro. Lately I’ve been resigned to just using Debian to build my Linux boxes, but I’m still awfully fond of the idea of compiling your own stuff. As CPUs get faster and faster, I expect that to become more commonplace.

But I digress. The bug involves the CPU’s paging function. Older x86 CPUs used 4K pages. Starting with the Pentium, CPUs began allowing 4MB pages. But a bug in the Athlon’s implementation of this extended paging causes memory corruption when used in conjunction with an AGP card.

Alan Cox is working on a workaround. I’m a bit surprised a patch isn’t already out there.
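If you’re curious what “4K pages vs. 4MB pages” actually means, here’s a rough sketch in Python of how 32-bit x86 paging slices up a linear address. The field widths are the standard x86 ones, nothing Athlon-specific, and the example address is arbitrary:

```python
# How 32-bit x86 paging carves up a linear address.
# 4 KB pages: 10-bit directory index | 10-bit table index | 12-bit offset
# 4 MB pages (PSE, "extended paging"): 10-bit directory index | 22-bit offset
def split_4k(addr):
    return ((addr >> 22) & 0x3FF,  # page directory index
            (addr >> 12) & 0x3FF,  # page table index
            addr & 0xFFF)          # offset within the 4 KB page

def split_4m(addr):
    return ((addr >> 22) & 0x3FF,  # page directory index
            addr & 0x3FFFFF)       # offset within the 4 MB page

addr = 0x00402ABC
print(split_4k(addr))  # (1, 2, 2748)
print(split_4m(addr))  # (1, 10940)
```

With 4MB pages, the second-level table lookup disappears entirely, which is why kernels like to use them for big mappings–and why a bug in that path can corrupt memory in hard-to-track-down ways.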

CPU bugs are discovered all the time, but it’s fairly rare for them to be serious. If you ever run across a Pentium-60 or Pentium-66 system, boot up Linux on it sometime and run the command dmesg. You’ll find workarounds for at least two serious bugs. A TI engineer named Robert Collins gained a fair bit of notoriety in the last decade by researching, collecting, and investigating CPU bugs. Part of it was probably due to his irreverent attitude towards Intel. (As you can see from this Wayback Machine entry.) Sadly, I can’t find the story on the site anymore, since he was bought out by Dr. Dobb’s.

Catching up. I haven’t been making my rounds lately. The reason why is fairly obvious. I used my day off yesterday to have lunch with someone from my small group, then when I got home I read the e-mail I absolutely had to read, responded to those that absolutely had to get responses, answered a couple of voice messages, wrote and sent out a couple of other messages, looked up, and it was 5 p.m.

“Alright God,” I muttered. “I just gave the day to Your people. Time to go spend some time with You.” So I whipped out my handy-dandy Today’s Light Bible and read about Moses. Seemed appropriate. The inadequacy and jumping the gun and making excuses, that is. The Biblical “superheroes” were human just like us, and the book doesn’t gloss over that. Today’s Light is designed to divide the Bible into pieces so you can read the whole thing in two years. I can’t decide if I want to get through it in a year or in six months. A few years ago I read it in its entirety in four months, but that pace is a bit much. If you’re willing to spend as much time reading the Bible every day as the average person does watching TV, you can make it through in a few months. But it’s not exactly light reading, and I’m not sure I recommend that pace. If you’re willing to dedicate that kind of time to Bible study you’re probably better served by learning Greek so you can read the New Testament in the original. Then if you’ve still got your sanity you can think about tackling Hebrew.

I finally got around to reading Charlie Sebold’s entries for the last few days. One especially poignant observation: “I continue to be surprised at how much I remember about computers, and how much I forget about everything else (including far more important things).”

I sure can relate. I wish I could trade everything I remember about IBM PS/2s and Microchannel for something more useful. But I remember goofy baseball statistics too–I can recite the starting lineup and pitching rotation of the 1980 Kansas City Royals (I’ll spare you). But I can’t tell you the names of all seven people I met Sunday night.