The Revolution of 1985

Twenty-five years ago yesterday, a revolution happened. Nobody really noticed, and nobody thinks about it today, but the effects are still here. That we take these things for granted today shows just how wide-reaching the revolution was.

It took the form of a computer with a 32-bit Motorola CPU, full stereo sound, a display capable of 4,096 colors, and a fully pre-emptive multitasking operating system. It started at $1,295, though the price rose closer to $2,000 by the time you added a second drive and a monitor.

The specs on that machine don’t sound all that impressive today, but keep in mind what else was available in 1985. The state of the art from IBM was the 16-bit IBM PC/AT with very limited sound capability, color as an expensive option, and DOS 2.1. Windows at the time was little more than a glorified DOS shell. Apple had its Macintosh, but it cost twice as much as an Amiga, had only slightly better sound than that IBM, and just a tiny black and white display.

Over the course of the next nine years, Commodore sold 3 million Amigas. Along the way, they worked out the early glitches in the machine and upgraded its capabilities, though not always as quickly as the competition. But the machine aged remarkably well. And ultimately it did for television production what the Macintosh did for publishing, replacing hundreds of thousands of dollars’ worth of specialized equipment with gear that cost mere thousands and fit comfortably on a large desk.

The big problem was that Commodore sold those three million machines to one million people, and never really knew what to do with the machine. It should have been a great business computer. It was the ultimate home computer. It could have been the ultimate education computer. And it was the ultimate video editing computer. But Commodore never marketed it effectively as any of those.

Mostly the company went through the motions while financier Irving Gould lined his pockets with whatever money was left after Commodore got done paying the bills each quarter. Some years, Commodore spent more on the salaries of Gould and his yes-man company president than they spent on Amiga development.

So, slowly but surely, the competition caught up. VGA was better than Amiga graphics in some regards and worse in others, but over time, the combination of VGA and fast 386 and 486 CPUs was enough to keep pace. Macintosh graphics followed a similar curve. Affordable sound cards for PCs started appearing in the late 1980s and were commonplace by 1992 or ’93. Getting it all working on a PC was a lot harder, but when it worked, it worked pretty well. Making the DOS boot disks that held it all together, though, was a black art, one I remember practicing at least until 1998.
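If you never had the pleasure: the ritual meant stacking memory managers and drivers just right to claw back enough conventional memory, then telling every game where the sound card lived. Reconstructed from memory rather than from any surviving disk, a boot disk looked something like this; the paths, addresses, and IRQs varied from machine to machine:

    REM CONFIG.SYS (a hypothetical example; your drivers and paths will differ)
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE RAM
    DOS=HIGH,UMB
    FILES=30
    BUFFERS=20
    DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001

    REM AUTOEXEC.BAT (the SET BLASTER values depended on the card's jumpers)
    SET BLASTER=A220 I5 D1 T4
    LH C:\DOS\MSCDEX.EXE /D:MSCD001
    LH C:\MOUSE\MOUSE.COM

Get one line wrong and the game had either no sound or not enough memory left to run.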

It was in the early 1990s that PCs and Macs got multitasking. First it was horrible cooperative multitasking, followed later by pre-emptive multitasking like the Amiga had. Eventually they even added memory protection, something the Amiga didn’t have (when it was initially designed, with an 8 MHz CPU and 256K of RAM, that was the one thing they had to leave out).
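The difference between the two is easy to sketch. Here’s a deliberately tiny toy of mine, not anything from a real OS: in the cooperative model, the “scheduler” is just a loop that trusts every task to hand control back. One task that never returns hangs the whole machine, which is exactly what made the early schemes so horrible. A pre-emptive kernel like the Amiga’s Exec instead uses a timer interrupt to take control back whether the task cooperates or not.

    #include <stdio.h>

    /* Cooperative multitasking in miniature. The "scheduler" trusts
     * each task to return promptly. Nothing here can stop a task that
     * decides to loop forever; that is the whole problem. */

    typedef void (*task_fn)(void);

    static void word_processor(void) { printf("word processor works a bit, then yields\n"); }
    static void print_spooler(void)  { printf("print spooler works a bit, then yields\n"); }

    int main(void) {
        task_fn tasks[] = { word_processor, print_spooler };
        int n = sizeof(tasks) / sizeof(tasks[0]);

        for (int tick = 0; tick < 4; tick++)   /* round-robin, on the honor system */
            tasks[tick % n]();

        return 0;
    }

The Amiga had the pre-emptive kind from day one; the clones took until the mid-1990s to catch up.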

The money ran out in 1994, and the rights to the architecture changed hands more times than most people can count. The Amiga’s days as a mainstream computer–if it ever could claim to be one–ended then.

The rest of the world spent the 1990s catching up. When Windows 95 came out with its promise of Plug and Play, improved multimedia, and pre-emptive multitasking, it was all old news to Amigans. Amigas had been doing all that for 10 years already.

For a long time after 1994, I was bitter. I’m less so now that the rest of the world has caught up. But I still wonder sometimes what might have been, if the industry had spent the 15 years between 1985 and 2000 innovating, rather than just catching up.

Why I generally buy AMD

I was talking to a new coworker today, and of course the topic of our first PCs came up. His was Cyrix-based. I didn’t mention my first PC (it seems I’m about four years older); mine was an Am486SX2/66.

With only a couple of exceptions, I’ve always bought non-Intel PCs. Most of the Intel PCs I have bought have been used. One boss once went so far as to call me anti-corporate.

I’m not so much anti-corporate as I am pro-competition.

The little-known story of Commodore

So I’m reading On The Edge, a longish book that tries to tell the story of Commodore properly, including the people who made it happen, and the companies it bought along the way.

I’m glad the story got told.

Intel inside the Mac–no more question mark

OK, it’s official. Intel has conquered one of the last holdouts: Soon you’ll be able to buy a Pentium-powered Mac.

Of course there are lots of questions now.

First of all, Apple having problems with its CPU suppliers is nothing new. Apple’s first CPU supplier was a small firm called MOS Technology. You’ve probably never heard of it, but MOS was a subsidiary of a company you may have heard of: Commodore. Commodore, of course, was one of two other companies to release a ready-built home computer about the same time Apple did. The problem was that the Commodore and Apple computers had the same CPU, so Commodore could undercut Apple’s price. And it did. Commodore president Jack Tramiel was an Auschwitz survivor, and he pretty much assumed his competitors were going to treat him the same way the Nazis did, so he never cut them any breaks either. At least not intentionally.

When other companies released licensed versions of MOS’ 6502 processor, Apple was the biggest customer. Rumor had it that Commodore was hoarding 6502s.

When Motorola released its legendary 68000 CPU, Apple was one of the first companies to sign up, and the first two commercially successful computers to use the m68K were made by Apple. And life was good. Apple wasn’t Motorola’s only customer, but it was one of the biggest. Life was good for the better part of a decade, until Intel finally managed to out-muscle the performance of the Motorola 68040. So Apple conspired with Motorola and IBM to come up with something better, and the result was the PowerPC. And life was good again. The PowerPC wasn’t the best chip on the market, but of the two architectures you could buy at every strip mall on the continent, it was clearly the better.

Over time Apple’s relationship with Motorola cooled, and the relationship with IBM was off again and on again. Intel meanwhile kept trotting out bigger and bigger sledgehammers, and by brute force alone was able to out-muscle the PowerPC. Steve Jobs got creative, but eventually he just ran out of tricks. Switching to Intel in 2006 may or may not be the best option, but it’s just as easy to do now as it’s ever going to be.

So, now there’s the question of whether this will hurt Microsoft or Linux or both. The answer is yes. The real question isn’t whether it will hurt, but how much. As soon as Microsoft loses one sale, it’s hurt. The same goes for Red Hat.

To me, the question hinges on how attached Apple is to its hardware business. Steve Jobs has only said that OS X has been running on Intel in the labs for years. I have never heard him mention whether the hardware was a standard PC clone motherboard, or something of Apple’s design. I suspect he’s avoiding the question.

It would be possible to make OS X run on Apple hardware and only Apple hardware, even if the CPU is a standard Pentium 4 just like Dell uses. And at least at the outset, I expect Apple will do that. Apple may only have 3-5 percent of the market, but it’s 3-5 percent of a really big pie. The company is profitable.

It would also be possible to let Windows run on this hardware. That may be a good idea. Apple still has something to offer that nobody else does: the slick, easy-to-use, stable OS X, plus the ability to boot into Windows to play games or whatever. That makes Apple hardware worth paying a premium to get.

If Apple chooses to let OS X run on anything and everything, it hurts Linux and Windows more, but it probably hurts Apple too. There’s a lot of hardware out there, and a lot of it isn’t any good. Apple probably doesn’t want that support nightmare.

I think this will narrow the gigahertz gap and, consequently, the speed gap. I think it will help Apple’s market share, especially if they allow Windows to run on the hardware. I don’t see it having a devastating effect on any other operating system, though. It will hurt marginal PC manufacturers before it hurts software companies.

Intel inside a Mac?

File this under rumors, even if it comes from the Wall Street Journal: Apple is supposedly considering using Intel processors.

Apple’s probably pulling a Dell.

It’s technically feasible for Mac OS X to be recompiled and run on Intel; NeXTstep ran on Intel processors after NeXT abandoned the Motorola 68K family, and Mac OS X is based on NeXTstep.

Of course the x86 is nowhere near binary-compatible with the PowerPC CPU family. But Apple has overcome that before; the PowerPC wasn’t compatible with the m68K either. Existing applications won’t run as fast under emulation, but it can be done.
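Why emulation costs speed is easy to show. Here’s a toy interpreter for an imaginary two-instruction CPU, nothing to do with whatever Apple would actually ship: every guest instruction has to be fetched, decoded, and dispatched in software, so each one costs many host instructions.

    #include <stdio.h>

    /* A toy interpreter loop. One guest instruction means a fetch, a
     * decode, and a dispatch on the host, which is the emulation tax. */

    enum { OP_ADD = 0, OP_HALT = 1 };

    int main(void) {
        unsigned char program[] = { OP_ADD, OP_ADD, OP_ADD, OP_HALT };
        unsigned acc = 0;   /* the guest machine's accumulator */
        unsigned pc = 0;    /* the guest machine's program counter */

        for (;;) {
            unsigned char opcode = program[pc++];  /* fetch */
            switch (opcode) {                      /* decode and dispatch */
            case OP_ADD:  acc += 1; break;         /* execute */
            case OP_HALT: printf("acc = %u\n", acc); return 0;
            }
        }
    }

Apple pulled this trick off once already when it moved the 68K installed base to the PowerPC, so there’s precedent.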

Keeping people from running OS X on their whitebox PCs and even keeping people from running Windows on their Macs is doable too. Apple already knows how. Try installing Mac OS 9 on a brand-new Apple. You can’t. Would Apple allow Windows to run on their hardware but not the other way? Who knows. It would put them in an interesting marketing position.

But I suspect this is just Apple trying to gain negotiating power with IBM Microelectronics. Dell famously invites AMD over to talk and makes sure Intel knows AMD’s been paying a visit. What better way is there for Apple to get new features, better clock rates, and/or better prices from IBM than by flirting with Intel and making sure IBM knows about it?

I won’t rule out a switch, but I wouldn’t count on it either. Apple is selling 3 million computers a year, which sounds puny today, but that’s as many computers as it sold in its glory days, or more. Plus Apple has sources of revenue that it didn’t have 15 years ago. If it could be profitable selling 3 million computers a year in 1990, it can be profitable today, especially considering all of the revenue it can bring in from software (both OS upgrades and applications), iPods and music.

What happens when you overclock

I’ve never been a big fan of overclocking. I overclocked for a couple of weeks back in my Pentium-75 days but quit when my system started acting goofy. I did it again five years ago when I was writing my book because, well, everyone expected me to talk about overclocking in it. So I overclocked again and tried to use that overclocked machine in the process of writing the book. That foray lasted only a little while longer.

Fascination with old technology

I found this New York Times story on retro technology today. I have my own take on retro gaming.

My girlfriend tells me the 1980s are terribly hip with her students. As she was grading papers last night, I noticed one student had doodled Pac-Man on a paper, the way I remember my classmates and I doing in 1982.

I dig it.

I was feeling nostalgic in the summer of 1996 when I started visiting old 8-bit-oriented newsgroups on Usenet. Someone wrote in with a question about an Atari power supply, and I happened to have a Jameco catalog in my hands that was advertising some old surplus Atari boxes.

That led to me meeting Drew “Atari” Fuehring, who along with his brother had accumulated one of the largest collections of retro video game consoles in Missouri. Atari 2600, 5200, 7800; Vectrex; Colecovision; Intellivision–you name it, they had it, and if they didn’t have every cartridge and accessory that came with each, they had more than 75 percent of it.

I did a feature story on them for the Sunday magazine of the newspaper where I worked at the time. It was easily the most enjoyable story I did during my time at that paper. Maybe the most enjoyable story I ever did.

I didn’t take up video game collecting, but obviously I never forgot that article. (I’d link to it but the database seems to be down forever.)

Those of us in our 20s (I’ve still got 3 1/2 months left of my 20s) grew up around technology. We’ve watched it grow up with us. So why does it seem so odd for us to think of older technology as something other than inferior? Isn’t that like saying that once you’ve heard rock ‘n’ roll, you have to give up jazz and blues?

In some regards the old stuff’s better. Hold up that original fake wood-grained Atari 2600 alongside my GPX-branded DVD player and ask any person which of those two things originally cost more money. Even if they don’t have a clue what the two objects are, they’ll know.

There was a time when things were built to last and they weren’t rendered obsolete in two years or six months just to force us to buy more stuff.

Take the guy in the article who bought a 15-year-old Motorola cell phone. I’m sure some people think he’s nuts. The new phones have all the functions of a Palm Pilot in them, and you can play video games on them (funny, they’re old video games–I hold out hope that the people who make these gadgets have some clue), and you can take pictures with them, and you can program them to play annoying songs when people call you, and I think some of them even do septuple duty as an MP3 player. But have you ever tried to talk on the phone with one? Or worse yet, talk with someone who’s talking on one? They’re terrible! They cut out all the time and the conversation sounds robotic, so everyone talks really loud trying to make up for the terrible quality–and succeeds only in annoying everyone around them–and if you drive under a bridge, forget it. You’ll lose the connection.

I remember all the promises of digital. I’ll tell you what was so great about digital: It allowed the phone companies to cram a lot more conversations into a much narrower frequency range. It saved them a buttload of money, and we get the benefit of… ever-smaller, costlier phones that are easier to lose, along with an endless upgrade cycle. Trust me, next year the annoying salespeople in the mall will be asking you if you can watch movies on your cellphone, because you can on this year’s model.

Eugene Auh says he bought the phone to impress girls. Maybe he did, but he’ll keep the phone because it works.

I spend my day surrounded by technology, and by the time I manage to get home, I really want to get away from it. My sister asked me a few months ago where my sudden fascination with trains came from. I think that’s exactly it. The first time I saw a train with onboard electronics that ran by remote control, it really wowed me, but I’m constantly drawn to the old stuff. The older the better. I have a lot of respect for the 1950s units my dad played with, but for me, the holy grail is an Ives train made between 1924 and 1928. In 1924, Ives came up with a technological marvel: a train that not only reversed when the power was cycled, but added a neutral position to keep the train from slamming itself into reverse and doing a Casey Jones maneuver, and kept the headlight lit at all times.
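If you’re curious how that behaves, here’s a toy model, with one caveat: I’m assuming the forward-neutral-reverse-neutral cycle that sequencing reverse units settled on. Each interruption of track power advances the mechanism one notch, and the headlight is wired so it stays lit in every state, including neutral.

    #include <stdio.h>

    /* A toy model of a sequencing reverse unit. Cutting track power
     * advances the drum one notch through the cycle; the headlight
     * circuit sits ahead of the sequencer, so it is lit in every state. */

    enum state { FORWARD, NEUTRAL_1, REVERSE, NEUTRAL_2 };
    static const char *names[] = { "forward", "neutral", "reverse", "neutral" };

    int main(void) {
        enum state s = FORWARD;
        for (int power_cycle = 0; power_cycle < 8; power_cycle++) {
            printf("motor: %s, headlight: on\n", names[s]);
            s = (s + 1) % 4;   /* cut the power: the drum advances one notch */
        }
        return 0;
    }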

Trust me, it was a big deal in 1924.

Besides that, those oldies were built to be played with hard and built to last. And they were built to look good. Remember that picture I posted this weekend? That’s nothing. The electric units were gorgeous, with bright, enameled paints and brass trim and the works.

Why should I settle for a hunk of plastic, made by someone who gets paid a dollar an hour, whose electronics are going to fry in a year and render the thing motionless?

Nope. I like old stuff.

Next time I’m at a flea market and I see a Betamax VCR, I might just buy it.

How DOS came to be IBM’s choice of operating system

The urban legend says Gary Kildall snubbed the IBM suits by making them wait in his living room for hours while he flew around in his airplane. The suits, not taking it well, decided to cut him out of the deal and do business with Bill Gates and Microsoft instead, ending Digital Research’s short reign as the biggest manufacturer of software for small computers.

The pundits are wrong about Apple’s defection

Remember the days when knowing something about computers was a prerequisite for writing about them?

ZDNet’s David Coursey continues to astound me. Yesterday he wondered aloud what Apple could do to keep OS X from running on standard PCs if Apple were to ditch the PowerPC line for an x86-based CPU, or to keep Windows from running on Apple Macs if they became x86-based.

I’d link to the editorial but it’s really not worth the minimal effort it would take.

First, there’s the question of whether it’s even necessary for Apple to migrate. Charlie pointed out that Apple remains profitable. It has 5% of the market, but that’s beside the point. They’re making money. People use Macs for a lot of different reasons, but speed rarely seems to be the clinching factor.

A decade ago, the fastest Mac money could buy was an Amiga with Mac emulation hardware–an Amiga clocked at the same speed would run Mac OS and related software about 10% faster than the real thing. And in 1993, Intel pulled ahead of Motorola in the speed race. Intel had 486s running as fast as 66 MHz, while Motorola’s 68040 topped out at 40 MHz. Apple jumped to the PowerPC line, whose clock rate pretty much kept up with the Pentium line until the last couple of years. While the PowerPC would occasionally beat an x86 at some benchmark or another, speed was more a point of advocacy than anything else. When a Mac user quoted one benchmark only to be countered by another benchmark that made the PowerPC look bad, the Mac user just shrugged and moved on to some other advocacy point.

Now that the megahertz gap has become the gigahertz gap, the Mac doesn’t look especially good on paper next to an equivalently priced PC. Apple could close the gigahertz gap and shave a hundred bucks or two off the price of the Mac by leaving Motorola at the altar and shacking up with Intel or AMD. And that’s why every pundit seems to expect the change to happen.

But Steve Jobs won’t do anything unless he thinks it’ll get him something. And Apple offers a highly styled, high-priced, anti-establishment machine. Hippie computers, yuppie price. Well, that was especially true of the now-defunct Flower Power and Blue Dalmatian iMacs.

But if Apple puts Intel Inside, some of that anti-establishment lustre goes away. That’s not enough to make or break the deal.

But breaking compatibility with the few million G3- and G4-based Macs already out there might be. The software vendors aren’t going to appreciate the change. Now Apple’s been jerking the software vendors around for years, but a computer is worthless without software. Foisting an instruction set change on them isn’t something Apple can do lightly. And Steve Jobs knows that.

I’m not saying a change won’t happen. But it’s not the sure deal most pundits seem to think it is. More likely, Apple is just pulling a Dell. You know the Dell maneuver. Dell is the only PC vendor that uses Intel CPUs exclusively. But Dell holds routine talks with AMD and shows the guest book signatures to Intel occasionally. Being the last dance partner gives Dell leverage in negotiating with Intel.

I think Apple’s doing the same thing. Apple’s in a stronger negotiating position with Motorola if Steve Jobs can casually mention he’s been playing around with Pentium 4s and Athlon XPs in the labs and really likes what he sees.

But eventually Motorola might decide the CPU business isn’t profitable enough to be worth messing with, or it might decide that it’s a lot easier and more profitable to market the PowerPC as a set of brains for things like printers and routers. Or Apple might decide the gigahertz gap is getting too wide and defect. I’d put the odds of a divorce somewhere below 50 percent. I think I’ll see an AMD CPU in a Mac before I’ll see it in a Dell, but I don’t think either event will happen next year.

But what if it does? Will Apple have to go to AMD and have them design a custom, slightly incompatible CPU as David Coursey hypothesizes?

Worm sweat. Remember the early 1980s, when there were dozens of machines that had Intel CPUs and even ran MS-DOS, yet were, at best, only slightly IBM-compatible? OK, David Coursey doesn’t, so I can’t hold it against you if you don’t either. But trust me: they existed, and they infuriated a lot of people. There were subtle differences that kept IBM-compatible software from running unmodified. Sometimes the end user could work around those differences, but more often than not, they couldn’t.

All Apple has to do is continue designing their motherboards the way they always have. The Mac ROM bears very little resemblance to the standard PC BIOS. The Mac’s boot block and partition table are all different. If Mac OS X continues to look for those things, it’ll never boot on a standard PC, even if the CPU is the same.
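To make that concrete, here’s a crude sketch of the kind of check I mean. It’s not Apple’s code, just the publicly documented on-disk signatures: an Apple Partition Map disk opens with an ‘ER’ driver descriptor in block 0, while a PC disk ends its first sector with the MBR signature 0x55 0xAA. A loader that insists on the former gets nowhere on a stock PC clone, no matter what CPU is soldered to the board.

    #include <stdio.h>

    /* Reads the first 512-byte sector of a disk image and looks for
     * the two signatures described above. Illustration only. */

    int main(int argc, char **argv) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s disk-image\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        unsigned char sector[512];
        if (fread(sector, 1, sizeof sector, f) != sizeof sector) {
            fclose(f);
            return 1;
        }
        if (sector[0] == 'E' && sector[1] == 'R')
            puts("Apple-style disk: a Mac loader would carry on");
        else if (sector[510] == 0x55 && sector[511] == 0xAA)
            puts("PC-style MBR: a Mac loader would refuse to boot");
        else
            puts("neither signature found");

        fclose(f);
        return 0;
    }

That’s the whole trick: the lock lives in the ROM and the loader, not the CPU.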

The same differences that keep Mac OS X off of Dells will also keep Windows off Macs. Windows could be modified to compensate for those differences, and there’s a precedent for that–Windows NT 4.0 originally ran on Intel, MIPS, PowerPC, and Alpha CPUs. I used to know someone who swore he ran the PowerPC versions of Windows NT 3.51 and even Windows NT 4.0 natively on a PowerPC-based Mac. NT 3.51 would install on a Mac of comparable vintage, he said. And while NT 4.0 wouldn’t, he said you could upgrade from 3.51 to 4.0 and it would work.

I’m not sure I believe either claim, but you can search Usenet on Google and find plenty of people who ran the PowerPC version of NT on IBM and Motorola workstations. And guess what? Even though those workstations had PowerPC CPUs, they didn’t have a prayer of running Mac OS, for lack of a Mac ROM.

Windows 2000 and XP were exclusively x86-based (although there were beta versions of 2000 for the Alpha), but adjusting to accommodate an x86-based Mac would be much easier than adjusting to another CPU architecture. Would Microsoft go to the trouble just to get at the remaining 5% of the market? Probably. But it’s not guaranteed. And Apple could turn it into a game of leapfrog by modifying its ROM with every machine release. It already does that anyway.

The problem’s a whole lot easier than Coursey thinks.

Inside track on VIA vs. Intel

Many people probably read today that Intel sued VIA for patent infringement, and then VIA turned around and sued Intel for essentially the same thing, claiming that Intel needs a license from VIA in order to make the P4 and i845. This unexpected drama in VIA vs. Intel has probably left a lot of people scratching their heads.
