Collecting vintage computers can be fun, and I personally think it’s great that people are interested in preserving that history. Where to buy vintage computers hasn’t changed much over the years; it just may take a bit more work than it used to.
Some people think old computers are priceless. Others think they’re worthless. I don’t recommend wasting your time with people who think a Dell Pentium III laptop is worth $300. Think of the times you found a jewel for five bucks and keep moving.
In some ways, 1985 was a really pivotal year for computing. The industry was changing fast, but in 1985, many relics from the past were still present even as we kept an eye on the future. Here’s a look back at computers in 1985 and what made that year so interesting.
I think 1985 was interesting in and of itself, but it also made the succeeding years a lot more interesting. A surprising amount of the technology that first appeared in 1985 still has an impact today.
Digital Equipment Corporation was perhaps the second most important computer company in history, behind IBM. Its minicomputers challenged IBM, and, indeed, Unix first ran on a DEC PDP-7. DEC’s Alpha CPU was one of the few chips to make Intel nervous about its x86 line. It created AltaVista, the first really good Internet search engine. In a just and perfect world, DEC would still be dominating. Instead, it faded away in the 1990s. What happened to Digital Equipment Corporation, or DEC?
Retro computing fans, especially Commodore and Atari enthusiasts, all know the story. Jack Tramiel left Commodore, the company he founded, in early 1984 at the height of its success. Then, within a few months, he gained control of Commodore rival Atari.
There are a few hucksters on eBay, whom I don’t care to give free advertising by naming, who hawk “graded” cards and claim them to be especially valuable. One even puts supposed appraised values in his listings in parentheses, then invites you to visit his page for an explanation of “graded” value, where he cites an example of a run-of-the-mill 1970s star card, normally worth $60, being worth $2,500 once graded.
The thing is, that’s an edge case. It’s important to understand those edge cases to avoid a ripoff.
All in all, it sounds reasonable to me. His recollection of DOS, and of some DOS version 8 in particular, confused me at first, but that was what the DOS buried in Windows ME was called. Mentioning it is appropriate, though, because it shows how DOS faded from center stage to barely visible in the end, to the point where it was difficult to dig it out at all, and that the fade took 15 years. He’s completely right that if Microsoft had pulled the plug on DOS in 1985, Windows would have failed. Read more
This week, Mark Shuttleworth closed the longstanding Ubuntu bug #1, which simply read, “Microsoft has majority market share.” Because Microsoft didn’t lose its market share lead to Ubuntu, or Red Hat, or some other conventional Linux distribution, some people, including John C. Dvorak, are interpreting this as some kind of surrender.
I don’t see it as surrender at all. Microsoft’s dominant position, which seemed invincible in 2004 when Shuttleworth opened that bug, is slipping away. They still dominate PCs, but PCs as we know them are a shrinking part of the overall computing landscape, and the growth is all happening elsewhere.
I have (or at least had) a reputation as a Microsoft hater. That’s a vast oversimplification. I’m not anti-Microsoft. I’m pro-competition. I’m also pro-Amiga, and I’ll go to my grave maintaining that the death of Amiga set the industry back 20 years. I have Windows and Linux boxes at home, my wife has (believe it or not) an iPad, and at work I’m more comfortable administering Linux than Windows right now, which seems a bit strange, especially considering the distribution is a Red Hat derivative and I haven’t touched Red Hat in what seems like 400 years.
What Shuttleworth is acknowledging is that we have something other than a duopoly again, for the first time in more than 20 years, and the industry is innovating and interesting again. Read more
Lifehacker came through with a gem this morning: How to block annoying political posts on Facebook. It’s really a general-purpose filter, though, so you can filter on pretty much anything with it, not just the names of political parties and this year’s candidates. Pretty much anything people rant about on Facebook is fair game for this.
It’s easy enough to just unsubscribe from certain people completely or, in extreme cases, un-friend them. But this gives you a less drastic option.
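If you’re curious what that kind of filtering amounts to under the hood, here’s a minimal sketch in Python. The keyword set and posts are hypothetical, and the actual Lifehacker tool works inside the browser rather than as code you run yourself, but the core idea is the same: hide anything that mentions a blocked word.

```python
# A minimal sketch of keyword-based feed filtering. The keyword set and
# post list are hypothetical examples, not part of any real tool.
BLOCKED_KEYWORDS = {"election", "candidate", "congress"}

def is_visible(post_text: str) -> bool:
    """Hide a post if it mentions any blocked keyword."""
    text = post_text.lower()
    return not any(keyword in text for keyword in BLOCKED_KEYWORDS)

posts = [
    "Look at this cat photo!",
    "Can you believe what the candidate said last night?",
]

print([p for p in posts if is_visible(p)])  # ['Look at this cat photo!']
```

Simple substring matching like this is crude, of course; it would hide a post about “electioneering” too, which may or may not be what you want.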
GEM was an early GUI for the IBM PC and compatibles and, later, the Atari ST, developed by Digital Research, the company behind CP/M and, eventually, DR-DOS. (Digital Equipment Corporation was a different company.) So what was it, and what happened to GEM?
It was very similar to the Apple Lisa, and Apple saw it as a Lisa/Macintosh ripoff and sued. While elements of GEM probably were inspired by the Lisa, Digital Research had also hired several developers from Xerox PARC, where the ideas behind both companies’ GUIs originated.
DRI demonstrated the 8086 version of GEM at COMDEX in 1984 and shipped it on 28 February 1985, beating Windows 1.0 to market by nearly nine months. Read more
I didn’t have time to write everything I wanted to write yesterday, so I’m going to revisit Bill Gates and Gary Kildall today. Bill Gates’ side of the DOS story is relatively well documented in his biographies: Gates referred IBM to Gary Kildall, who for whatever reason was less comfortable working with IBM than Gates was. And there was an airplane involved, though what Kildall was doing in the airplane, and why, varies by account. By some accounts he was meeting another client; by others, it was a joyride. IBM in turn came back to Gates, who had a friend of a friend who was cloning CP/M for the 8086, so Microsoft bought the clone for $50,000, cleaned it up a little, and delivered it to IBM while turning a huge profit. Bill Gates became Bill Gates, and Kildall and his company, Digital Research, slowly faded away.
The victors usually get to write the history. I’ve tried several times over the years to find Kildall’s side of the story. I first went looking sometime in 1996 or so, for a feature story about Internet misinformation I wrote for the Columbia Missourian’s Sunday magazine. For some reason, every five years or so I end up chasing the story down again. Read more