Conspiracies, conspiracies everywhere

The topic of the day yesterday was Timothy McVeigh. I’d forgotten it was his day; I saw the lead story in The Kansas City Star announcing he was dead when I went to read up on the day’s events yesterday morning.
McVeigh raises a lot of uncomfortable questions. So let’s go back to a year after the Oklahoma City bombing, because that was when I got my wakeup call.

I was a crime reporter for the Columbia Missourian, a flaming liberal little daily newspaper in, frankly, what would be a worthless little town if it weren’t for the University of Missouri being there. But Columbia is situated in the middle of nowhere; aside from Columbia and Jefferson City, Central Missouri has no good-sized towns, and those two “cities” are cities only by Missouri standards. St. Louis has suburbs bigger than either of them. Central Missouri is backward, or rural, or backward and rural, depending on where you go.

Well, a guy by the name of Don Albright drove to Columbia one night and got drunk. He was pulled over, ticketed, and charged with driving while intoxicated. Albright maintained it was his constitutional right to drive drunk. Actually, he said his constitutional right to travel was being violated. “A driver is for hire,” Albright told me. “A traveler is a private citizen.”

I had a very long conversation with Albright. He was one of the biggest conspiracy theorists I’d ever talked to. He believed the United States was still technically a collection of British colonies; that there are actually two United States of Americas; that the Civil War, World Wars I and II, the Great Depression, and the Kennedy assassination were all directly linked and part of the same conspiracy; and other bizarre things. He also believed in the New World Order, a belief Timothy McVeigh shared.

He was also militant. He took out liens on judges and prosecuting attorneys. And, on the first anniversary of the Oklahoma City bombing, Albright, along with others, threatened to attack government buildings as well as press organizations that didn’t “tell what was really going on.”

By this time, I was on Albright’s black list. One of his friends anonymously called me one day and told me to watch my back, so I took the threats seriously. I consciously avoided the newsroom, courthouse, post office, and police station that day. Fortunately, nothing eventful happened.

I suspect Albright’s motivation was primarily racial. Even in that single conversation, his talk was laced with racial overtones. When we investigated him further, what we found was a man who didn’t want to accept any responsibility for his own past.

Albright had numerous supporters in and around Columbia. I spoke with a number of them outside the Boone County courthouse on the day of one of Albright’s scheduled court appearances. The only one who would give me his name was a guy by the name of Hobbes (I think his first name was Ken). An older woman, who would only go by “Mrs. Hobbes,” (I assume she was his mother), talked to me a little bit less. They were certainly fundamentalist Christians. They gave me pamphlets, a Constitutional Driver’s License (whereby I could grant myself the right to travel the nation’s roads freely), a copy of the Constitution, information on how I could secede from the United States and become a sovereign citizen, and other materials. But they sang exactly the same song Albright did, though Albright appeared to be racially motivated.

In 1992, while a senior in high school, I met a conspiracy theorist of another feather. He was a fervent believer in the writings of George Adamski, a UFO author who claimed he had been visited by beings from a yet-undiscovered planet in the solar system. Adamski, as I recall, had been widely discredited in the 1960s. But this guy’s beliefs (I don’t recall his name anymore, unfortunately) fit these others like a hand in a glove. He, too, spoke of the New World Order, the Trilateral Commission, and other oddities.

So… There are plenty of kooks like McVeigh out there. Some of them, like the last one I mentioned, are quirky but harmless. Albright, I believe, could be extremely dangerous. And, interestingly enough, although each type begins with a different premise at heart, they all come to nearly identical conclusions.

The common thread is that none of them trust the government and none of them fully understand the world around them. That’s fine. I don’t trust the government and I certainly don’t understand everything about the world around me. You can do one of two things when that happens. You can just accept that you don’t know everything and you never will know everything, and just try to understand the things that interest you or the things that affect you as best as you possibly can.

Or you can explain it all away as a giant conspiracy. Of course you can’t be the one that’s messed up. The rest of the world around you is messed up. And they’re doing it on purpose!

Time for a reality check.

Hard Fact Number One: Members of the hard left are every bit as disillusioned as members of the hard right. Most of my college professors despised Bill Clinton every bit as much as I did. They were liberal. We’ve got people on the hard left who can’t get what they want. We’ve got people on the hard right who can’t get what they want. Isn’t that called compromise?

Hard Fact Number Two: It’s difficult to get people to cooperate with one another. It’s even more difficult to get organizations to cooperate with one another. If you spend any length of time within an organization of any considerable size, you begin to wonder how it keeps from unraveling just because of internal politics. And these are people who share the same interests! Want an example of how conspiracies are so difficult? Fine. Here’s one: Oracle and Sun and the United States Government against Microsoft. Remember how they bungled that one? And why? None of the parties could figure out what exactly they wanted on their own, let alone what they wanted collectively.

Conspiracies can happen. But they’re rare and generally short-lived.

McVeigh killed 168 people. Or, at the very least, McVeigh participated in the killing of 168 people. We don’t know if he and Terry Nichols acted alone. Probably not–there was a John Doe No. 2 who was never found. But McVeigh did kill innocent people, and he did it willfully and he expressed no remorse.

Yes, the United States Government is partially responsible for that. The Clinton administration did a lot of detestable things. Part of that was because Bill Clinton is and was a hopeless idealist, and he surrounded himself with the same types of people. They didn’t know how to handle people who didn’t share their worldview. And most of them probably didn’t foresee the possibility of a McVeigh-like backlash to Waco and Ruby Ridge. Holding the government accountable for those actions is necessary. Not handing the presidency to Al Gore is a good start, but that’s only a start. And the country was bitterly divided over that.

If you want to take that argument to its logical conclusion, who was it that put that administration in office? Hint: If you live in the United States, scroll up to the top of this page, get a good look at my picture, then go look in the mirror. You and I did that. But you didn’t vote for him, you say? Neither did I. Fifty-seven percent of us didn’t. The problem was, the 57% of us who wanted someone else couldn’t agree on the someone else to put in office, and we paid the price. But the fact is, most of us don’t care. So, since we put this government in place, aren’t we also responsible for its actions, especially when we refuse to fundamentally change it?

But blaming the United States Government for Timothy McVeigh’s actions is childish. When I was in fifth grade, another kid named Benji used to act up and then blame his poor behavior on the outcome of the 1985 World Series. There is no difference. Benji wasn’t mature enough to deal with his disappointment about the baseball season in a socially responsible manner. Timothy McVeigh wasn’t mature enough to deal with his disappointment with the government’s behavior in a socially responsible manner. The St. Louis Cardinals didn’t make Benji misbehave, and the U.S. Government didn’t make McVeigh blow up that building. The victims of McVeigh’s atrocity deserve better than that kind of logic.

Yes, the government is partially responsible because McVeigh’s actions are the consequence of some of its own actions. And the government’s job is to clean up its own mess. I’m not convinced it’s totally done that. But McVeigh was guilty, and he even admitted his guilt. The U.S. Government did what its laws call for it to do. So it actually owned up for once.

Don’t get used to it. Except for the part about only partially cleaning up, that is.

And, like it or not, McVeigh is now a martyr in some circles. Actually he’s been a martyr since the day of his arrest. But there’s a grain of truth in McVeigh’s beliefs. Our government is out of control, it’s irresponsible, and it’s not accountable to anyone.

But that’s our fault. Our government is supposed to be accountable to us, and as long as our Congressmen send plenty of pork back home, we keep them in office. And we vote for our presidents whimsically. The government knows that as long as they give us bread and circuses, we don’t care about much else.

And if we want to keep this kind of crap from ever happening again, we’re going to have to start giving a crap about more than just food and entertainment.

I’m not holding my breath.

Craig Mundie’s infamous speech

I haven’t said anything about Microsoft Executive Craig Mundie’s speech yet. Everyone’s heard of it, of course, and the typical response has been something along the lines of “Now we know Microsoft’s stance on Open Source.”

No, we’ve always known Microsoft’s stance on that. They’re scared of it. Remember the stereotype of open-source programmers: college students and college dropouts writing software in their basements that a lot of people are using, with the goal of toppling an industry giant. Seem far-fetched? Friends, that’s the story of Microsoft itself. Microsoft became an underground sensation in the late 1970s with Microsoft Basic, a programming language for the Altair and other kit computers and later for CP/M. And while we’ll probably never know the entire story of how and why this happened, when IBM decided to outsource the operating system for the IBM PC, they went to Microsoft and got both an OS and the must-have Microsoft Basic. Ten years later, IBM was just another hardware maker–really big, but getting squeezed. Today, 20 years later, IBM’s still a huge force in the computing industry, but in the PC industry, aside from selling ThinkPads, IBM’s a nobody. There may be hardware enthusiasts out there who’d be surprised to hear IBM makes and sells more than just hard drives.

Ironically, Microsoft’s response to this new threat is to act more and more like the giant it toppled. Shared Source isn’t a new idea. IBM was doing that in the 1960s. If you were big enough, you could see the source code. DEC did it too. At work, we have the source code to most of the big VMS applications we depend on day-to-day. Most big operations insist on having that kind of access, so their programmers can add features and fix bugs quickly. If Windows 2000 is ever going to get beyond the small server space, they really have no choice. But they do it with strings attached and without going far enough. An operation the size of the one I work for can’t get the source and fix bugs or optimize the code for a particular application. You’re only permitted to use the source code to help you develop drivers or applications. Meet the new Microsoft: same as the old Microsoft.

Some people have read this speech and concluded that Microsoft believes open-source software killed the dot-com boom. That’s ludicrous, and I don’t see that in the text. OSS was very good for the dot-com boom. OSS lowered the cost of entry: Operating systems such as FreeBSD and Linux ran on cheap PCs, rather than proprietary hardware. The OSs themselves were free, and there was lots of great free software available, such as the Apache Web server, and scripting languages like Python and Perl. You could do all this cool stuff, the same cool stuff you could do with a Sun or SGI server, for the price of a PC. And not only was it cheaper than everybody else, it was also really reliable.

The way I read it, Microsoft didn’t blame OSS for the dot-com bust. Microsoft blamed the advertising model, valuing market share over revenue, and giving stuff away now and then trying to get people to pay later.

I agree. The dot-com boom died because companies couldn’t find ways to make money. But I’m not convinced the dot-com boom was a big mistake. It put the Internet on the map. Before 1995, when the first banner ad ran, there wasn’t much to the Internet. I remember those early days. As a college student in 1993, the Internet was a bonanza to me, even though I wasn’t using it to the extent a lot of my peers were. For me, the Internet was FTP and Gopher and e-mail. I mostly ignored Usenet and IRC. That was pretty much the extent of the Internet. You had to be really determined or really bored or really geeky to get much of anything out of it. The World Wide Web existed, but that was a great mystery to most of us. The SGI workstations on campus had Web browsers. We knew that Mosaic had been ported to Windows, but no one in the crowd I ran in knew how to get it working. When we finally got it running on some of our PCs in 1994, what we found was mostly personal homepages. “Hi, my name is Darren and this is my homepage. Here are some pictures of my cat. Here’s a listing of all the CDs I own. Here are links to all my friends who have homepages.” The running joke then was that there were only 12 pages on the Web, and the main attraction of the 12 was links to the other 11.

By 1995, we had the first signs of business. Banner ads appeared, and graduating students (or dropouts) started trying to build companies around their ideas. The big attraction of the Web was that there was all this information out there, and it was mostly free. Online newspapers and magazines sprang up. Then vendors sprang up, offering huge selections and low prices. You could go to Amazon.com and find any book in print, and you’d pay less for it than you would at Barnes & Noble. CDNow.com did the same thing for music. And their ads supported places that were giving information away. So people started buying computers so they could be part of the show. People flocked from closed services like CompuServe and Prodigy to plain-old Internet, which offered so much more and was cheaper.

Now the party’s ending as dot-coms close up shop, often with their content gone forever. To me, that’s a loss only slightly greater than the loss of the Great Library. There’s some comfort for me: Five years from now, most of that information would be obsolete anyway. But its historical value would remain. But setting sentiment aside, that bonanza of freebies was absolutely necessary. When I was selling computers in 1994, people frequently asked me what a computer was good for. In 1995, it was an easier sell. Some still asked that question, but a lot of people came in wanting “whatever I need to get to be able to get on the Internet.” Our best-selling software package, besides Myst, was Internet In A Box, which bundled dialup software, a Web browser, and access to some nationwide provider. I imagine sales were easier still in 1996 and beyond, but I was out of retail by then. Suddenly, you could buy this $2,000 computer and get all this stuff for free. A lot of companies made a lot of money off that business model. Microsoft made a killing. Dell and Gateway became behemoths. Compaq made enough to buy DEC. AOL made enough to buy Time Warner. Companies like Oracle and Cisco, who sold infrastructure, had licenses to print money. Now the party’s mostly over and these companies have massive hangovers, but what’s the answer to the Ronald Reagan question? Hangover or no hangover, yes, they’re a whole heck of a lot better off than they were four years ago.

I’m shocked that Microsoft thinks the dot-com phenomenon was a bad thing.

If the Web had come into its own in 1995 but every site had been subscription-based, this stuff wouldn’t have happened. It was hard enough to swallow $2,000 for a new PC, plus 20 bucks a month for Internet access. Now I have to pay $9.95 a month to read a magazine? I could just subscribe to the paper edition and save $2,500!

The new Internet would have been the same as the old Internet, only you’d have to be more than just bored, determined, and geeky to make it happen. You’d also have to have a pretty big pile of cash.

The dot-com boom put the Internet on the map, made it the hot ticket. The dot-com bust hurt. Now that sites are dropping out of the sky or at least scaling operations way back, more than half of the Web sites I read regularly are Weblogs–today’s new and improved personal home page. People just like me. The biggest difference between 1994 and 2001? The personal home pages are better. Yeah, the pictures of the cat are still there sometimes, but at least there’s wit and wisdom and insight added. When I click on those links to the left, I usually learn something.

But there is another difference. Now we know why it would make sense to pay for a magazine on the Internet instead of paper. Information that takes a month to make it into print goes online in minutes. It’s much easier and faster to type a word into a search engine than to leaf through a magazine. We can hear any baseball game we want, whether a local radio station carries our favorite team or not. The world’s a lot smaller and faster now, and we’ve found we like it.

The pump is primed. Now we have to figure out how to make this profitable. The free ride is pretty much over. But now that we’ve seen what’s possible, we’re willing to start thinking about whipping out the credit cards again and signing up, provided the cost isn’t outrageous.

The only thing in Mundie’s speech that I can see that Linus Torvalds and Alan Cox and Dan Gillmor should take offense to is Microsoft’s suspicion of anyone giving something away for free. Sure, Microsoft gives lots of stuff away, but always with ulterior motives. Internet Explorer is free because Microsoft was afraid of Netscape. Outlook 98 was free for a while to hurt Lotus Notes. Microsoft Money was free for a while so Microsoft could get some share from Quicken. It stopped being free when Microsoft signed a deal with Intuit to bundle Internet Explorer with Quicken instead of Netscape. And there are other examples.

Microsoft knows that you can give stuff away with strings attached and make money off the residuals. What Microsoft hasn’t learned is that you can give stuff away without the strings attached and still make money off the residuals. The dot-com bust only proves that you can’t necessarily make as much as you may have thought, and that you’d better spend what you do make very wisely.

The Internet needs to be remade, yes, and it needs to find some sustainable business models (one size doesn’t fit all). But if Mundie thinks the world is champing at the bit to have Microsoft remake the Internet their way, he’s in for a rude awakening.


04/25/2001

The St. Louis Cardinals want a new stadium. It seems like everyone else is building a new stadium, and Busch Stadium was one of five multipurpose stadiums built in the late 1960s (Pittsburgh, Philadelphia, Cincinnati, St. Louis, and Atlanta) that looked almost exactly alike–and that wouldn’t have been so bad, I suppose, except they all looked like toilets. Well, after Anheuser-Busch sold the team to a group of investors, the new owners realized that humongous toilet-shaped stadiums with artificial turf are ugly, so they moved in the fences, ripped out the turf and put in grass, and since retro is in, they erected a hand-operated scoreboard in the upper deck (the seats they displaced were lousy anyway).

Now, Busch Stadium has always been a lousy place to watch a baseball game. The architecture harkens back to post-war East Germany. The stadium has no charms, aside from the retrofitted scoreboard. And unless you’re in the box seats, you need binoculars to see anything. There isn’t a good seat in the house. Once you’ve been to a game at Wrigley Field, or Royals Stadium (yeah, yeah, it’s officially Kauffman Stadium now, but I’ll never change), you realize what watching a baseball game is supposed to be, and Busch Stadium ain’t it. It’s more fun to watch the Royals and Cubs lose in their home parks than it is to be there–it’s hard to call what you do at Busch “watching”–when the Cardinals win in theirs. Force large numbers of Kansas Citians to watch a few games at Busch Stadium at gunpoint, and they’ll realize how good they’ve got it with Royals Stadium, and then the Royals will start drawing two million fans again.

So the Cardinals want to tear it down. Great, I say. Blow it up. I’ll help. I’ll even donate a little money to the cause.

So, what’s wrong with the Cardinals’ plan to get rid of Busch? They want the State of Missouri to pay for it. And that’s wrong. Why should the citizens of Kansas City be helping to pay for St. Louis’ new stadium? Why should my mom, who’ll probably never go to another baseball game in her life and who almost certainly will never go to a Cardinal game, be ponying up towards that stadium? The argument is that it’ll bring in jobs and revenue.

Fine. So if Boeing decides it wants to move its corporate headquarters here to St. Louis, where it already has some presence anyway, the State of Missouri should pay for it. After all, that’ll bring in even more jobs (and white-collar jobs at that!), and the revenue it brings in will last all year.

There is no difference between those two things. They’re private enterprises that should get their own funding. Period. And besides, the Cardinals aren’t a good investment. If the players strike or are locked out at the end of the season, which is likely, nobody knows what will happen. At best, baseball will be damaged goods. At worst, diehards like me will be following Japanese baseball next season because there won’t be any pro baseball left in the States. If the State of Missouri wants to give the Cardinals a loan, fine, but a handout, no.

And that’s not even figuring in the other parts of the argument. The proposed new stadium is smaller, with less seating capacity than Busch. The Cardinals draw three million fans a year. They fill that wretched place. Cardinal fans would watch baseball on a playground in a slum if that was where the Cards were playing. So, somehow, building a smaller but much prettier stadium is going to help team revenue? Only if they raise ticket prices through the roof. And ticket prices are already awfully high. That move could very easily backfire. Football and hockey are already so expensive that you can’t go to a game without sitting in the middle of a bunch of yuppies complaining that they only made $100,000 on the stock market last year. So the solution is to make baseball, with its 81 home games, the same way? While it might work for a little while, it’s not sustainable. The Cardinals have a rabid following in central Illinois and throughout Missouri, but neither of those places is exactly yuppie town. Make baseball a game for the elite, and The Rest of Us, on whom the team’s revenue is built, will go to fewer games and spend less money as a result.

There’s always the veiled threat that the Cardinals will move, to the Missouri suburbs or the Illinois suburbs, or, ridiculously, out of St. Louis entirely. That last prospect won’t happen. The Cardinals won’t draw three million fans anywhere else. Two million, tops. The move to the Missouri suburbs isn’t likely–Missouri doesn’t want to pay for the stadium whether it’s in St. Louis or in Creve Coeur. Illinois is a possibility, but not a risk the Cardinals ownership should be interested in taking. The Illinois suburbs are known for two things: crime and strip clubs. Do they really want their brand-new stadium to be next door to the Diamond Cabaret?

Yes, Cardinal fans will go watch baseball next door to the Diamond Cabaret. They’d watch baseball in the middle of East St. Louis if they had to. Or they’ll keep right on packing it in at Busch, lousy though it may be. It’s lousy, but it’s a good match for the team because it seats buttloads of people, and they consistently fill it, and the stadium may be an eyesore, but it’s nowhere near as old as Fenway Park or Wrigley Field and no one’s complaining about their structural integrity. Busch Stadium will be around for a while. And a lot of fans even like it.

Cardinal management doesn’t know how good they’ve got it, and Missouri needs to continue to call their bluff.

Enough of that. Let’s talk about us. That got your attention, I’m sure. Performance this morning was, to put it mildly, pants. Then the system went down like a… never mind. I’m getting really tired of it. I’m paying nothing for this, and lately I’m getting what I pay for. I want to control my own destiny, and I’ve got this nice broadband Internet connection, some spare parts (and what I lack is cheap), and I want some real sysadmin experience. So I’m thinking really seriously about moving. I wanted to hit the Userland Top 100 before I moved on, and enough time may pass between now and the time I get set up that I may meet that goal yet.

At the moment I’m leaning toward Greymatter, as it’ll give me everything I have here, just about, plus better discussion facilities. Suggestions welcome.

02/11/2001

Mailbag:

Innovation

Steve DeLassus asked me for some ideas of where I see innovation, since I said Microsoft isn’t it. That’s a tough question. On the end-user side, it’s definitely not Microsoft. They’ve refined some old ideas, but most of their idea of Innovation is taking utilities that were once separate products from companies Microsoft wants to drive out of business, then grafting them onto the OS in such a way as to make them appear integrated. What purpose does making the Explorer interface look like a Web browser serve? Doesn’t everyone who’s used a real file manager (e.g. Norton Commander or Directory Opus) agree that the consumer would have been better served by replicating something along those lines? Not that that’s particularly innovative either, but at least it’s improving. The only innovation Microsoft does outside of the software development arena (and that makes sense; Microsoft is first and foremost a languages company and always has been) seems to be to try to find ways to drive other companies out of business or to extract more money out of their customers.

Richard Stallman’s GNU movement has very rarely been innovative; it’s been all about cloning software they like and making their versions free all along. It’s probably fair to call Emacs innovative; it was a text editor with a built-in programming environment long before MS Word had that capability. But I don’t see a whole lot of innovation coming out of the Open Source arena–they’re just trying to do the same thing cheaper, and in some cases, shorter and faster, than everyone else.

So, where is there innovation? I was thinking there was more innovation on the hardware side of things, but then I realized that a lot of those “innovations” are just refinements that most people think should have been there in the first place–drives capable of writing to both DVD-R and CD-R media, for instance. Hardware acceleration of sound and network cards is another. Amiga had hardware acceleration of its sound in 1985, so it’s hard to call that innovation. It’s an obvious idea.

A lot of people think Apple and Microsoft are being really innovative with their optical mice, but optical mice were around for years and years before either of those companies “invented” them. The optical mice of 2000 are much better than the optical mice of 1991–no longer requiring a gridded mouse pad and providing smoother movement–but remember, in 1991, the mainstream CPUs were the Intel 80286 and the 80386sx. That’s a far, far cry from the Thunderbird-core AMD Athlon. You would expect a certain degree of improvement.

I’d say the PalmPilot is innovative, but all they really did was take a failed product, Apple’s Newton, and figure out what went wrong and make it better. So I guess you could say Apple innovated there, but that was a long time ago.

So I guess the only big innovation I’ve seen recently from the end-user side of things has been in the software arena after all. I’m still not sold on Ray Ozzie’s Groove, but I have to admit it’s much more forward-thinking than most of the things I’ve seen. Sure, it looks like he’s aping Napster, but he started working on Groove in 1997, long before Napster. And Napster is just file sharing, which has been going on since the 1960s at least, done in a new way. There again, I’m not sure it’s quite right to call it true innovation, but I think it’s more innovative than most of the things I’ve seen come out of Microsoft and Apple, who are mostly content to just copy each other and SGI and Amiga and Xerox. If they’re going to steal, they should at least steal the best ideas SGI and Amiga had. Amiga hid its menu bars to save screen space. Maybe that shouldn’t be the default behavior, but it would be nice to have it as an option. SGI went one further, making the pull-down menus accessible anywhere onscreen by right-clicking. This isn’t the same as the context menu–the program’s main menu came up this way. This saved real estate and mouse movements.

I’m sure I could think of some others but I’m out of time this morning. I’d like to hear what some other people think is innovative. And yes, I’m going to try to catch up on e-mail, either this afternoon or this evening. I’ve got a pretty big backlog now.


Computer buying advice

Some sound computer buying advice: here’s a Washington Post article on buying new PCs, easy to understand in layman’s terms. And the advice is for the most part sound, too, though I recommend always buying a good video card–a TNT2 will add just $60 or so to the cost of a low-end box and everything will run more nicely. The box I’m typing on right now has a cheap Cirrus Logic-based card in it, and the high CPU usage of its drivers hurts multitasking noticeably, even if I’m just browsing the Web while listening to music.

In a year this’ll be a moot point, as all chipsets will have serviceable embedded video. Even the enraging Intel i740, though not good for games, was great for productivity use and much better all around than this Cirrus and Trident garbage, and Intel’s newest chipsets have i740 derivatives in them. Future VIA chipsets will have S3 video in them. Same story.

I buy crap so you don’t have to–but don’t get me wrong. I buy the good stuff too. That way I’ll know the difference.

No more wimpy PC sound for me. I just connected an ancient but still awesome Harman/Kardon 330A receiver (built in the late 1960s, I’m guessing — it once belonged to my dad) to my computer, along with a pair of KLH 970A speakers I picked up for 30 bucks at Best Bait-n-Switch (unfortunately, the only nearby place that sells KLH speakers). These things are scarcely bigger than the cheap desktop speakers that came with the last PC I bought — 7 3/8″ high by 4 5/8″ wide by 4 3/8″ deep — but with the volume cranked to about 1/3 I can hear it throughout my apartment. I imagine at 2/3 I’d meet my neighbors. I won’t try that — I’m not interested in sharing my great tunage.

I can’t believe neither my mom nor my sister wanted this receiver — honestly, every time I’ve mentioned this thing at an audio place the salesperson has asked if I was interested in selling it — but hey, my dad would have wanted me to have a kickin’ audio setup for my PCs, right? This’ll work great for Royals broadcasts over the ‘Net once baseball season starts again, but not only that, this combination kicks out the jams almost as hard as punk legends The MC5, so I’m not complaining.

I’m happy enough with the results that I think rather than replacing my dying CD changer, once my Windows Me experiments are over I’ll mount my extra 15-gig drive somewhere on my LAN and put my Plextor Ultraplex CD-ROM drive to work ripping my entire CD collection, which I’ll then encode at 320 kbps. I doubt I’ll notice much difference.
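For the curious, some back-of-envelope arithmetic (my own rough figures, assuming a decimal 15 GB and a constant 320 kbps) on how much music that spare drive will actually hold:

```python
# Back-of-envelope: how much 320 kbps MP3 fits on a 15-gig drive?
BITRATE_BPS = 320_000          # 320 kbps constant-bitrate encoding
DRIVE_BYTES = 15 * 10**9       # "15-gig" drive, decimal gigabytes

bytes_per_second = BITRATE_BPS / 8            # 40,000 bytes of audio per second
total_seconds = DRIVE_BYTES / bytes_per_second
hours = total_seconds / 3600                  # total listening time in hours

album_bytes = bytes_per_second * 3600         # one 60-minute album, roughly 144 MB
albums = DRIVE_BYTES / album_bytes            # hour-long albums that fit

print(f"{hours:.0f} hours, or about {albums:.0f} hour-long albums")
```

That works out to roughly 104 hours of music, with each minute of audio eating about 2.4 MB, so even a sizable CD collection fits comfortably at that bitrate.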

If you’re like me and live with several PCs in close proximity to one another, rather than plugging an endless number of cheap desktop speakers into them, pick up an inexpensive receiver or use a castaway. You can plug a PC into any stereo input except phono, so most modern receivers should accommodate at least three PCs, and the speaker options are limited only by the receiver’s capabilities and available space. You’re likely to be much happier with such a setup than with any desktop speakers you’ll find, and a receiver plus speakers will usually cost much less than multiple pairs of any desktop speakers worth having. Just be very careful to keep your speakers away from any floppies and Zips and other magnetic media you might have. Some bookshelf speakers may be magnetically shielded, but don’t count on it.