Getting out of a sticky BIND

Setting up DNS on Linux isn’t supposed to be the easiest thing in the world. But it wasn’t supposed to be this hard either.
I installed Debian (since it’s nice and lean and mean) and BIND 9.2.1 and dutifully entered the named.conf file and the zone files. I checked their syntax with the included tools (named-checkconf and named-checkzone), and everything checked out fine. But my Windows PCs wouldn’t resolve against it.
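For reference, those checks are just one-liners; the paths below are the usual Debian locations, and the zone name and file are placeholders for your own:

named-checkconf /etc/bind/named.conf
named-checkzone example.com /etc/bind/db.example.com

Both tools come with BIND 9, and both come back quietly (or with an "OK") when the syntax is clean.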

First look: The Proliant DL320

I’ve had the opportunity the past two days to work with Compaq’s Proliant DL320, an impossibly thin 1U rack-mount server. All I can say is I’m impressed.
When I was in college, a couple of the nearby pizza joints sold oversized 20″ pizzas. The DL320 reminded me of the boxes these pizzas came in. The resemblance isn’t lost on IBM: In its early ads for a competing product, I remember IBM using an impossibly thin young female model holding a 1U server on a pizza-joint set.

HP announced last week that Compaq’s Proliant series will remain basically unchanged; it will just be rebranded under the HP name. HP had no product comparable to the DL320.

I evaluated the entry-level model. It’s a P3 1.13 GHz with 128 MB RAM, dual Intel 100-megabit NICs, and a single 40-gigabyte 7200-rpm Maxtor/Quantum IDE drive. It’s not a heavy-duty server, but it’s not designed to be. It’s designed for businesses that need to get a lot of CPU power into the smallest possible amount of rack space. And in that regard, the DL320 delivers.

Popping the hood reveals a well-designed layout. The P3 is near the front, with three small fans blowing right over it. Two more fans in the rear of the unit pull air out, and two fans in the power supply keep it cool. The unit has four DIMM sockets (one occupied). There’s room for one additional 3.5″ hard drive, and a single 64-bit PCI slot. Obvious applications for that slot include a gigabit Ethernet adapter or a high-end SCSI host adapter. The machine uses a ServerWorks chipset, augmented by a CMD 649 controller for UDMA-100 support. Compaq uses laptop-style floppy and CD-ROM drives to cram all of this into a 1U space.

The fit and finish is very good. The machine looks and feels solid, not flimsy, which is a bit surprising for a server in this price range. Looks-wise, it brings back memories of the old DEC Prioris line.

The rear of the machine has a fairly spartan selection of ports: PS/2 keyboard and mouse, two RJ-45 jacks, VGA, one serial port, and two USB ports. There’s no room for luxuries, and such things as a parallel port are questionable in this type of server anyway.

Upon initial powerup, the DL320 asks a number of questions, including what OS you want to run. Directly supported are Windows NT 4.0, Windows 2000, Novell NetWare, and Linux.

Linux installs quickly, and the 2.4.18 kernel directly supports the machine’s EtherExpress Pro/100 NICs, CMD 649 IDE controller, and ServerWorks chipset. A minimal installation of Debian 3.0 booted in 23 seconds once the machine finished POST. After I compiled and installed a kernel stripped of support for hardware the DL320 doesn’t have, that boot time dropped to 15 seconds. That’s less time than it takes for the machine to POST.

Incidentally, that custom kernel was a scant 681K in size, befitting a server with this kind of footprint.
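For anyone who hasn’t trimmed a kernel before, a 2.4-series build boils down to a handful of commands. This is the generic recipe rather than a transcript of exactly what I typed, run from the kernel source tree (usually /usr/src/linux), and it assumes an i386 build:

make menuconfig      # deselect drivers for hardware the machine doesn't have
make dep             # still required on 2.4-series kernels
make bzImage         # build the compressed kernel image
make modules
make modules_install
cp arch/i386/boot/bzImage /boot/vmlinuz-custom

Then point LILO or GRUB at the new image (and rerun lilo, if that’s what you use) and reboot.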

As configured, the DL320 is more than up to the tasks asked of low-end servers, such as user authentication, DNS and DHCP, and mail, file and print services for small workgroups. It would also make a nice applications server, since the applications only need to load once. It would also be outstanding for clustering. For Web server duty or heavier-duty mail, file and print serving, it would be a good idea to upgrade to one of the higher-end DL320s that includes SCSI.

It’s hard to find fault with the DL320. At $1300 for an IDE configuration, it’s a steal. A SCSI-equipped version will run closer to $1900.

Stand by your SCSI

The Storage Review recently ran a feature on the Seagate Barracuda 36 series, Seagate’s current economy-class SCSI drive. Like many low-end Seagate SCSI drives of the past, it is a converted ATA/IDE design. And Storage Review eats these kinds of units up, because theoretically they provide a nice way to demonstrate the difference between IDE and SCSI.
The result? The SCSI unit was actually slower than its IDE brethren in some of the tests.

The conclusion? SCSI isn’t necessarily faster than IDE.

That’s partially right. Taking the same drive mechanism and replacing the IDE circuitry with SCSI circuitry won’t result in a rockin’-fast drive. SCSI does have more overhead than IDE, so without some other changes, the drive won’t be an impressive performer.

The thing is, people don’t buy expensive SCSI controllers and then put retreaded IDE drives on them. Or at least they shouldn’t. The Barracuda 36 series is intended for people replacing SCSI drives in older equipment. Since the drive will frequently be replacing a five-year-old drive (or older), it doesn’t have to be a screamer. Anything made today will be faster than anything you can find from the mid-90s.

SCSI offers other advantages over IDE. First, with a modern host adapter (don’t call it a controller; you’ll get dirty looks) you can connect 14 devices and only use one interrupt. On today’s crowded PCs that try to be everything to everyone, that can be a real boon. Second, you have far fewer limitations on cable length. Don’t buy an IDE cable longer than 18 inches; you’re just asking for trouble. I know, I know, some of you have 36-inch IDE cables and they work fine. Trust me: Replace it with a shorty, and you’ll get fewer data errors, which means a more reliable system at the very least, and possibly a faster system as well due to fewer retransmissions. With SCSI, you can actually use the top bays in that five-foot-tall megatower you bought. Third, you can get external SCSI devices, in the event that you made the mistake of not buying that five-foot-tall megatower, or if you just like portability. This is less of a factor in these days of Firewire and USB 2.0, but it’s still a nicety you don’t get with IDE. Fourth and most importantly, SCSI devices sharing the same bus can talk at the same time. When you put two IDE drives on the same channel, one drive has to wait for the other to shut up before it can speak its piece. This limits the advantage of having multiple drives. With multiple SCSI drives, you can actually saturate all that bandwidth you paid for.

The fifth advantage of SCSI may soon fade: command queuing. SCSI drives don’t have to perform requests in the order received. If you’re constantly accessing two files at once, reading one, then writing to the other, in alternating fashion, the IDE drive will be jumping all over the place. The SCSI drive will figure out how to reorder those requests so it doesn’t have to jump around as much. IBM’s recent Deskstar drives can do command queuing as well, provided the operating system supports that mode of operation. But it’s not a common feature in IDE drives yet. This advantage usually won’t show up in benchmarks, but it’s significant. SCSI drives, to use a popular middle-management buzzword, work smarter. If you’ve got a Windows 2000 or XP system with a SCSI drive in it, try using the system while defragmenting the drive. The system will be slower, but not unusable. That’s never true of an IDE drive.

And the sixth advantage of SCSI doesn’t really have much to do with SCSI. With SCSI, you get cutting-edge technologies first. You can’t buy a 15K RPM IDE drive. You can’t even buy a 10K RPM IDE drive. There’s only one IDE drive on the market with an 8-meg cache on it. Caches that size are commonplace on contemporary SCSI drives, and the gargantuan Seagate Barracuda 180 has a 16-meg cache. It also costs as much as a nice computer all by itself, so it’s not exactly a consumer-class drive, but it’s available if you’ve got more money than patience.

Benchmarks are deceiving. Some changes will double the benchmark scores, yet the user can barely tell the difference. Other changes barely register in the benchmarks, but the user notices them. SCSI is one of the latter, especially if you multitask a lot.

It’s true that there’s no point in spending $400-$500 for a disk subsystem in a PC you use for word processing and e-mail. You’d notice a difference, but it’s not worth the extra cost. But if you’re buying a used system and have a choice between a system with IDE disks and one with SCSI disks, get the SCSI system, even if it means ponying up another 50 bucks. You’ll thank yourself for it.

As for me, I love my SCSI systems with 10K RPM drives in them. They’re wicked fast, and no louder than the IDE drives of four or five years ago. (I don’t have a current IDE drive to compare them to.) I can let my e-mail inbox fill up with thousands of messages without it dragging beyond belief, and my non-Adobe applications load in less than three seconds. Most of them load in less than a second. The drives themselves are small and expensive, but you’re buying performance, not capacity. I can’t fill up a 9-gig drive with applications anyway. Neither can most people.

So no, SCSI isn’t a magic silver bullet. But that doesn’t mean it’s not worth having.

Vintage PCs and bubblegum and Unix and Windows server crashes

Mail. Svenson wrote in, a little bit disturbed at the “vintage” label I hung on Pentium IIs this week. Here’s what he said:

What you call a Vintage PC is about what I got as a "new" box at work!

OK, it's a P2/400 but the 128Meg is not ECC and the drive is a standard 10GB 5400rpm thing. No SCSI anywhere. That is the kind of hardware being installed here.

Oh, and, BTW, it has to run Win2000.

To which I replied that my “vintage” label was at least slightly tongue-in-cheek. I’ve got a Celeron-400 here that’s still in heavy use. My P2/266 laptop doesn’t get much use anymore because my employer provided me with a P3-800 laptop late last year. There are people who call even that P3-800 passé. They’re idiots, and I have zero respect for them, but they’re out there, and unfortunately people listen to them. Today I’m hearing P2s mentioned with the same disdain 286s got in 1993 and 386s got in 1996. They’re still fine computers, and my workplace knows it: our workhorse machine is still a P2-350 or 400 with a 5400 RPM IDE drive, and that looks to remain true for another couple of years.

It’s a buyer’s market. If you know someone who needs a computer, buy one of these used P2s. They’re built much better than a $399 eMachine, and the models with SCSI drives in them will outperform the eMachine for household tasks.

Absolutely nuts. If you’re in the market for Luis Gonzalez’s bubblegum (Gonzalez is the Arizona Diamondbacks’ slugging left fielder), it’s for sale. I got a bit far out there on my baseball collectibles, but never that far.

Absolutely funny. I’m so glad that the people at Microsoft and Unisys are incompetent. They set their sights on Unix with their “We Have the Way Out” campaign. Then someone noticed the Web site was running on, uh, well, FreeBSD. I see. Unix is good enough for them, but not for the rest of us. Word got out in a hurry, and they hastily moved the site over to Windows 2000. Within hours, the site was down. And down it stayed, for two days.

See what happens when you abandon Unix in your datacenter for Windows 2000? I gotta get me some of that. I’ll charge into my boss’ boss’ office today and tell him we need to migrate our VMS and Digital Unix and Linux systems to Windows 2000. He’ll ask why, and I’ll tell him the truth:

The systems we have now work too well and I need job security.

Wehavethewayout.com is working now, but Gatermann visited it yesterday and noted its form didn’t work right in Mozilla. So I guess you can only get information on Microsoft’s way out if you’re running Internet Explorer.

Maybe these guys are smart, but they have about as much common sense as the chair I’m sitting in.

That’s just as well. If their experience is any indication (trust me, it is), they can keep their information. I’ve seen more useful information written in bathroom stalls.

Dave installs Windows XP

We needed an XP box at work for testing. Duty to do the dirty deed fell to me. So after ghosting the Windows 2000 station several of us share, I pulled out an XP CD. It installed surprisingly quickly–less than half an hour. The system is a P3-667 with 128 MB RAM and an IBM hard drive (I don’t know the model).
It found the network and had drivers for all the hardware in the box. That doesn’t happen very often with Microsoft OSs, so it was nice.

I booted into XP, to be greeted by a hillside that was just begging to be overrun by tanks, but instead of tanks, there was this humongo start menu. I right-clicked on the Start button, hit Properties, and picked Classic view. There. I had a Win95-like Start menu. While I was at it, I went back and picked small icons. I don’t like humongous Start menus.

I also don’t like training wheels and big, bubbly title bars. The system was dog slow, so I right-clicked on the desktop to see what I could find to turn off. I replaced the Windows XP theme with the Classic theme. Then I turned off that annoying fade effect.

Still, the system dragged. I went into Control Panel, System, Advanced, Performance Settings. Bingo. I could pick settings for best appearance (whose choices are certainly debatable; I guess they look good if you like bright colors and have a huge monitor) or best performance. Guess which I picked? Much better.

Next, I went into Networking. I saw some QoS thing. I did a search. It’s intended to prioritize your network traffic, and it can reserve up to 20 percent of your bandwidth to do it. Forget that. I killed it.

After I did all that stuff, XP was reasonably peppy. It logs on and off quickly. I installed Office 2000 and it worked fine. The apps loaded quickly–just a couple of seconds. That’s how it should be. If I went in and edited the shortcuts in the Start menu to turn off the splash screens, they’d load instantly.
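For the record, that trick is nothing exotic: Office has documented startup switches you can tack onto the end of each shortcut’s Target line. The two I’d use are:

winword.exe /q       (start Word without the splash screen)
excel.exe /e         (start Excel without the splash screen or a blank workbook)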

WinXP brings up a bunch of popups that I don’t like. If I wanted unexpected popup windows, I’d run a Web browser. I couldn’t quickly figure out how to disable those.

I couldn’t run Windows Update. It froze every time I tried.

I found a Windows XP tuning guide at ExtremeTech. I suspect turning off the eye candy will help more than most of the suggestions in that article. I suspect if I dug around I’d find other things. We’ll see if I get some time.

XP isn’t as bad as I expected, I guess. But I’m still not going to buy it.

This, on the other hand, is worth a second look. And a third. You can now run MS Office on Linux. No need to wait for Lindows, no need to abandon your current fave distro (at least if your fave distro is Red Hat, SuSE, Mandrake, Debian, or Caldera).

It’s 55 bucks. It’s available today. It brings Office 97/2000 and Lotus Notes r5 to your Linux desktop. Other Windows apps work, but their functionality isn’t guaranteed.

You can get some screenshots at CodeWeavers. It even makes the apps look like native Linux apps.

Spontaneous system reboots

Steve DeLassus asked me the other day what I would do to fix a PC that was rebooting itself periodically. It’s not him who’s having the problem, he says, it’s someone he knows. He must be trying to show up someone at work or on the Web or something.
So I gave him a few things I’d check, in order of likelihood.

Static electricity. A big static shock can send a system down faster than anything else I’ve seen. Keep a humidifier in the computer room to reduce static electricity. If you’re really paranoid, put a metal strip on your desk and connect it to ground (on your electrical outlet, not on your PC) and touch it before touching your PC. Some people metalize and ground part of their mouse pad. That’s a bit extreme but it works.

Power supply. This is the big one. A failing power supply can take out other components. Even an expensive, big-brand unit like a PC Power & Cooling or Enermax can fail. So I always keep a spare ATX power supply around for testing. It doesn’t have to be an expensive one; you just want something that can run the machine for a day or two to see if the problem goes away.

Overheating. Check all your fans to make sure they’re working. An overheated system can produce all sorts of weird behavior, including reboots. The computer we produced our school newspaper on back in 1996 tended to overheat and reboot about 8 hours into our marathon QuarkXPress sessions.

Memory. It’s extremely rare, but even Crucial produces the occasional defective module. And while bad memory is more likely to produce blue screens than reboots, it’s a possibility worth checking into. Download Memtest86 to exercise your memory.

CPU. If you’re overclocking and experiencing spontaneous reboots, cut it out and see what happens. Unfortunately, by the time these reboots become common, it may be too late. That turned out to be the case with that QuarkXPress-running PC I mentioned earlier. Had we replaced the fans with more powerful units right away, we might have been fine, but we ended up having to replace the CPU. (We weren’t overclocking, but this was an early Cyrix 6×86 CPU, a chip that was notorious for running hot.) Less likely today, but still possible.

Hard drive. I’m really reaching here. If you’re using a lot of virtual memory and you have bad sectors on your hard drive and the swapfile is using one or more of those bad sectors, a lot of unpredictable things can happen. A spontaneous reboot is probably the least of those. But theoretically it could happen.

Operating system. This is truly the last resort. People frequently try to run an OS that’s either too new or too old to be ideal on a PC of a particular vintage. If the system is failing but all the hardware seems to be OK, try loading the OS that was contemporary when the system was new. That means if it’s a Pentium-133, try Win95 on it. If it’s a P4, try Windows 2000 or Windows XP on it. When you try to run a five-year-old OS on a new system, or vice-versa, you can run into problems with poorly tested device drivers or a system strapped for resources.

Another good OS-related troubleshooting trick for failing hardware is to try to load Linux. Linux will often cause suspect hardware to fail, even if the hardware can run Windows successfully, because Linux pushes the hardware more than Microsoft systems do. So if the system fails to load Linux, start swapping components and try again. Once the system is capable of loading Linux successfully, it’s likely to work right in Windows too.

Troubleshooting advice: When you suspect a bad component, particularly a power supply, always swap in a known-good component, rather than trying out the suspect component in another system to see if the problem follows it. The risks of damaging the system are too great, particularly when you try a bad power supply in another system.

And, as always, you minimize the risks of these problems by buying high-quality components, but you never completely eliminate the risk. Even the best occasionally make a defective part.

A couple of quick things

Scanners in Windows 2000. While those two pompous, arrogant gits were out romping about and insulting one another, I was helping Gatermann put together an all-SCSI Windows 2000 system. I talked about that earlier this week. After much wrestling, we got the system booting and working, but his expensive Canon film scanner, which was the reason for all of this adventuring in the first place–his eclectic mix of Ultra160 and SCSI-2 and internal and external components was too much for his old card to handle–wouldn’t work under 2000. It worked fine in Win98, however. But if you’re scanning film, you’re pretty serious about your work, and 2000’s lack of stability is bad enough, while Win98’s lack of stability is enraging.
Side note: His scanner worked just fine in Linux with SANE and GIMP. The SANE driver was alpha-quality, but once he figured out the mislabeled buttons, it worked. Though flawed, it was no worse than a lot of drivers people ship for Windows, and it wasn’t any harder to set up either. Not bad, especially considering what he paid for it.

Gatermann, being a resourceful sort, did a search on Google groups and found a suggestion that he update his ASPI drivers. Since he had an Adaptec card, he could freely download and use Adaptec’s ASPI layer. He did, and the scanner started working.

It’s been a long time since I’ve had to do that to get a scanner going, but it’s been a long time since I’ve set up a SCSI scanner too.

Debian. At work on Friday, I booted the computer on my desk into Linux out of protest (more on that later… a lot more) and I figured while I was in Linux reading and responding to e-mail and keeping up with the usual news sources (I wasn’t having to do any NT administration at the time, which was why I was able to protest), I’d run apt-get update and apt-get upgrade. I run Debian Unstable at work, because Debian Unstable, though it’s considered alpha, is still every bit as stable as the stuff Mandrake and Red Hat have been pushing out the door the past 18 months. It’s also about as close to cutting-edge as I want to live on. Well, it had been a while since I did an update, and I was pleasantly surprised to find I suddenly had antialiased text in Galeon. That’s been my only gripe about Galeon until recently; the fonts looked OK, but they looked a whole lot better in Windows or on a Mac. The quality of the antialiasing still isn’t as good as in Windows, which in turn isn’t as good as on the Mac, but it’s better than none at all.

Galeon was already faster than any Windows-based browser I’d seen, but a recent Galeon build combined with the 0.99 build of Mozilla seemed even faster, and Web sites that previously didn’t render quite right (like Dan’s Data) now rendered the same way as in that big, ugly browser from that monopolist in Redmond.

I expect with these last couple of updates, I’ll be spending even more time in Linux from here on out. I already have a full-time Linux station, but I use it about half the time and my Windows 2000 station about half the time. I may limit the Windows 2000 station to video editing very soon. And with some of the cool video programs out there for Linux now, it may share time. I suspect I’ll be doing editing on the Windows box, post-production on the Linux box, and then outputting the results to tape on the Windows box.

Adventure in SCSI

Gatermann called me last night. He’d gotten his new Adaptec 19160, but Windows 2000 wouldn’t recognize it. Unfortunately, he’d reformatted his main drive too, so there was no going back and cheating by installing both his old and new SCSI cards side by side, then installing the driver, then pulling the old card and moving the drive.
We tried a couple of things over the phone. No go. I suggested he try installing Linux just to make sure the card was good.

By the time I arrived, he had a working Red Hat 7.2 configuration. Put that in your pipe and smoke it, Microsoft.

We downloaded the latest Adaptec PCI drivers, using his other Linux box. Windows 2000 didn’t like them. We downloaded the previous version. That one, unlike the other, had a dedicated Adaptec 19160 driver. We installed that and it actually worked.

Forty minutes later, we had a working all-SCSI Windows configuration.

I like Linux more and more every day.

Encryption on the cheap

Disspam cruises along. It’s not often that I gush about a program, let alone a 4.5K Perl script, but Disspam continues to make my life easier. Granted, it simply takes advantage of existing network resources, but they’re resources that were previously (to my knowledge) limited to the mail administrator. Literally half my e-mail at home today was spam. Disspam caught every last piece.
A little scripting of my own. I’ve got a client at work who wants absolute privacy guaranteed. He and his assistant have some files they don’t want anyone else to be able to read, period. Well, there’s no way to guarantee that under NT, Unix, or VMS. Under NT, we can take away anyone else’s rights to read the file, but an administrator can give himself rights to read the file once again. We can make it set off all kinds of sirens if he does it, but that security isn’t good enough.

Well, the only way we can guarantee what they want is with encryption. But we’re nervous about making files that one and only one person can read, because last year, one of our executives went on vacation in Florida, fell ill, and died. We don’t want to be in a situation where critical information that a successor would need can’t be unlocked under any circumstance. So we need to encrypt in such a fashion that two people can unlock it, but only two. So the client’s backup is his assistant, and the assistant’s backup is the client. That way, if something ever happens to one of them, the other can unlock the files.

Password-protected Zip files are inadequate, because any computer manufactured within the past couple of years is more than fast enough to break the password through brute force in minutes, if not seconds. The same goes for password-protected Word and Excel documents. Windows 2000’s encryption makes it painfully easy to lock yourself out of your own files.

So I spent some time this afternoon trying to perfect a batch file that’ll take a directory, Zip it up with Info-Zip, then encrypt it with GnuPG. I chose those two programs because they’re platform-independent and open source, so there’s likely to always be some kind of support available for them, and this way we’re not subject to the whims of companies like NAI and PKWare. We’d be willing to pay for this capability, but this combination plus a little skullwork on my part is a better solution. For one, the results are compressed and encrypted, which commercial solutions usually aren’t. Since they may sometimes transfer the encrypted package over a dialup connection, the compression is important.

Plus, it’s really nice to not have to bother with procurement and license tracking. If 40 people decide they want this, we can just give it to them.

The biggest problem I ran into was that not all of the tools I had to use interpreted long filenames properly. Life would have been much easier if Windows 2000 had move and deltree commands as well. Essentially, here’s the algorithm I came up with:

Encrypt:
Zip up Private Documents subdirectory on user’s desktop
Encrypt resulting Zip file, dump file into My Documents
Back up My Documents to a network share

Decrypt and Restore:
Decrypt Zip file
Unzip file to C:\Temp (I couldn’t get Unzip to go to %temp% properly)
Move files into Restored subdirectory on user’s desktop

I’m not presenting the batch files here yet because I’m not completely certain they work the right way every time.
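For illustration only, here’s a bare-bones sketch of what the encrypt half can look like. It assumes Info-Zip’s zip.exe and GnuPG’s gpg.exe are on the PATH, that both users’ public keys are already imported, and that the paths and e-mail addresses below are placeholders, not the real ones:

rem encrypt.bat (sketch only)
rem Zip up the Private Documents folder on the desktop; quotes protect long filenames
zip -r "%USERPROFILE%\My Documents\private.zip" "%USERPROFILE%\Desktop\Private Documents"
rem Encrypt to both the client and the assistant, so either one can decrypt it later
gpg --batch --yes -r client@example.com -r assistant@example.com --encrypt "%USERPROFILE%\My Documents\private.zip"
rem Delete the unencrypted zip; private.zip.gpg is what gets backed up to the network share
del "%USERPROFILE%\My Documents\private.zip"

The decrypt-and-restore half is just the reverse: gpg --decrypt back to a zip file, unzip it to a temp directory, and move the contents into the Restored folder.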

They don’t quite have absolute security with this setup, but that’s where NTFS encryption comes in. If these guys are going to run this script every night to back the documents up, it’s no problem if they accidentally lock themselves out of those files. If their laptops get stolen, all local copies of the documents are encrypted so the thief won’t be able to read them. And the other user will be able to decrypt the copy stored on the server or on a backup tape. Or, I can be really slick and copy their GPG keys up onto the same network drive.

This job would be much easier with Linux and shell scripts (the language is far less clunky, and file naming is far less kludgy), but I have to make do. I guess in a pinch I could install the NT version of bash and the GNU utilities to give myself a Unixish environment to run the job, but that’s a lot more junk to install for a single purpose. That goes against my anti-bloat philosophy; I don’t believe in planned obsolescence. Besides, doing that would severely limit who else could support this, and I don’t need to plant job security for myself. I always get suspicious when people do things like that.

Is Windows optimization obsolete?

I read a statement on Bob Thompson’s website about Windows optimization, where he basically told a reader not to bother trying to squeeze more speed out of his Pentium-200, to spend a few hundred bucks on a hardware upgrade instead.
That’s flawed thinking. One of the site’s more regular readers responded and mentioned my book (thanks, Clark E. Myers). I remember joking at work last week, after upgrading a hard drive in one of the servers, that I ought to put my 10,000-rpm SCSI hard drive in a Pentium-133, then go find someone. “You think your Pentium 4 is pretty hot stuff, huh? Wanna race? Let’s see who can load Word faster.” And I’d win by a large margin. For that matter, if I were a betting man I’d be willing to bet a Pentium-200 or 233 with that drive would be faster than a typical P4 for everything but encoding MP3 audio and MPEG-4 video.

Granted, I’ve just played into Thompson’s argument that a hardware upgrade is the best way to get more performance. An 18-gig 10K drive will run at least $180 at Hyper Microsystems, and the cheapest SCSI controller that will do it justice will run you $110 (don’t plug it into anything less than an Ultra Wide SCSI controller or the controller will be the bottleneck), so that’s not exactly a cheap upgrade. It might be marginally cheaper than buying a new case, motherboard, CPU and memory. Marginally. And even if you do that, you’re still stuck with a cruddy old hard drive and video card (unless the board has integrated video).

On the other hand, just a couple weekends ago I ripped out a 5400-rpm drive from a friend’s GW2K P2-350 and replaced it with a $149 Maxtor 7200-rpm IDE drive and it felt like a new computer. So you can cheaply increase a computer’s performance as well, without the pain of a new motherboard.

But I completely and totally reject the hypothesis that there’s nothing you can do in software to speed up a computer.

I was working on a computer at church on Sunday, trying to quickly burn the sermon onto CD. We’re going to start recording the sermon at the 8:00 service so that people can buy a CD after the 10:45 service if they want a copy of it. Since quality CDs can be had for a buck in quantity, we’ll probably sell discs for $2, considering the inevitable wear and tear on the drives. Today was the pilot day. The gain was set too high on the audio at 8:00, so I gave it another go at 10:45.

That computer was a Pentium 4, but that Pentium 4 made my Celeron-400 look like a pretty hot machine. I’m serious. And my Celeron-400 has a three-year-old 5400-rpm hard drive in it, and a six-year-old Diamond video card of some sort, maybe with the S3 ViRGE chipset? Whatever it is, it was one of the very first cards to advertise 3D acceleration, but the card originally sold for $149. In 1996, for 149 bucks you weren’t getting much 3D acceleration. As for its 2D performance, well, it was better than the Trident card it replaced.

There’s nothing in that Celeron-400 worth bragging about. Well, maybe the 256 megs of RAM. Except all the l337 h4xx0r5 bought 1.5 gigs of memory back in the summer when they were giving away 512-meg sticks in cereal boxes because they were cheaper than mini-frisbees and baseball cards (then they wondered why Windows wouldn’t load anymore), so 256 megs makes me look pretty lame these days. Forget I mentioned it.

So. My cruddy three-year-old Celeron-400, which was the cheapest computer on the market when I bought it, was outperforming this brand-new HP Pentium 4. Hmm.

Thompson says if there were any settings you could tweak to make Windows run faster, they’d be defaults.

Bull puckey.

Microsoft doesn’t give a rip about performance. Microsoft cares about selling operating systems. It’s in Microsoft’s best interest to sell slow operating systems. People go buy the latest and worst greatest, find it runs like a 1986 Yugo on their year-old PC, so then they go buy a Pentium 4 and Microsoft sells the operating system twice. Nice, isn’t it? After doing something like that once, people just buy a new computer when Microsoft releases a new operating system. Or, more likely, they buy a new computer every second time Microsoft releases a new operating system.

Microsoft counts on this. Intel counts on this. PC makers count on this. Best Bait-n-Switch counts on this. You should have seen those guys salivating over the Windows 95 launch. (It was pretty gross, really, and I didn’t just think that because I was running OS/2 at the time and wasn’t interested in downgrading.)

I’ve never had the privilege of working for an employer who had any money. Everywhere I’ve worked, we’ve bought equipment, then run it until it breaks, then re-treaded it and run it until it breaks again. Some of the people I work with have 486s on their desks. Not many (fortunately), but there are some. I’ve had to learn how to squeeze the last drop of performance out of some computers that never really had anything to offer in the first place. And I haven’t learned much since I started my professional career in February 1997, but I have learned one thing.

There’s a lot you can do to increase performance without changing any hardware. Even on an old Pentium.

1. First things first: Clean up that root directory. You’ve probably got dozens of backup copies of autoexec.bat and config.sys there. Get them gone. If you (or someone else) saved a bunch of stuff in the root directory, move it into C:\My Documents where it belongs. Then defrag the drive, so the computer gets rid of the phantom directory entries. You’ll think you’ve got a new computer. I know, it’s stupid. Microsoft doesn’t know how to write a decent filesystem, and that’s why that trick works. Cleaning up a crowded root directory has a bigger effect on system performance than anything else you can do. Including changing your motherboard.

2. Uninstall any ancient programs you’re not running. Defrag afterward.

3. Right-click your desktop. See that Active Desktop crap? Turn it off. You’ll think you’ve got a new computer.

4. I am not making this up. (This trick isn’t in the book. Bonus.) Double-click My Computer. Go to Tools, Folder Options. Go to Web View. Select “Use Windows Classic Folders.” This makes a huge difference.

5. Turn off the custom mouse pointers you’re using. They’re slowing you down. Terribly.

6. Download and run Ad Aware. Spyware DLLs kill your system stability and speed. If you’ve got some spyware (you never know until you run it), Ad Aware could speed you up considerably. I’ve seen it make no difference. And I’ve seen it make all the difference in the world. It won’t cost you anything to find out.

7. Remove Internet Explorer. It’s a security risk. It slows down your computer something fierce. It’s not even the best browser on the market. You’re much better off without it. Download IEradicator from 98lite.net. It’ll remove IE from Win95, 98, ME, NT, and 2K SP1 or lower. If you run Windows 2000, reinstall, then run IEradicator, then install SP2 (or SP3 if it’s out by the time you read this). Then install Mozilla, or the lightweight, Mozilla-based K-Meleon instead. Need a lightweight mail client to replace Outlook Express? Give these a look. Run Defrag after you remove IE. You won’t believe how much faster your computer runs. Trust me. An Infoworld article several years back found that removing IE sped up the OS by as much as 15 percent. That’s more than you gain by moving your CPU up one speed grade, folks.

8. Reinstall your OS. OSs accumulate a lot of gunk, and sometimes the best thing to do is to back up your My Documents folder, format your hard drive, and reinstall your OS and the current versions of the apps you use. Then do all this other stuff. Sure, it takes a while. But you’ll have to do it anyway if you upgrade your motherboard.

9. Get a utilities suite. Norton Speed Disk does a much better job of defragmenting your hard drive than Windows’ built-in tool. It’s worth the price of Norton Utilities. Good thing too, because 90% of the stuff Norton Utilities installs is crap. Speed Disk, properly run, increases your disk performance enough to make your head spin. (The tricks are in the book. Sorry, I can’t give away everything.)

10. Get my book. Hey, I had to plug it somewhere, didn’t I? There are 3,000 unsold copies sitting in a warehouse in Tennessee. (O’Reilly’s going to get mad at me for saying that, so I’ll say it again.) Since there are 3,000 unsold copies sitting in a warehouse in Tennessee, that means there are about 3,000 people who don’t need to buy a new computer and may not know it. I don’t like that. Will there be an updated version? If those 3,000 copies sell and I can go to a publisher and tell them there’s a market for this kind of book based on the 2002 sales figures for my last one, maybe. Yes, there are things that book doesn’t tell you. I just told you those things. There are plenty of things that book tells you that this doesn’t. It’s 260 pages long for a reason.

Recent Microsoft OSs are high on marketing and low on substance. If Microsoft can use your computing resources to promote Internet Explorer, MSN, or anything else, they’ll do it. Yes, Optimizing Windows is dated. Spyware wasn’t known to exist when I wrote it, for instance. Will it help? Absolutely. I stated in that book that no computer made in 1996 or later is truly obsolete. I stand by that statement, even though I wrote it nearly three years ago. Unless gaming is your thang, you can make any older PC run better, and probably make it adequate for the apps you want to run. Maybe even for the OS you want to run. And even if you have a brand-new PC, there’s a lot you can do.

Like I said, I’d rather use my crusty old Celeron-400 than that brand-new P4. It’s a pile of junk, but it’s the better computer. And that’s entirely because I was willing to spend an hour or two cleaning it up.