Is Windows optimization obsolete?

I read a statement on Bob Thompson’s website about Windows optimization, where he basically told a reader not to bother trying to squeeze more speed out of his Pentium-200, to spend a few hundred bucks on a hardware upgrade instead.
That’s flawed thinking. One of the site’s more regular readers responded and mentioned my book (thanks, Clark E. Myers). I remember talking at work after upgrading a hard drive in one of the servers last week. I said I ought to put my 10,000-rpm SCSI hard drive in a Pentium-133, then go find someone. “You think your Pentium 4 is pretty hot stuff, huh? Wanna race? Let’s see who can load Word faster.” And I’d win by a large margin. For that matter, if I were a betting man I’d be willing to bet a Pentium-200 or 233 with that drive would be faster than a typical P4 for everything but encoding MP3 audio and MP4 video.

Granted, I’ve just played into Thompson’s argument that a hardware upgrade is the best way to get more performance. An 18-gig 10K drive will run at least $180 at Hyper Microsystems, and the cheapest SCSI controller that will do it justice will run you $110 (don’t plug it into anything less than an Ultra Wide SCSI controller or the controller will be the bottleneck), so that’s not exactly a cheap upgrade. It might be marginally cheaper than buying a new case, motherboard, CPU and memory. Marginally. And even if you do that, you’re still stuck with a cruddy old hard drive and video card (unless the board has integrated video).

On the other hand, just a couple weekends ago I ripped out a 5400-rpm drive from a friend’s GW2K P2-350 and replaced it with a $149 Maxtor 7200-rpm IDE drive and it felt like a new computer. So you can cheaply increase a computer’s performance as well, without the pain of a new motherboard.

But I completely and totally reject the hypothesis that there’s nothing you can do in software to speed up a computer.

I was working on a computer at church on Sunday, trying to quickly burn the sermon onto CD. We’re going to start recording the sermon at the 8:00 service so that people can buy a CD after the 10:45 service if they want a copy of it. Since quality CDs can be had for a buck in quantity, we’ll probably sell discs for $2, considering the inevitable wear and tear on the drives. Today was the pilot day. The gain was set too high on the audio at 8:00, so I gave it another go at 10:45.

That computer was a Pentium 4, but that Pentium 4 made my Celeron-400 look like a pretty hot machine. I’m serious. And my Celeron-400 has a three-year-old 5400-rpm hard drive in it, and a six-year-old Diamond video card of some sort, maybe with the S3 ViRGE chipset? Whatever it is, it was one of the very first cards to advertise 3D acceleration, but the card originally sold for $149. In 1996, for 149 bucks you weren’t getting much 3D acceleration. As for its 2D performance, well, it was better than the Trident card it replaced.

There’s nothing in that Celeron-400 worth bragging about. Well, maybe the 256 megs of RAM. Except all the l337 h4xx0r5 bought 1.5 gigs of memory back in the summer when they were giving away 512-meg sticks in cereal boxes because they were cheaper than mini-frisbees and baseball cards (then they wondered why Windows wouldn’t load anymore), so 256 megs makes me look pretty lame these days. Forget I mentioned it.

So. My cruddy three-year-old Celeron-400, which was the cheapest computer on the market when I bought it, was outperforming this brand-new HP Pentium 4. Hmm.

Thompson says if there were any settings you could tweak to make Windows run faster, they’d be defaults.

Bull puckey.

Microsoft doesn’t give a rip about performance. Microsoft cares about selling operating systems. It’s in Microsoft’s best interest to sell slow operating systems. People go buy the latest and, er, greatest, find it runs like a 1986 Yugo on their year-old PC, so then they go buy a Pentium 4 and Microsoft sells the operating system twice. Nice, isn’t it? After doing something like that once, people just buy a new computer when Microsoft releases a new operating system. Or, more likely, they buy a new computer every second time Microsoft releases a new operating system.

Microsoft counts on this. Intel counts on this. PC makers count on this. Best Bait-n-Switch counts on this. You should have seen those guys salivating over the Windows 95 launch. (It was pretty gross, really, and I didn’t just think that because I was running OS/2 at the time and wasn’t interested in downgrading.)

I’ve never had the privilege of working for an employer who had any money. Everywhere I’ve worked, we’ve bought equipment, then run it until it breaks, then re-treaded it and run it until it breaks again. Some of the people I work with have 486s on their desks. Not many (fortunately), but there are some. I’ve had to learn how to squeeze the last drop of performance out of some computers that never really had anything to offer in the first place. I haven’t learned much in the years since I started my professional career in Feb. 1997, but I have learned one thing.

There’s a lot you can do to increase performance without changing any hardware. Even on an old Pentium.

First things first.

1. Clean up that root directory. You’ve probably got dozens of backup copies of autoexec.bat and config.sys there. Get them gone. If you (or someone else) saved a bunch of stuff in the root directory, move it into C:\My Documents where it belongs. Then defrag the drive, so the computer gets rid of the phantom directory entries. You’ll think you’ve got a new computer. I know, it’s stupid. Microsoft doesn’t know how to write a decent filesystem, and that’s why that trick works. Cleaning up a crowded root directory has a bigger effect on system performance than anything else you can do. Including changing your motherboard.
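If you’d rather not eyeball the root directory by hand, a quick script can flag the likely junk. This is purely an illustrative sketch of my own (the pattern and function name aren’t from the book, and the filename patterns are the common ones I’ve seen, not an exhaustive list):

```python
import re

# Backup copies of the boot files on Win9x tend to look like
# autoexec.bak, autoexec.001, config.syd, config.old, and so on.
# This pattern is illustrative, not exhaustive.
BACKUP_RE = re.compile(r"^(autoexec|config)\.(?!bat$|sys$)[a-z0-9]{1,3}$",
                       re.IGNORECASE)

def find_boot_file_backups(names):
    """Return the file names that look like stale backups of
    autoexec.bat or config.sys."""
    return sorted(n for n in names if BACKUP_RE.match(n))
```

Feed it a directory listing of C:\ (say, `os.listdir("C:\\")`), sanity-check what it flags before deleting, and remember it’s the defrag afterward that actually reclaims the speed.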

2. Uninstall any ancient programs you’re not running. Defrag afterward.

3. Right-click your desktop. See that Active Desktop crap? Turn it off. You’ll think you’ve got a new computer.

4. I am not making this up. (This trick isn’t in the book. Bonus.) Double-click My Computer. Go to Tools, Folder Options. Go to Web View. Select “Use Windows Classic Folders.” This makes a huge difference.

5. Turn off the custom mouse pointers you’re using. They’re slowing you down. Terribly.

6. Download and run Ad Aware. Spyware DLLs kill your system stability and speed. If you’ve got some spyware (you never know until you run it), Ad Aware could speed you up considerably. I’ve seen it make no difference. And I’ve seen it make all the difference in the world. It won’t cost you anything to find out.

7. Remove Internet Explorer. It’s a security risk. It slows down your computer something fierce. It’s not even the best browser on the market. You’re much better off without it. Download IEradicator from 98lite.net. It’ll remove IE from Win95, 98, ME, NT, and 2K SP1 or lower. If you run Windows 2000, reinstall, then run IEradicator, then install SP2 (or SP3 if it’s out by the time you read this). Then install Mozilla, or the lightweight, Mozilla-based K-Meleon instead. Need a lightweight mail client to replace Outlook Express? Give these a look. Run Defrag after you remove IE. You won’t believe how much faster your computer runs. Trust me. An Infoworld article several years back found that removing IE sped up the OS by as much as 15 percent. That’s more than you gain by moving your CPU up one speed grade, folks.

8. Reinstall your OS. OSs accumulate a lot of gunk, and sometimes the best thing to do is to back up your My Documents folder, format your hard drive, and reinstall your OS and the current versions of the apps you use. Then do all this other stuff. Sure, it takes a while. But you’ll have to do it anyway if you upgrade your motherboard.

9. Get a utilities suite. Norton Speed Disk does a much better job of defragmenting your hard drive than Windows’ built-in tool. It’s worth the price of Norton Utilities. Good thing too, because 90% of the stuff Norton Utilities installs is crap. Speed Disk, properly run, increases your disk performance enough to make your head spin. (The tricks are in the book. Sorry, I can’t give away everything.)

10. Get my book. Hey, I had to plug it somewhere, didn’t I? There are 3,000 unsold copies sitting in a warehouse in Tennessee. (O’Reilly’s going to get mad at me for saying that, so I’ll say it again.) Since there are 3,000 unsold copies sitting in a warehouse in Tennessee, that means there are about 3,000 people who don’t need to buy a new computer and may not know it. I don’t like that. Will there be an updated version? If those 3,000 copies sell and I can go to a publisher and tell them there’s a market for this kind of book based on the 2002 sales figures for my last one, maybe. Yes, there are things that book doesn’t tell you. I just told you those things. There are plenty of things that book tells you that this doesn’t. It’s 260 pages long for a reason.

Recent Microsoft OSs are high on marketing and low on substance. If Microsoft can use your computing resources to promote Internet Explorer, MSN, or anything else, they’ll do it. Yes, Optimizing Windows is dated. Spyware wasn’t known to exist when I wrote it, for instance. Will it help? Absolutely. I stated in that book that no computer made in 1996 or later is truly obsolete. I stand by that statement, even though I wrote it nearly three years ago. Unless gaming is your thang, you can make any older PC run better, and probably make it adequate for the apps you want to run. Maybe even for the OS you want to run. And even if you have a brand-new PC, there’s a lot you can do.

Like I said, I’d rather use my crusty old Celeron-400 than that brand-new P4. It’s a pile of junk, but it’s the better computer. And that’s entirely because I was willing to spend an hour or two cleaning it up.

PC slumming

Slumming. I spent a portion of the day Saturday messing around with an old 486-133. The DCE at church asked me what it would take to build an intranet. I said an old PC. So he handed me an old 486-133. I can’t shake this machine. I built this computer back in 1994 or so for a law firm. I performed several upgrades on it, including the 133 MHz upgrade (it started out as either a 33 or a 66, not sure which). Three years ago or so, when it was obsolete, the firm called me and asked me to haul it away. I asked my church if they wanted it. They did.

This 486-133 is available because it lost its old job to an old Pentium-200 I scrounged up and rebuilt. Trying to run anything more than a simple fileserver is pushing the limits of this machine. But I like pushing the limits. So I decided to see what I could do with it. I took it home and opened it up. Hmm. It had one 72-pin and four 30-pin SIMM sockets free. I tried out an old 8-meg SIMM I had. It didn’t like it. I thought I remembered seeing some old 30-pin SIMMs laying around…. I found some. I put them in. It counted to 20 megs. Nice.

I tried out a 420-meg HD I’d salvaged from somewhere or another. The system detected it as an 850. Curious. I disconnected the true-blue 850 in the box. It still detected the 420 as an 850. Mislabeled, perhaps? I’ve seen stranger things. So I started to install Linux. I was able to partition the drive, but then it emitted a click-o’-death when Debian tried to initialize the swap partition. So I did what I should have done in the first place. I took off the cover. Next time someone asks me how a hard drive works, I’ll be able to show them. So the 850 flew solo.

Then I added the last of a stash of old DEC Etherworks 3 NICs I had (one of my employer’s clients handed me a bag of them months ago and said, “Donate them to your cause.” I’ve been giving them away one by one ever since) and installed Debian 2.2. Debian installed a lot slower than it does on a Pentium.

I installed Squishdot. I found it could be tweaked to give a very professional look. I also found it horribly confusing because it’s so unlike any other content management tool I’ve used. I messed around with it for a long while, but it was slow. Really slow.

I tried some alternative kernels. No improvement to speak of. I added the noatime parameter to the root partition’s entry in /etc/fstab. That helped a little.
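For reference, the noatime change is a single word added to the options field in /etc/fstab. The device name and filesystem type below are examples; yours will differ:

```
# /etc/fstab -- stop Linux from writing an access timestamp
# on every file read (a noticeable win on a slow disk)
/dev/hda1   /   ext2   defaults,noatime,errors=remount-ro   0   1
```

A `mount -o remount /` (or a reboot) puts it into effect.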

But still, it was swapping out and the CPU was topping out as well. The homepage was taking 18 seconds to load. That’s not good.

Apache serves up static Web pages just fine–no slower than any other computer. But this dynamic stuff might just be too much.

So as a last resort, I compiled a lo-fat kernel. I took 2.2.19 and basically answered no to all but the absolute essentials. Mouse? Forget it. I was half-tempted to leave out floppy support, but that would make maintenance a bit more difficult.
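For the record, building a stripped-down 2.2-series kernel by hand went roughly like this. Version numbers and paths are from memory, so treat it as a sketch rather than a recipe:

```
make config            # answer no to everything you can live without
make dep               # 2.2.x kernels still need the dependency pass
make bzImage
make modules
make modules_install
cp arch/i386/boot/bzImage /boot/vmlinuz-2.2.19-lofat
# add an image= entry to /etc/lilo.conf, then rerun lilo
lilo
```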

It’s unfortunate that I don’t have any matching pairs of SIMMs laying around. Otherwise I’d swap the board out for a Pentium-75. I’ve got a couple of ancient Socket 5 boards laying around, and at least one Pentium-75 CPU. I’ve got two mismatched 4-meg SIMMs, but that’s asking for instability, and I’m not sure a P75 with 8 megs is any improvement over a 486-133 with 20.

Compiles take a couple of hours. I really should have just compiled a .deb package on a faster machine and moved it over. It seems hard to believe that it wasn’t terribly long ago that a 486 was a perfectly workable computer, and now it feels like a PC/XT. But the 486’s heyday was 10 years ago now. And 20 years ago, the PC/XT wasn’t on the market yet, though its direct ancestor, the IBM PC, was. So I guess it’s not too unreasonable to regard this 486-133 as the Turbo XT of today.

A philosophy of life

A week or two ago, e-mail went out from one of our VMS administrators. He wanted to reboot one of our VMS systems.
Later in the day, our other VMS administrator sent out mail. He wanted to reboot one of our systems.

My boss responded to the second message:

NT guys, please find something to reboot. The VMS guys are catching up with you.

-Middle Management

A couple days later, we had to reboot one of our NT systems. We were talking about that at lunch. I looked across the table. “We’re ahead of you today,” I said.

“You’re ahead of us most days,” he laughed. Yes, NT’s not exactly the bastion of stability, while VMS is as stable as Unix.

“It’s not enough for us to win the war. We have to win every battle,” I said, punctuating every syllable of the last two words with my fist on the table.

I was being ridiculous. We all knew I was being ridiculous.

But I was talking to one of my friends tonight about something completely unrelated, and as I struggled with what we were talking about, I realized those words, and the sentiment behind them, permeate just about everything I do.

I’m not sure that’s such a good thing.

And this surprises people because…?

Fox News seems to be surprised at how GenX raises its kids. I’m wondering why.
Let’s look at the 60s and 70s. The ’60s were the era of free love, which turned out not to be love, and actually turned out to cost a lot more than everyone thought, and really wasn’t all that fulfilling. The ’70s saw the wide availability of birth control and the legalization of abortion. Babies, wrote Strauss and Howe in Generations, were something you took pills to prevent. Kids in the movies of the time were hellions. Kids just weren’t a priority. The Boomers were trying to figure out what they wanted, in some cases having kids just in case, and going after what they wanted in no particular order.

One day when I was trying to explain it, I blurted out, “A lot of GenXers grew up with just one parent who didn’t want them. Lucky GenXers grew up with two parents who didn’t want them.”

That’s an overstatement, but not by much.

We are a reactionary generation, so we’re reacting to the way we were raised. We’re turning to religion (or, more frequently, spirituality), waiting longer to have kids, in some cases waiting longer to get married, and if we can work at home to spend more time with our families, we do it. (So when you ask me when I’m going to write another book, the answer probably is when I have a family to stay home with.) At the very least, we make an effort to be home.

Let’s go to Strauss and Howe. Keep in mind this was written in 1990:

“Economic risk-taking and cultural alienation will drive [GenXers] to seek stability in family life. First-wavers [born in the 1960s] may continue the Boom trend toward late marriage–not out of any quest for postadolescent self-discovery, but rather out of economic necessity and unwillingness to repeat the mistakes of their early-marrying, heavily divorcing Silent parents.”

In addition to this, they predicted that the USPS might come under ruinous attack by new enterprises run by GenXers. They had no idea what it would look like (they speculated it might involve computer hackers, as one of three possibilities) but it sounds to me like e-mail was just the ticket. They also predicted GenXers would change jobs a lot, and that we’d have loud, overpaid professional athletes who’d be full of themselves. They also predicted we’d be offensive (body piercing and tattoos maybe?), and…. AND they predicted an economic crisis right about now.

They also predicted a major crisis, along the same lines as World War II or at least the Cold War, sometime between 2014 and 2025.

Fox News said GenX isn’t returning to the values of its grandparents. They’re right, and we won’t. Our grandparents have a different generational profile. Look to the people born from 1982 onward to do that.

What Strauss and Howe wrote was intentionally vague, because you can only predict trends, not events, by looking at generational cycles. But they sure seem to have gotten the trends pretty much right, especially now that we can go back with hindsight and start filling in some details.

So… GenX is going to raise its families in a more traditional manner than its parents, but won’t be as traditional as its grandparents. GenX will fight this current war, and get at best grudging respect and thanks from its elders. Analysts are already saying that the generation after GenX will be the next great generation, but they’re not old enough to get us through this crisis yet.

If and when the predicted crisis of 2014 comes, GenX will have to play a role in guiding us through it. But it will need help from the generations immediately before and after. That’s not necessarily a bad thing. George Washington’s generation found itself in exactly the same situation. The secret of Washington’s success was his ability to recognize the strengths in both his elders and his youngers, as well as in his own generation, and ask for help.

If GenX can emulate Washington, history will look on us with favor, though we’re likely forever slackers in the minds of the people who saw us alive.

Back to Strauss and Howe:

“Over four centuries, Reactive generations have been assigned the thankless job of yanking American history back on a stable course–and, afterward, have gotten few rewards for their sacrifices. Will this realization prompt [GenXers] to burn out young–or will it harden a gritty self-confidence around an important generational mission?”

But our time hasn’t come. So right now we’re just trying not to repeat the mistakes we saw others make. Those of us with families are starting at home. And that’s cool.

Another entry from the Clueless Dept.

Someone else who needs to buy a clue. I normally don’t have a problem with John Dvorak, and frequently I actually like his stuff. He’s not as clueless as some people make him out to be. Dvorak’s not as smart as he thinks he is, but one thing I’ve noticed about his critics is that they usually aren’t as smart as they think they are either.
Dvorak’s most recent Modest Proposal is that we fire all the technology ignorami out there and then, essentially, throw away corporate standards, let end-users run anything they bloody well want, and basically make them administrators of their own machines.

I’ve got a real problem with that. Case in point: One of my employer’s executives recently brought in his home PC and insisted we get it running with remote access. Only one problem with that: He has Windows XP Home. XP Home’s networking is deliberately crippled, so businesses don’t try to save money by buying it. A sleazy move, but a reality we have to live with. We got it to work somewhat, but not to his satisfaction. He’s mad, but mostly because he doesn’t have any idea what changes went on under the hood in XP and doesn’t know he’s asking the impossible. But he’s perfectly competent using Word, Excel, PowerPoint and Outlook. He’s also very comfortable ripping his CDs to MP3 format–he’s got one of the largest MP3 collections in the company. He’s competent technologically. But he has no business with admin rights on his computer.

The same goes for a lot of our users. The record I’ve found for the most spyware-related files installed on a work PC is 87. These aren’t the technical ignorami who are installing this garbage. It’s the people who know how to use their stuff, but they love shareware and freeware. Maybe some of it helps them get their work done. But these people are the first to complain when their system crashes inexplicably. And I’m expected not only to keep the corporate standard apps like M$ Office running, but also to support RealPlayer, Webshots, Go!Zilla, Gator, WinAmp, RealJukebox, AOL, and other programs that run roughshod all over the system and frequently break one another (or the apps I’m supposed to support).

If the users were completely responsible for keeping their systems running, that would be one thing. But install all that stuff on one computer and try to keep it running. You won’t have enough time to do your job.

Dvorak argues that people like me should solely be concerned with keeping the network working. That’s fine, but what about when some Luddite decides to ditch all modern apps and bring in an IBM PS/2 running DOS 5.0 and compatible versions of Lotus 1-2-3 and WordPerfect and dBASE? Unless there’s already an Ethernet card in that machine, I won’t be able to network it. And the person who decides a Macintosh SE/30 running System 6.0.8 is where it’s at will have a very difficult time getting on the network and won’t be able to exchange data with anyone else either.

Those scenarios are a bit ridiculous, but I’ve had users who would have done that if they could have. And someone wanting to run XP Home absolutely is not ridiculous, nor uncommon. If my job is to network every known operating system and make those users able to work together in this anarchy, my job has just become impossible.

As much as I would love for people to use Linux in my workplace and something other than Word and Outlook, the anarchy Dvorak is proposing is completely unworkable. It’s many orders of magnitude worse than the current situation.

This is just wrong too. Yes, New Englanders, I know about heartbreak. I’m from Kansas City. At least your Red Sox have posted more than one winning record in the past 10 years.

Anyway, not only are the Royals’ glory years over, they’ve forgotten where their glory years came from. They’ve once again denied Mark Gubicza entry into their Hall of Fame. Who? In the late 1980s, Mark Gubicza was the Royals’ second-best pitcher, behind Bret Saberhagen. Injuries did him in the same as Saberhagen (only a little sooner) but he’s still among their career leaders in wins and strikeouts.

And after Gubicza had spent 13 seasons in a Royals uniform, the team had a chance to trade him for hard-hitting DH Chili Davis. But you don’t trade a guy who’s poured his heart and soul into the team for 13 years and stayed completely and totally loyal to it no matter how much it hurt, right? Gubicza said yes. Gubicza went to the GM and told him that if he could make the Royals a better team by trading him, to trade him.

Chili Davis hit 30 home runs for the Royals in 1997. Then he bolted for the Yankees.

Meanwhile, Gubicza went to the Angels, pitched two games for them, blew out his arm for good, and was released.

It takes a great man to tell the team he loves that the best thing he can do for them is to get traded for someone who can help the team more. That was Mark Gubicza. They don’t make ’em like him anymore.

But even more importantly, the immortal Charley Lau was once again denied entry. Who’s he? He was a journeyman catcher who spent his entire career as a backup and whose career batting average was .255, but that was because he had about zero natural ability. He was a genius with the bat, which was how he managed to hit .255. More importantly, Lau was the Royals’ hitting coach in the early 1970s. He spotted some skinny guy who was playing third base because Paul Schaal couldn’t play third base on artificial turf and their first choice to replace him, Frank White, couldn’t play third base at all. This skinny blond fielded just fine, but he was hitting terribly. Lau asked him what he was doing over the All-Star break. The kid said he was going fishing with Buck Martinez. Lau put his foot down. He told him he was going to stay in Kansas City and learn how to hit.

“He changed my stance. I had been standing up there like Carl Yastrzemski, but the next thing I knew I looked like Joe Rudi,” the kid recalled. But he started hitting. By the end of the year, he’d pulled his average up to a very respectable .282.

Soon Lau had every player on the Royals standing at the plate like Joe Rudi, and taking the top hand off the bat after contact with the ball. And the Royals created a mini-dynasty in the American League Western Division.

What was the name of that kid, anyway?

George Brett.

If it hadn’t been for Charley Lau, George Brett would have been nothing. The Royals probably would have never won anything. And they probably wouldn’t be in Kansas City anymore either. Who puts up with 30 years of losing, besides Cubs fans?

Charley Lau belongs in their Hall of Fame. Even if nobody besides George Brett and me remembers who he was.

All in no particular order…

U2. I couldn’t help but notice during U2’s halftime performance yesterday how much Bono has aged. Now, granted, he’s in his early 40s, so he’s not going to look 22 anymore, but last night he looked older than that to me. His voice didn’t seem terribly strong either, but that’s something he’s battled for more than 20 years. During their famous Sarajevo gig in 1997, Edge had to sing a few numbers (including Sunday Bloody Sunday) because Bono had lost his voice.
Above all else, it was a show. The band showed up on stage, sans Bono. He was walking through the crowd. They played one obvious song (Beautiful Day), then in a flash of showmanship, projected the names of 9/11 victims as they played an obscure song off The Unforgettable Fire, the haunting MLK (one of two tributes to Martin Luther King Jr. on that album) before segueing into Where the Streets Have No Name, with a few improvised lyrics (including a chorus from All You Need is Love, a nod to Paul McCartney).

Very typical U2. U2 fans undoubtedly loved it or at least enjoyed it; not-so-big fans probably weren’t so impressed (they sounded worse than, for instance, Mariah Carey, but a musician I work with is convinced she was lip-syncing) and U2 haters probably found something else to hate. I was impressed that they didn’t sell out by playing three songs off their current album. They played a hit from a year ago, then they played an obscure song, then they played a minor hit from 15 years ago, but it wasn’t one of the two huge hits off that album.

Heartbreak. That was what the game itself was. The Rams didn’t show up to play for the first three quarters. I have to wonder how badly Warner was hurting, because he definitely didn’t look 100% (and if I can notice a difference, there definitely is one). I have to wonder what if he hadn’t taken those hits late in the game three weeks ago against Green Bay…?

Security. I see from this story that Linux is less secure than Windows, based on counting reports at SecurityFocus.

SecurityFocus reported a total of 96 Linux vulnerabilities, versus 42 Windows NT/2000 vulnerabilities (24 for Windows 2000 and 18 for NT 4.0). Buried deeper in the article, you see that Mandrake Linux 7.2 notched up 33 vulnerabilities, Red Hat 7.0 suffered 28, Mandrake 7.1 had 27 and Debian 2.2 had 26.

So, first things first: James Middleton’s arithmetic doesn’t add up. Those four distribution counts alone sum to 114, not 96.

Now, math aside, those 26 Debian vulnerabilities were in all likelihood present in all the other distributions. So there’s a lot of triple- or even quadruple-counting here.
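To put numbers on the double counting: because the distributions ship mostly the same packages, summing per-distribution counts inflates the total. A toy illustration (the package names and counts here are invented for the example, not SecurityFocus data):

```python
# Hypothetical per-distribution vulnerability lists. The overlap is
# the point; the specific packages are made up.
mandrake_72 = {"bind", "wu-ftpd", "lpr", "glibc"}
redhat_70 = {"bind", "wu-ftpd", "lpr"}
debian_22 = {"bind", "wu-ftpd", "glibc"}

# Naive sum counts shared flaws three times over.
per_distro_sum = len(mandrake_72) + len(redhat_70) + len(debian_22)

# The union counts each distinct flaw once.
unique_flaws = len(mandrake_72 | redhat_70 | debian_22)

print(per_distro_sum, unique_flaws)  # 10 4
```

Count the union instead of the sum and the gap between the headline figure and the per-distro figures stops being mysterious.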

I remember a good number of those Linux vulnerabilities. Some of them were buffer overflows in utilities that would be difficult or impossible to exploit without shell access to the machine. Some of them were in daemons (services) that may or may not be running at any given time. Very few were in the kernel itself. Bottom line is, a typical Linux-based Web server sitting behind a firewall with only port 80 exposed probably didn’t have anything to worry about. The same goes for a typical Linux-based Samba server.

This isn’t like Windows, where you get the components Microsoft deems necessary, whether you want them or not, and you fear removing or disabling them because you don’t know what else will break and have no way of knowing. With Mandrake, you’ll get some services you don’t want, but you can disable them without breaking stuff. Red Hat has reformed and installs surprisingly little in its minimum installation these days. Debian installs even less.

So, the dirty little secret this article didn’t tell you: Not all the security problems affected any given Linux server. Chances are most of the security flaws affected any given Windows server.

I hate it when technology journalists blindly spit out numbers without having a clue what they mean.

I may publish again. I was mad enough to fire off a proposal to one of my former editors to see if he’d be interested in a few magazine articles. It’s time there was some stuff out there written by someone who has a clue what he’s talking about.

Useful link. For once I saw a banner ad that halfway interested me today. At LowerMyBills.com you can compare different utilities services available to you. Long-distance rates include both the interstate and intrastate rate (important if you’re like me and rarely call out-of-state). Alas, they don’t list local phone service providers, and their high-speed Internet listings aren’t complete, but it’s better than nothing. They also do listings for loans and debt relief, neither of which I need right now.

If the site’s useful to you, you’ll know.

Off to a funeral visitation

I went to a funeral visitation yesterday. One of my colleagues lost his dad early Friday morning. Visitation was yesterday afternoon, so I made a point to drive up there.
His dad was 80, a pillar of the community, and from the number of people there, pretty obviously a lot of people are going to miss him. That’s a good feeling. I know. I lost my dad more than seven years ago.

Alan asked me a couple of questions. “I thought I’d heard you lost your dad already.”

“Yep. In 1994.”

“I miss him. I’ll bet you still miss yours, don’t you?”

“I do.”

“And I’ll bet it comes in spurts?”

“It does.”

I didn’t have anything profound to say to Alan. He could tell I understood. I think. I hope.

“Don’t stuff your emotions. That’s the worst thing you can do. You’ll miss him, so miss him. And,” I looked over in the direction of his dad and at the crucifix sitting in the coffin, “You’ll see him again.”

How Linux could own the education market

How Linux could own the education market. I spent some time yesterday evening working on computers. The two were a study in contrasts: One, a brand-spankin’ new 1 GHz AMD Duron system with 512MB of RAM and 80 GB of 7200-rpm storage (IDE, unfortunately–but for $800, what do you want?). The other was an elderly AST 486SX/25 running Windows 3.1 belonging to a local teacher who goes to my church.
She teaches kindergarten, and the AST used to be her home computer. When she bought a Compaq Presario a couple of years ago, she took the AST to school. It’s more useful there than in her basement, and there’d be no computer in her classroom if it weren’t for that.

I don’t understand why that is. As much as my sister jokes about it, we don’t exactly live in the ghetto. The school district has money, but it isn’t spending it on computers. Whether that’s a good or bad thing depends on your point of view. The majority of people living in Oakville probably own home computers, so this probably isn’t contributing to the technology gap. But I wonder sometimes how things might have been if I’d been exposed to computers a few years earlier.

I was shocked at how much I remembered about Windows 3.1. And I was able to figure out how to get her CD-ROM drive to play music CDs. Don’t ask me how; this was the first time I’d messed with Windows 3.1 since 1994 and I’d prefer it stay that way–I was so impressed by Windows 3.1 that I’m one of the 12 people who actually went out and paid money for OS/2. I own actual, retail-box copies of OS/2 2.1, 3.0, and 4.0. And I remember distinctly thinking that her computer had enough memory to run OS/2 at least as well as it runs Windows 3.1…

I also remember distinctly thinking that my employer pays someone $15 a pound to haul better computers than hers away several times a year. We regard 486s as junk; low-end Pentiums may also go out, depending on whether the right person finds out about them beforehand. Usually they work just fine–the problem isn’t the computers, it’s people trying to run Internet Exploiter 6 and Office 2000 on them. They’d run Windows 95 and Office 95 perfectly fine.

But a lot of times we can’t give these old computers away because the licenses for the software that originally came with them are long gone. Old computers are useless without software, so no one would want them anyway.

Now, let me tell you something about kids. Kids don’t care much about the computers they use. As long as there’s software on them, they’ll use them. When I was a kid 20 years ago, I used Radio Shack TRS-80 computers at school. The next year, my family moved, and my new school had Commodore 64s. I couldn’t tell much difference. My next-door neighbor had a Radio Shack Color Computer. They were computers. The Commodores had better graphics, but from a usability standpoint, the biggest difference was where the cartridge slot was so you could change programs. Later on I took a summer class at the local junior college, learning about Apple IIs and IBM PCs. I adjusted smoothly. So did all the other kids in the class. Software was software.

Kids don’t care if the computer they’re using runs Windows or Mac OS or Linux. All they care about is whether there are cool programs to run.

So businesses throw these computers away, or give them to schools so they don’t have to pay someone to haul them away. And schools generally don’t know what to do with obsolete computers that lack software.

Linux won’t run fabulously on old 486s, but Debian with a lightweight window manager like IceWM will run OK. (Let’s face it, Windows 3.1 doesn’t run fabulously on them either–it crashes if you breathe wrong.) I know of a project to clone Oregon Trail on Linux. Great start. How about Sea Route to India? I remember playing that on C-64s at school. It may have been a type-in out of a magazine–I don’t remember where exactly it came from. In these violent times, Artillery might be too controversial, but it taught us early on about angles and forces. Artillery was an ancestor to games like Scorched Earth, but without the heavy-duty nukes. Close wasn’t good enough to win in Artillery. You had to be exact. And no blowing up the mountains between you and your opponents either. You had to figure out how to get over them.
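The physics Artillery drilled into us boils down to basic projectile motion. Here’s a minimal Python sketch of the flat-ground case (no mountains and no wind, which are simplifying assumptions the real game didn’t make):

```python
import math

def landing_distance(velocity, angle_deg, g=9.8):
    """Horizontal distance a shot travels on flat ground.

    Standard range formula: v^2 * sin(2*theta) / g.
    """
    angle = math.radians(angle_deg)
    return velocity ** 2 * math.sin(2 * angle) / g

def is_hit(velocity, angle_deg, target_distance, tolerance=1.0):
    """Close isn't good enough; the shot must land within tolerance."""
    return abs(landing_distance(velocity, angle_deg) - target_distance) <= tolerance
```

Even this stripped-down version teaches the lesson the game did: 45 degrees gives maximum range, and small changes in angle or velocity move the shot a long way.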

But what about doing homework? By the time I was in the sixth grade, they were teaching us how to use word processors and databases and spreadsheets. AbiWord is a fabulous lightweight word processor. It gives you fonts and spell-checking and good page formatting. (I learned word processing on Bank Street Writer. AbiWord is a far, far cry from that. Frankly, I’d rather write a paper with vi than with Bank Street Writer.) Besides being feature-rich, AbiWord’s been lightning fast on every computer I’ve tried it on. Gnumeric is a nice, fast, capable spreadsheet. I don’t know of a free-form database, but I haven’t looked for one lately either. (I don’t think we need to be trying to teach our 6th graders SQL.)

But what about for younger kids? I remember a program called The Factory. The object was to combine chemicals to make monsters. Different chemicals made different monsters. I seem to remember you played around to see what chemicals would make which heads and torsos and arms. Then the computer started showing you monsters and you had to figure out what chemicals to give it to match them. I also remember a program called Snooper Troops. I don’t remember much else about it, other than it was a mystery and you went around looking for clues, and one of my classmates accidentally formatted the disk one day before any of us had managed to solve it. We couldn’t get the disk replaced, because it was out of print.

And Spinnaker had all sorts of simple titles for younger kids that let them tell stories and other stuff. It seemed cool at the time. But that was almost 20 years ago, so about all I remember was that sailboat logo and some corny theme music.

The other thing about those old days was that the majority of these programs were written in Basic. An ambitious teacher could modify them, to make them easier or harder, or improve the graphics a little. As we got older and learned to program, some of us would try our hand at making changes. You can’t do that anymore with Windows or Macintosh educational titles. Open source can bring all that back too, provided the programs are written in languages like Perl or Python. And it can give cash-strapped schools a way to get computers where kids can use them.

Now I’m wondering what it would take to write something like The Factory in Python…
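Probably not much, honestly. Here’s a bare-bones sketch of the matching phase in Python. The chemical-to-part tables are invented, since I don’t remember the real ones:

```python
import random

# Hypothetical chemical-to-part tables; the real game's mappings are long forgotten.
HEADS = {"mercury": "bug-eyed", "sulfur": "horned", "neon": "glowing"}
TORSOS = {"iron": "armored", "slime": "blobby", "carbon": "scaly"}

def make_monster(head_chem, torso_chem):
    """Combine two chemicals into a (head, torso) monster description."""
    return (HEADS[head_chem], TORSOS[torso_chem])

def play_round(rng=random):
    """Show a monster; the player must name the chemicals that produce it."""
    head_chem = rng.choice(list(HEADS))
    torso_chem = rng.choice(list(TORSOS))
    target = make_monster(head_chem, torso_chem)
    print(f"Match this monster: {target[0]} head, {target[1]} torso")
    guess_head = input("Head chemical? ").strip().lower()
    guess_torso = input("Torso chemical? ").strip().lower()
    return (guess_head, guess_torso) == (head_chem, torso_chem)
```

And because it’s a few dozen lines of plain Python, an ambitious teacher could change the tables or add body parts, which was exactly the point about those old Basic programs.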

Ho-hum.

Another day, another Outlook worm. Tell me again why I continue to use Outlook? Not that I ever open unexpected attachments. For that matter, I rarely open expected ones–I think it’s rude. Ever heard of cut and paste? It’s bad enough that I have to keep one resource hog open to read e-mail, so why are you going to make me load another resource hog, like Word or Excel, to read a message where the formatting doesn’t matter?
The last couple of times I received Word attachments that were important, I converted them to PDFs for grins. Would you believe the PDFs were considerably smaller? I was shocked too. Chances are there was a whole lot of revision data left in those documents–and it probably included speculative stuff that underlings like me shouldn’t see. Hmm. I guess that’s another selling point for that PDF-printer we whipped up as a proof of concept a couple of weeks ago, isn’t it? I’d better see if I can get that working again. I never did get it printing from the Mac, but seeing as all the decision-makers who’d be using it for security purposes use PCs, that’s no problem.

I spent the day learning a commercial firewall program. (Nope, sorry, won’t tell you which one.) My testbed for this thing will be an old Gateway 2000 box whose factory motherboard was replaced by an Asus SP97 at some point in the past. It’s got 72 megs of RAM. I put in an Intel Etherexpress Pro NIC today. I have another Etherexpress Pro card here that I’m bringing in, so I’ll have dual EEPros in the machine. The firewall has to run under Red Hat, so I started downloading Red Hat 7.2. I learned a neat trick.

First, an old trick. Never download with a web browser. Use the command-line app wget instead. It’s faster. The syntax is really simple: wget url. Example: wget http://www.linuxiso.org/download/rh7.2-i386-disc1.iso

Second trick: Download your ISOs off linuxiso.org. It uses some kind of round-robin approach to try to give you the least busy of several mirrors. It doesn’t always work so well on the first try. The mirror it sent me to first was giving me throughput rates that topped out at 200KB/sec. but frequently dropped as low as 3KB/sec. Usually they stayed in the 15KB/sec. range. I cancelled the transfer (ctrl-c) and tried again. I got a mirror that didn’t fluctuate as wildly, but it rarely went above the 20KB/sec. range. I cancelled the transfer again and got a mirror that rarely dropped below 50KB/sec. and occasionally spiked as high as 120KB/sec. Much better.

Third trick (the one I learned today): Use wget’s -c option. That allows wget to resume transfers. Yep, you can get the most important functionality of a download manager in a 147K binary. It doesn’t spy on you either. That allowed me to switch mirrors several times without wasting the little bit I’d managed to pull off the slow sites.

Fourth trick: Verify your ISOs after you download them. LinuxISO provides MD5 sums for its wares. Just run md5sum enigma-i386-disc1.iso to get a long 32-character checksum for what you just downloaded. If it doesn’t match the checksum on the site, don’t bother burning it. It might work, but you don’t want some key archive file (like, say, the kernel) to come up corrupt. Even though CD-Rs are dirt cheap these days and high-speed burners make quick work of them, there’s still no point in unnecessarily wasting 99 cents and five minutes on the disc and half an hour on a questionable install.

As for downloading the file in separate pieces like Go!Zilla does, there’s a command-line Linux program called mget that does it, but it doesn’t follow redirection and it doesn’t do FTP except through a proxy server, so I have a hard time recommending it as a general-purpose tool. When it works, it seems to work just fine. You might try mget, but chances are decent you’ll end up falling back on wget.