Is Windows optimization obsolete?

I read a statement on Bob Thompson’s website about Windows optimization, where he basically told a reader not to bother trying to squeeze more speed out of his Pentium-200, to spend a few hundred bucks on a hardware upgrade instead.
That’s flawed thinking. One of the site’s more regular readers responded and mentioned my book (thanks, Clark E. Myers). I remember talking at work after upgrading a hard drive in one of the servers last week. I said I ought to put my 10,000-rpm SCSI hard drive in a Pentium-133, then go find someone. “You think your Pentium 4 is pretty hot stuff, huh? Wanna race? Let’s see who can load Word faster.” And I’d win by a large margin. For that matter, if I were a betting man I’d be willing to bet a Pentium-200 or 233 with that drive would be faster than a typical P4 for everything but encoding MP3 audio and MP4 video.

Granted, I’ve just played into Thompson’s argument that a hardware upgrade is the best way to get more performance. An 18-gig 10K drive will run at least $180 at Hyper Microsystems, and the cheapest SCSI controller that will do it justice will run you $110 (don’t plug it into anything less than an Ultra Wide SCSI controller or the controller will be the bottleneck), so that’s not exactly a cheap upgrade. It might be marginally cheaper than buying a new case, motherboard, CPU and memory. Marginally. And even if you do that, you’re still stuck with a cruddy old hard drive and video card (unless the board has integrated video).

On the other hand, just a couple weekends ago I ripped out a 5400-rpm drive from a friend’s GW2K P2-350 and replaced it with a $149 Maxtor 7200-rpm IDE drive and it felt like a new computer. So you can cheaply increase a computer’s performance as well, without the pain of a new motherboard.

But I completely and totally reject the hypothesis that there’s nothing you can do in software to speed up a computer.

I was working on a computer at church on Sunday, trying to quickly burn the sermon onto CD. We’re going to start recording the sermon at the 8:00 service so that people can buy a CD after the 10:45 service if they want a copy of it. Since quality CDs can be had for a buck in quantity, we’ll probably sell discs for $2, considering the inevitable wear and tear on the drives. Today was the pilot day. The gain was set too high on the audio at 8:00, so I gave it another go at 10:45.

That computer was a Pentium 4, but that Pentium 4 made my Celeron-400 look like a pretty hot machine. I’m serious. And my Celeron-400 has a three-year-old 5400-rpm hard drive in it, and a six-year-old Diamond video card of some sort, maybe with the S3 ViRGE chipset? Whatever it is, it was one of the very first cards to advertise 3D acceleration, but the card originally sold for $149. In 1996, for 149 bucks you weren’t getting much 3D acceleration. As for its 2D performance, well, it was better than the Trident card it replaced.

There’s nothing in that Celeron-400 worth bragging about. Well, maybe the 256 megs of RAM. Except all the l337 h4xx0r5 bought 1.5 gigs of memory back in the summer when they were giving away 512-meg sticks in cereal boxes because they were cheaper than mini-frisbees and baseball cards (then they wondered why Windows wouldn’t load anymore), so 256 megs makes me look pretty lame these days. Forget I mentioned it.

So. My cruddy three-year-old Celeron-400, which was the cheapest computer on the market when I bought it, was outperforming this brand-new HP Pentium 4. Hmm.

Thompson says if there were any settings you could tweak to make Windows run faster, they’d be defaults.

Bull puckey.

Microsoft doesn’t give a rip about performance. Microsoft cares about selling operating systems. It’s in Microsoft’s best interest to sell slow operating systems. People go buy the latest and greatest, find it runs like a 1986 Yugo on their year-old PC, so then they go buy a Pentium 4 and Microsoft sells the operating system twice. Nice, isn’t it? After doing something like that once, people just buy a new computer when Microsoft releases a new operating system. Or, more likely, they buy a new computer every second time Microsoft releases a new operating system.

Microsoft counts on this. Intel counts on this. PC makers count on this. Best Bait-n-Switch counts on this. You should have seen those guys salivating over the Windows 95 launch. (It was pretty gross, really, and I didn’t just think that because I was running OS/2 at the time and wasn’t interested in downgrading.)

I’ve never had the privilege of working for an employer who had any money. Everywhere I’ve worked, we’ve bought equipment, then run it until it breaks, then re-treaded it and run it until it breaks again. Some of the people I work with have 486s on their desks. Not many (fortunately), but there are some. I’ve had to learn how to squeeze the last drop of performance out of some computers that never really had anything to offer in the first place. And I may not have learned much since I started my professional career in Feb. 1997, but I have learned one thing.

There’s a lot you can do to increase performance without changing any hardware. Even on an old Pentium.

1. First things first. Clean up that root directory. You’ve probably got dozens of backup copies of autoexec.bat and config.sys there. Get them gone. If you (or someone else) saved a bunch of stuff in the root directory, move it into C:\My Documents where it belongs. Then defrag the drive, so the computer gets rid of the phantom directory entries. You’ll think you’ve got a new computer. I know, it’s stupid. Microsoft doesn’t know how to write a decent filesystem, and that’s why that trick works. Cleaning up a crowded root directory has a bigger effect on system performance than anything else you can do. Including changing your motherboard.

2. Uninstall any ancient programs you’re not running. Defrag afterward.

3. Right-click your desktop. See that Active Desktop crap? Turn it off. You’ll think you’ve got a new computer.

4. I am not making this up. (This trick isn’t in the book. Bonus.) Double-click My Computer. Go to Tools, Folder Options. Go to Web View. Select “Use Windows Classic Folders.” This makes a huge difference.

5. Turn off the custom mouse pointers you’re using. They’re slowing you down. Terribly.

6. Download and run Ad Aware. Spyware DLLs kill your system stability and speed. If you’ve got some spyware (you never know until you run it), Ad Aware could speed you up considerably. I’ve seen it make no difference. And I’ve seen it make all the difference in the world. It won’t cost you anything to find out.

7. Remove Internet Explorer. It’s a security risk. It slows down your computer something fierce. It’s not even the best browser on the market. You’re much better off without it. Download IEradicator from 98lite.net. It’ll remove IE from Win95, 98, ME, NT, and 2K SP1 or lower. If you run Windows 2000, reinstall, then run IEradicator, then install SP2 (or SP3 if it’s out by the time you read this). Then install Mozilla, or the lightweight, Mozilla-based K-Meleon instead. Need a lightweight mail client to replace Outlook Express? Give these a look. Run Defrag after you remove IE. You won’t believe how much faster your computer runs. Trust me. An Infoworld article several years back found that removing IE sped up the OS by as much as 15 percent. That’s more than you gain by moving your CPU up one speed grade, folks.

8. Reinstall your OS. OSs accumulate a lot of gunk, and sometimes the best thing to do is to back up your My Documents folder, format your hard drive, and reinstall your OS and the current versions of the apps you use. Then do all this other stuff. Sure, it takes a while. But you’ll have to do it anyway if you upgrade your motherboard.

9. Get a utilities suite. Norton Speed Disk does a much better job of defragmenting your hard drive than Windows’ built-in tool. It’s worth the price of Norton Utilities. Good thing too, because 90% of the stuff Norton Utilities installs is crap. Speed Disk, properly run, increases your disk performance enough to make your head spin. (The tricks are in the book. Sorry, I can’t give away everything.)

10. Get my book. Hey, I had to plug it somewhere, didn’t I? There are 3,000 unsold copies sitting in a warehouse in Tennessee. (O’Reilly’s going to get mad at me for saying that, so I’ll say it again.) Since there are 3,000 unsold copies sitting in a warehouse in Tennessee, that means there are about 3,000 people who don’t need to buy a new computer and may not know it. I don’t like that. Will there be an updated version? If those 3,000 copies sell and I can go to a publisher and tell them there’s a market for this kind of book based on the 2002 sales figures for my last one, maybe. Yes, there are things that book doesn’t tell you. I just told you those things. There are plenty of things that book tells you that this doesn’t. It’s 260 pages long for a reason.

Recent Microsoft OSs are high on marketing and low on substance. If Microsoft can use your computing resources to promote Internet Explorer, MSN, or anything else, they’ll do it. Yes, Optimizing Windows is dated. Spyware wasn’t known to exist when I wrote it, for instance. Will it help? Absolutely. I stated in that book that no computer made in 1996 or later is truly obsolete. I stand by that statement, even though I wrote it nearly three years ago. Unless gaming is your thang, you can make any older PC run better, and probably make it adequate for the apps you want to run. Maybe even for the OS you want to run. And even if you have a brand-new PC, there’s a lot you can do.

Like I said, I’d rather use my crusty old Celeron-400 than that brand-new P4. It’s a pile of junk, but it’s the better computer. And that’s entirely because I was willing to spend an hour or two cleaning it up.

Editing my second video…

You know it’s a different kind of church when you see one making music videos. You’re probably not too surprised to hear that’s the kind of church I go to. And you’re probably not too surprised to hear I’m involved.
I spent a healthy chunk of time Monday editing video. A local radio personality recorded a version of “Mary Did You Know?” a few years back. I know, that doesn’t sound good, but his version is pretty powerful. I’ve heard several versions of it, and I think I like his best, and I’m not just saying that because I know people who know him. I’m also not just saying that because he gave us permission to use the recording. If that version wasn’t good, I’d have assembled a band to re-record it–one of the guys in my Bible study group plays guitar, and another one of them plays drums and has a recording studio in his basement.

So anyway, I’ve got a song I can legally use, and we secured permission to use a couple of different movies about Jesus so we’d have some footage to put to the video. And I gave myself a crash course in Premiere. Put the emphasis on “crash,” because I did bluescreen Windows 2000 at one point. I muttered something about toy operating systems and got back to work. I hope Adobe eventually gets a clue about Linux–there’s plenty of proprietary, high-end video stuff out there for Linux, but nothing in the prosumer arena yet. And I do believe that if you build it, they will come.

After too many hours, I had something halfway workable. Since I was dealing with professional footage, I had a giant headstart. My partner in crime, Brad, had written up an outline that I more or less followed. There were one or two minor points where I didn’t agree with him about where the video fit, so I changed them, but I’d say I went with his outline 75% of the time, if not much more.

So I called Brad and asked him if he wanted to come over. I figured out how to get my DV500 to output to my ancient Commodore composite monitor, which was a good thing. The video was showing up much too dark on my computer screen, but when I exported to NTSC it was beautiful. I’d been playing with levels trying to get it right; I ended up just undoing all of the changes.

What I had can’t be considered finished product; the transitions are pretty lame where there are any at all, and I had a couple of gaps where I didn’t have any video that fit so I threw in a Rembrandt painting. Then I noticed that it didn’t matter what you did to the color on a Rembrandt painting; it still looked far better than any video I’ve ever seen, so I went looking for other Rembrandt paintings to put in. So the video was substantially done, but there’ll be minor changes.

It blew Brad away. I’ll admit, I learned from our first video, so the big mistakes that were in the first video aren’t in this one. And Premiere has great tools to help you avoid those mistakes–you can set the timeline to show every single frame in the video, and to show the waveform of the audio, which takes the guesswork out of transitions and lining things up.

At the end of it, Brad turned to me. “Dave, you are an artist. Do you know that?”

I’m not so sure about that one. Brad’s my ideas man. He tells me what he sees in his head, then I try to find a way to somehow put it up on the screen. And every once in a while I’ll get a better idea. Those are usually 3-4 seconds long. So then I revert back to his. And the result is something that looks decent. Plus a number of the things that happened were just accidents. I had some video of Jesus and the disciples walking through a field with some sheep in the background. I threw it in for lack of anything else to put there. Then about the 10th time I’d played through that sequence–you do a lot of playback during editing–I noticed that during the line “Did you know that your baby boy was Heaven’s perfect lamb?” Jesus happened to look down–towards a lamb walking past. I’d be pretty impressed if someone else put that subtle detail in there. But this was an accident. Or, more likely, it was God doing me a favor.

It’s been a lot of work, but a lot of fun.

And, incidentally, if you ever find yourself having to do any video production, Premiere 6 is an excellent product. I really dislike Adobe as a company, and I wish there were a better product out there than Premiere 6, but I sure haven’t found it. At $250, the Pinnacle DV200 bundled with Premiere 6 is a steal. If you’re into home movies and already have a camcorder with a firewire port (or are considering one), a DV200 and a little time will give you the snazziest home movies on the block.

Sorcerer, meet Squid. Squid, meet Sorcerer.

I didn’t feel all that well last night. Not sure if I’m coming down with something, or if it’s something else. I’ve actually felt a little weird for the last couple of days, so I’ve been sucking down zinc lozenges, and I remembered Steve DeLassus’ advice the last time I got sick: swallow a raw garlic clove. I felt fine the next day. So guess what I had for breakfast this morning? That’ll solve the problem of anyone wanting to come near me all day…
I napped a good part of the evening, but I got a little work done. I finally got the guts to raise my hand in the Sorcerer mailing list and ask if anyone else was having problems compiling XFree86. Turns out there was a bug. So now I don’t feel so stupid. It took a couple of hours to compile, and at first I configured it wrong, but now I’ve got a usable GUI.

I also installed Squid on the Sorcerer box. There isn’t a spell for Squid yet, and I’m not positive I can write it (it requires adding users and doinking with configuration files, and editing configuration files automatically goes a little beyond my Unix lack-of-expertise), but I may give it a try. One thing that annoys me about Squid: It uses really lame compiler options, and it ignores the system default options. I need to learn the syntax of make files so I can try to override that. The main reason to run Squid is for performance, so who wouldn’t want a Squid compiled to wring every ounce of performance it can out of the CPU?
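If it comes to that, GNU make will take variable overrides right on the command line, and autoconf-based packages like Squid will generally honor a CFLAGS setting at configure time. I haven’t verified this against Squid’s own makefiles, so take it as a sketch rather than a recipe:

# at build time (the flags are just an example for a P6-class CPU)
CFLAGS="-O2 -march=i686" ./configure --prefix=/usr/local/squid
make

# or override at make time; command-line variables beat the makefile's own
make CFLAGS="-O2 -march=i686"

Whether the result is measurably faster is another question, but at least you’re not stuck with the lame defaults.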

But at any rate, I installed it, and did minimal–and I mean minimal–configuration: adding a user “squid” and setting it to run as that user, changing ownership of its directory hierarchy, opening it up to the world (I’m behind a firewall), running squid -NCd1, and putting a really lame script in /etc/rc3.d. Here’s the script:

#!/bin/sh
echo "Starting Squid..."
/usr/local/squid/bin/squid

See? Told you it was lame.
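For completeness, the setup itself boiled down to something like this. I’m reconstructing from memory, and the paths assume Squid’s default /usr/local/squid prefix, so treat it as a sketch:

# create an unprivileged user for Squid to run as
useradd -d /usr/local/squid -s /bin/false squid
# hand Squid's directory tree over to that user
chown -R squid /usr/local/squid
# in /usr/local/squid/etc/squid.conf, set cache_effective_user squid
# and open up http_access for your local network
# then create the cache directories and do a first run in the
# foreground with debugging turned on, to see what breaks
/usr/local/squid/bin/squid -z
/usr/local/squid/bin/squid -NCd1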

Performance? It smokes. There are a few sites that Squid seems to slow down no matter what, but www.kcstar.com absolutely rips now, so I can get my Royals updates faster.

It makes sense. My Squid boxes have previously been TurboLinux boxes, which are nice, minimalist systems, but they’re designed for portability. In other words, they’re still 386-optimized. Plus, they’re running the 2.2 kernel and ext2. This one’s running 2.4.9, disk formatted reiserfs, with everything optimized for i686.

Building 98 boxes

I knuckled down yesterday at work and started building a new laptop image for some deployed users. What they’re using now isn’t stable and it isn’t fast, and much of the software is dated. So rather than patch yet again, we’re starting over. I built a 98 install, leaving out anything I could (such as Drive Converter, since we’re already using FAT32 over my protests, and Disk Compression, which isn’t compatible with FAT32 and I just know it’s only a matter of time before some end user decides he’s too short on disk space and runs it only to be greeted by a PC that won’t boot).
Law #1: The more you install, the slower the system runs, and no amount of disk or registry optimization will completely make up for that.

After I got a decent 98 install down, I did some cleanup. All the .txt files in the Windows directory? Gone. All the BMP files? See ya. Channel Screen Saver? B’bye. I got the C:\Windows directory down under 150 entries without losing any functionality. There are probably some GIF and JPEG files in there, and some WAVs possibly, that can also go. I’ll have to check. And of course I did my standard MSDOS.SYS tweaks.
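(In case you’re wondering, the MSDOS.SYS tweaks are the usual [Options] entries. This is a sketch of the kind of thing I mean, not the definitive list. Strip the file’s hidden/system/read-only attributes before editing, and leave the block of x’s at the bottom alone, since the file has to stay over 1K:)

[Options]
; skip the clouds splash screen
Logo=0
; don't sit at the "Starting Windows" prompt (Win95; 98 mostly ignores this)
BootDelay=0
; no boot menu unless I ask for it with F8
BootMenu=0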

Then I defragmented the drive, mostly to get the directories compressed, rebooted, and timed it. 18 seconds. Not bad for a P2-300.

Next, I installed Office 2000. Once I got all that in place, Windows’ boot time ballooned to 32 seconds, which just goes to show how Microsoft apps screw around on the OS’s turf entirely too much–Office makes more changes to the OS than Internet Explorer–but the boot time is still well below what we’ve come to expect from a P2-300.

One of my coworkers had the nerve to say, “Don’t forget to run Cacheman!” Cacheman my ass. I can put vcache entries in system.ini myself, thank you very much. And I can change the file and path cache in the Registry myself, without having to use some lame program to do it. And cleaning up the directories makes a much bigger difference than those hacks do. It just doesn’t make you feel l33t or anything. Heaven forbid we should ever do anything simple and effective to improve system performance.
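For the curious: the vcache entries go in system.ini, and the point is to cap Win9x’s disk cache so it can’t balloon and crowd out your applications. A sketch, with values you’d tune to the machine’s RAM (sizes are in kilobytes); and if memory serves, the file and path cache settings live in the Registry under HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\FileSystem as NameCache and PathCache:

[vcache]
; cap the disk cache at 32 megs; without a cap, Win9x will
; happily eat most of the RAM on the box
MinFileCache=4096
MaxFileCache=32768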

Law #2: Most of the tweaks floating around out there on the ‘Net do little more than let you feel like you’ve done something. I condensed the useful tricks into a single book chapter. And I also told you what those tricks really do, and the side effects they have, unlike a certain multi-megabyte Web site hosted on AOL… You can do the majority of the things you need to do by practicing restraint and judiciously using just a small number of software tools.

I know how to make a fast Win98 PC. It’s not like I wrote a book about that or anything…

Oh, but how am I ensuring stability? I’m forcing the issue. Yes, I see that list of 47 software packages they have to have. Here’s Windows and Office 2000 and ACT!. Now they have to test it. Does it crash? OK. Now we’ll add the remaining 44 things, one at a time and see which one is breaking stuff. If it’s unstable by the time all of that’s done, it’s because the end users who were testing were sloppy with their testing.

Mac mice, PC data recovery

A two-button Mac mouse!? Frank McPherson asked what I would think of the multibutton/scroll wheel support in Mac OS X. Third-party multibutton mice have been supported via extensions for several years, but not officially from Ye Olde Apple. So what do I think? About stinkin’ time!

I use 3-button mice on my Windows boxes. The middle button double-clicks. Cuts down on clicks. I like it. On Unix, where the middle button brings up menus, I’d prefer a fourth button for double-clicking. Scroll wheels I don’t care about. The page up/down keys have performed that function just fine for 20 years. But some people like them; no harm done.

Data recovery. One of my users had a disk yesterday that wouldn’t read. Scandisk wouldn’t fix it. Norton Utilities 2000 wouldn’t fix it. I called in Norton Utilities 8. Its disktool.exe includes an option to revive a disk, essentially by doing a low-level format in place (presumably it reads the data, formats the cylinder, then writes the data back). That did the trick wonderfully. Run Disktool, then run NDD, then copy the contents to a fresh disk immediately.

So, if you ever run across an old DOS version of the Norton Utilities (version 7 or 8 certainly; earlier versions may be useful too), keep them! It’s something you’ll maybe need once a year. But when you need them, you need them badly. (Or someone you support does, since those in the know never rely on floppies for long-term data storage.) Recent versions of Norton Utilities for Win32 don’t include all of the old command-line utilities.

Hey, who was the genius who decided it was a good idea to cut, copy and paste files from the desktop? One of the nicest people in the world slipped up today copying a file. She hit cut instead of copy, then when she went to paste the file to the destination, she got an error message. Bye-bye file. Cut/copy-paste works fine for small files, but this was a 30-meg PowerPoint presentation. My colleague who supports her department couldn’t get the file back. I ride in on my white horse, Norton Utilities 4.0 for Windows in hand, and run Unerase off the CD. I get the file back, or so it appears. The undeleted copy won’t open. On a hunch, I hit paste. Another copy comes up. PowerPoint chokes on it too.

I tried everything. I ran PC Magazine’s Unfrag on it, which sometimes fixes problematic Office documents. No dice. I downloaded a PowerPoint recovery program. The document crashed the program. Thanks guys. Robyn never did you any harm. Now she’s out a presentation. Not that Microsoft cares, seeing as they already have the money.

I walked away wondering what would have happened if Amiga had won…

And there’s more to life than computers. There’s songwriting. After services tonight, the music director, John Scheusner, walks up and points at me. “Don’t go anywhere.” His girlfriend, Jennifer, in earshot, asks what we’re plotting. “I’m gonna play Dave the song that he wrote. You’re more than welcome to join us.”

Actually, it’s the song John and I wrote. I wrote some lyrics. John rearranged them a little (the way I wrote it, the song was too fast–imagine that, something too fast from someone used to writing punk rock) and wrote music.

I wrote the song hearing it sung like The Cars (along the lines of “Magic,” if you’re familiar with their work), but what John wrote and played sounded more like Joe Jackson. Jazzy. I thought it was great. Jennifer thought it was really great.

Then John tells me they’re playing it Sunday. They’re what!? That will be WEIRD. And after the service will be weird too, seeing as everybody knows me and nobody’s ever seen me take a lick of interest in worship music before.

I like it now, but the lyrics are nothing special, so I don’t know if I’ll like it in six months. We’ll see. Some people will think it’s the greatest thing there ever was, just because two people they know wrote it. Others will call it a crappy worship song, but hopefully they’ll give us a little credit: At least we’re producing our own crappy worship songs instead of playing someone else’s.

Then John turns to me on the way out. “Hey, you’re a writer. How do we go about copyrighting this thing?” Besides writing “Copyright 2000 by John Scheusner and Dave Farquhar” on every copy, there’s this.  That’s what the Web is for, friends.

~~~~~~~~~~

Note: I post this letter without comment, since it’s a response to a letter I wrote. My stuff is in italics. I’m not sure I totally agree with all of it, but it certainly made me think a lot and I can’t fault the logic.

From: John Klos
Subject: Re: Your letter on Jerry Pournelle’s site

Hello, Dave,

I found both your writeup and this letter interesting. Especially interesting is both your reaction and Jerry’s reaction to my initial letter, which had little to do with my server. To restate my feelings, I was disturbed about Jerry’s column because it sounded so damned unscientific, and I felt that he had a responsibility to do better. His conclusion sounded like something a salesperson would say, and in fact did sound like things I have heard from salespeople and self-promoted, wannabe geeks. I’ve heard all sorts of tales from people like this, such as the fact that computers get slower with age because the RAM wears out…

Mentioning my Amiga was simply meant to point out that not only was I talking about something that bothered me, but I am running systems that “conventional wisdom” would say are underpowered. However, based upon what both you and Jerry have replied, I suppose I should’ve explained more about my Amiga.

I have about 50 users on erika (named after a dear friend). At any one moment, there are anywhere from half a dozen to a dozen people logged on. Now, I don’t claim to know what a Microsoft Terminal Server is, nor what it does, but it sounds something like an ’80s way of Microsoft subverting telnet.

My users actually telnet (technically, they all use ssh; telnet is off), they actually do tons of work in a shell, actually use pine for email and links (a lynx successor) for browsing. I have a number of developers who do most of their development work in any of a number of languages on erika (Perl, C, C++, PHP, Python, even Fortran!).

Most of my users can be separated into two groups: geeks and novices. Novices usually want simple email or want to host their domain with a minimum of fuss; most of them actually welcome the simplicity, speed, and consistency of pine as compared to slow and buggy webmail. Who has used webmail and never typed a long letter only to have an error destroy the entire thing?

The geeks are why sixgirls.org got started. We all had a need for a place to call home, as we all have experienced the nomadic life of being a geek on the Internet with no server of our own. We drifted from ISP to ISP looking for a place where our Unix was nice, where our sysadmins listened, and where corporate interests weren’t going to yank stuff out from underneath us at any moment. Over the years, many ISPs have stopped offering shell access and generally have gotten too big for the comfort of geeks.

If Jerry were replying to this now, I could see him saying that shells are old school and that erika is perhaps not much more than a home for orphans and die-hard Unix fans. I used to think so, too, but the more novice users I add, the more convinced I am that people who have had no shell experience at all prefer the ease, speed, and consistency of the shell over a web browser type interface. They’re amazed at the speed. They’re surprised over the ability to instantly interact with others using talk and ytalk.

The point is that this is neither a stopgap nor a dead end; this IS the future.

I read your message to Jerry and it got me thinking a lot. An awful lot. First on the wisdom of using something other than what Intel calls a server, then on the wisdom of using something other than a Wintel box as a server. I probably wouldn’t shout it from the mountaintops if I were doing it, but I’ve done it myself. As an Amiga veteran (I once published an article in Amazing Computing), I smiled when I saw what you were doing with your A4000. And some people no doubt are very interested in that. I wrote some about that on my Weblogs site (address below if you’re interested).

I am a Unix Systems Administrator, and I’ve set up lots of servers. I made my decision to run everything on my Amiga based upon several criteria:

One, x86 hardware is low quality. I stress test all of the servers I build, and most x86 hardware is flawed in one way or another. Even if those flaws are so insignificant that they never affect the running of a server, I cannot help but wonder why my stress testing code will run just fine on one computer for months and will run fine on another computer for a week, but then dump a core or stop with an error. But this is quite commonplace with x86 hardware.

For example, my girlfriend’s IBM brand FreeBSD computer can run the stress testing software indefinitely while she is running the GIMP, Netscape, and all sorts of other things. This is one of the few PCs that never has any problems with this stress testing software. But most of the other servers I set up, from PIIIs, dual processor PIIIs and dual Celerons, to Cyrix 6x86 and MII, end up having a problem with my software after anywhere from a few days to a few weeks. But they all have remarkable uptimes, and none crash for any reason other than human error (like kicking the cord).

However, my Amigas and my PowerMacs can run this software indefinitely.

So although I work with x86 extensively, it’s not my ideal choice. So what else is there? There’s SPARC, MIPS, m68k, PowerPC, Alpha, StrongARM… plenty of choices.

I have a few PowerMacs and a dual processor Amiga (68060 and 200 MHz PPC 604e); however, NetBSD for PowerMacs is not yet as mature as I need it to be. For one, there is no port of MIT pthreads, which is required for MySQL. Several of my users depend on MySQL, so until that is fixed, I can’t consider using my PowerMac. Also, because of the need to boot using Open Firmware, I cannot set up my PowerMac to boot unattended. Since my machine is colocated, I would have to be able to run down to the colocation facility if anything ever happened to it. That’s fine if I’m in the city, but what happens when I’m travelling in Europe?

SPARC is nice, but expensive. If I could afford a nice UltraSPARC, I would. However, this project started as a way to have a home for geeks; coming up with a minimum of $3000 for something I didn’t even plan to charge for wasn’t an option.

Alpha seems too much like PC hardware, but I’d certainly be willing to give it a try should someone send me an old Alpha box.

With MIPS, again, the issue is price. I’ve always respected the quality of SGI hardware, so I’d definitely set one up if one were donated.

StrongARM is decent. I even researched this a bit; I can get an ATX motherboard from the UK with a 233 MHz StrongARM for about 310 quid. Not too bad.

But short of all of that, I had a nice Amiga 4000 with a 66 MHz 68060, 64-bit RAM, and wide ultra SCSI on board. Now what impresses me about this hardware is that I’ve run it constantly. When I went to New Orleans last year during the summer, I left it in the apartment, running, while the temperatures were up around 100 degrees. When I came back, it was fine. Not a complaint.

That’s the way it’s always been with all of my Amigas. I plug them in, they run; when I’m done, I turn off the monitor. So when I was considering what computer to use as a server when I’d be paying for a burstable 10 Mbps colocation, I wanted something that would be stable and consistent.

Hence Amiga.

One of my users, after reading your letter (and, I guess, Jerry’s), thought that I should mention the load average of the server; I assume this is because of the indirectly stated assumption that a 66 MHz 68060 is just squeaking by. To clarify that, a 66 MHz 68060 is faster per MHz than any Pentium by a measurable margin when using either optimised code (such as a distributed.net client) or straight compiled code (such as LAME). We get about 25,000 hits a day, for a total of about 200 megs a day, which accounts for one eighth of one percent of the CPU time. We run as a Stratum 2 time server for several hundred computers, we run POP and IMAP services, sendmail, and we’re the primary nameserver for perhaps a hundred machines. With a distributed.net client running, our load average hovers around 1.18, which means that without the dnet client, we’d be idle most of the time.

If that weren’t good enough, NetBSD 1.5 (we’re running 1.4.2) has a much improved virtual memory system (UVM), improvements and speedups in the TCP stack (and complete IPv6 support), scheduler enhancements, good softdep support in the filesystem (as if two 10k rpm 18 gig IBM wide ultra drives aren’t fast enough), and more.

In other words, things are only going to get better.

The other question you raise (sort of) is why Linux gets so much more attention than the BSD flavors. I’m still trying to figure that one out. Part of it is probably due to the existence of Red Hat and Caldera and others. FreeBSD gets some promotion from Walnut Creek/BSDi, but one only has to look at the success of Slackware to see how that compares.

It’s all hype; people love buzzwords, and so a cycle begins: people talk about Linux, companies spring up to provide Linux stuff, and people hear more and talk more about Linux.

It’s not a bad thing; anything that moves the mainstream away from Microsoft is good. However, the current trend in Linux is not good. Red Hat (the company), arguably the biggest force in popularising Linux in the US, is becoming less and less like Linux and more and more like a software company. They’re releasing unstable release after unstable release with no apologies. Something I said a little while ago, and someone has been using as his quote in his email:

In the Linux world, all of the major distributions have become companies. How much revenue would Red Hat generate if their product was flawless? How much support would they sell?

I summarise this by saying that it is no longer in their best interest to have the best product. It appears to be sufficient to have a working product they can use to “ride the wave” of popularity of Linux.

I used Linux for a long time, but ultimately I was always frustrated with the (sometimes significant) differences between the distributions, and sometimes the differences between versions of the same distribution. Why was it that an Amiga running AmigaDOS was more consistent with Apache and Samba docs than any particular Linux? Where was Linux sticking all of these config files, and why wasn’t there documentation saying where the stuff was and why?

When I first started using BSD, I fell in love with its consistency, its no-bull attitude towards ports and packages, and its professional and clean feel. Needless to say, I don’t do much Linux anymore.

It may well be due to the people involved. Linus Torvalds is a likeable guy, a smart guy, easily identifiable by a largely computer illiterate press as an anti-Gates. And he looks the part. Bob Young is loud and flamboyant. Caldera’s the company that sued Microsoft and probably would have won if it hadn’t settled out of court. Richard Stallman torques a lot of people off, but he’s very good at getting himself heard, and the GPL seems designed at least in part to attract attention. The BSD license is more free than the GPL, but while freedom is one of Stallman’s goals, clearly getting attention for his movement is another, and in that regard Stallman succeeds much more than the BSD camp. The BSD license may be too free for its own good.

Yes, there aren’t many “figureheads” for BSD; most of the ones I know of don’t complain about Linux, whereas Linux people often do complain about the BSD folks (the major complaint being the license).

I know Jerry pays more attention to Linux than the BSDs partly because Linux has a bigger audience, but he certainly knows more about Linux than about any other Unix. Very soon after he launched his website, a couple of Linux gurus (most notably Moshe Bar, himself now a Byte columnist) started corresponding with him regularly, and they’ve made Linux a reasonably comfortable place for him, answering his questions and getting him up and going.

So then it should be their responsibility, as Linux advocates, to give Jerry a slightly more complete story, in my opinion.

As for the rest of the press, most of them pay attention to Linux only because of the aforementioned talking heads. I have a degree in journalism from supposedly the best journalism school in the free world, which gives me some insight into how the press works (or doesn’t, as is usually the case). There are computer journalists who get it, but a good deal of them are writing about computers for no reason in particular, and their previous job and their next job are likely to be writing about something else. In journalism, if three sources corroborate something, you can treat it as fact. Microsoft-sympathetic sources are rampant, wherever you are. The journalist probably has a Mac sympathy since there’s a decent chance that’s what he uses. If he uses a Windows PC, he may or may not realize it. He’s probably heard of Unix, but his chances of having three local Unix-sympathetic sources to use consistently are fairly slim. His chances of having three Unix-sympathetic sources who agree enough for him to treat what they say as fact (especially if one of his Microsofties contradicts it) are probably even more slim.

Which furthers my previous point: Jerry’s Linux friends should be more complete in their advocacy.

Media often seems to desire to cater to the lowest common denominator, but it is refreshing to see what happens when it doesn’t; I can’t stand US news on TV, but I’ll willingly watch BBC news, and will often learn more about US news than if I had watched a US news program.

But I think that part of the problem, which is compounded by the above, is that there are too many journalists that are writing about computers, rather than computer people writing about computers.

After all, which is more presumptuous: a journalist who thinks that he/she can enter the technical world of computing and write authoritatively about it, or a computer person who attempts to be a part-time journalist? I’d prefer the latter, even if it doesn’t include all of the accoutrements that come from the writings of a real journalist.

And looking at the movement as a whole, keep in mind that journalists look for stories. Let’s face it: A college student from Finland writing an operating system and giving it away and millions of people thinking it’s better than Windows is a big story. And let’s face it, RMS running around looking like John the Baptist extolling the virtues of something called Free Software is another really good story, though he’d get a lot more press if he’d talk more candidly about the rest of his life, since that might be the hook that gets the story. Can’t you see this one now?

Yes. Both of those stories would seem much more interesting than, “It’s been over three years and counting since a remote hole was found in OpenBSD”, because it’s not sensationalistic, nor is it interesting, nor can someone explain how you might end up running OpenBSD on your appliances (well, you might, but the fact that it’s secure means that it’d be as boring as telling you why your bathtub hasn’t collapsed yet).

Richard Stallman used to keep a bed in his office at the MIT Artificial Intelligence Lab.

He slept there. He used the shower down the hall. He didn’t have a home outside the office. It would have distracted him from his cause: Giving away software.

Stallman founded the Free Software movement in 1983. Regarded by many as the prophet of his movement (and looking the part, thanks to his long, unkempt hair and beard), Stallman is both one of its most highly regarded programmers and perhaps its most outspoken activist, speaking at various functions around the world.

Linux was newsworthy, thanks to the people behind it, way back in 1993 when hardly anyone was using it. Back then, they were the story. Now, they can still be the story, depending on the writer’s approach.

If there are similar stories in the BSD camp, I’m not aware of them. (I can tell you the philosophical differences between OpenBSD,  NetBSD and FreeBSD and I know a little about the BSD directory structure, but that’s where my knowledge runs up against its limits. I’d say I’m more familiar with BSD than the average computer user but that’s not saying much.) But I can tell you my editor would have absolutely eaten this up. After he or she confirmed it wasn’t fiction.

The history is a little dry; the only “juicy” part is where Berkeley had to deal with a lawsuit from AT&T (or Bell Labs; I’m not doing my research here) before they could make their source free.

Nowadays, people are interested because a major layer of Mac OS X is BSD, and is taken from the FreeBSD and NetBSD source trees. Therefore, millions of people who otherwise know nothing about BSD or its history will end up running it when Mac OS X Final comes out in January; lots of people already are running Mac OS X Beta, but chances are good that the people who bought the Beta already know it’s running on BSD.

And it’s certainly arguable that BSD is much more powerful and robust than Windows 2000. So there’s a story for you. Does that answer any of your question?

Yes; I hope I’ve clarified my issues, too.

Neat site! I’ll have to keep up on it.

Thanks,
John Klos