Stopping spam.

Forget what I wrote yesterday. I was going to post the stuff I wrote in Ohio, but then I realized it isn't all that good, it's definitely not useful, and the people who annoy me the most are the people who can't get over themselves. No one cares what I ate for breakfast, and the only people who care what went on in Ohio already know.
So here’s something useful instead. It’s the coolest thing I’ve found all year. Maybe all decade, for that matter.

Spam begone. I hate spam. It wastes my time and my bandwidth and, ultimately, my money. I’ve seen some estimates that spam costs ISPs as much as $5 per month per account. You’d better believe they’re passing those losses on to you.

There are tons and tons of anti-spam solutions out there, but most of them run on the mail server side, so to use them, an end user has to set up a mail server and either use it for mail or run fetchmail to pull the mail in from the ISP's mail servers. I've done that, but it's convoluted. And even that's trivial compared to setting up the anti-spam kits.

I was cruising along, vaguely happy, until my local mail server developed bad sectors on its hard drive. One day when I went to read my mail, I heard clunking noises. I turned around, flipped on the power switch to the server's attached monitor, and saw read errors. Hmm. I hope that mail wasn't important…

Eventually I shut down my mail server and put up with the spam, hoping I’d come up with a better idea.

I found it in a Perl script called disspam.pl, written by Mina Naguib.

It took a little doing to get it running in Debian. Theoretically it’ll run on any OS that has Perl installed. Here’s what I did in Debian:

su (to become root)
apt-get install libnet-perl (Perl couldn't see the network without this, so the next command in this sequence was failing. This may not be necessary on other distros; I don't know what the equivalent package would be.)
perl -MCPAN -e shell (as per the readme. I accepted the defaults, then when it asked for CPAN servers, I told it my continent and country. Then it gave me 48 choices. I picked a handful at random, since none were any more obviously close to me than the others.)
install Net::POP3 (as per readme)
quit
cp sample.conf disspam.conf
chmod 755 disspam.pl

Next, I loaded up disspam.conf into a text editor. It looks just like a Windows-ish INI file.

The second line gives me an exclude list. It’ll take names and e-mail addresses. So I put in a few important names that could possibly be blocked (friends with AOL and Hotmail addresses). That way if their ISPs ever misbehave and get blacklisted, their mail will still get to me. Then I popped down to the end of the file and configured my POP3 mailbox. I had an account I hadn’t read in a week, so I figured I’d get a good test. Just drop in your username, password, and POP3 server like you would for your e-mail client. If you have more than one account, copy and paste the section.
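I won't reproduce sample.conf here, and the section and key names below are my own paraphrase rather than the real thing (follow the comments in sample.conf itself), but the shape of the file is roughly this:

[exclude]
importantfriend@aol.com
otherfriend@hotmail.com

[pop3-account-1]
server = pop.example.net
username = yourname
password = yourpassword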

Bada bing, bada boom. You're set. Run disspam.pl and watch. In my case, it flagged and deleted about a dozen messages, typical of what I usually get, like mail offering me Viagra or access to horny cheerleaders or how to find out anything about anyone (which I already know–I have a journalism degree). The only questionable thing it flagged was mail from MLB.com. I can't get off their mailing list ever since I voted online for the All-Star game. No matter; I never read that mail anyway. I could always have added MLB.com to my exclude list if what they had to say mattered to me.

But if you’re like me and get lots of mail–that was my less-busy account–and about half of it is spam, that stuff’s going to scroll by really fast. So here’s what I recommend doing: when you execute disspam.pl, use the following command line:

~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log

Then you can examine disspam.log. If disspam ever deletes something it shouldn’t have, you can add the person to your exclude list and e-mail them to ask what they wanted. It looks to be less work than deleting all that spam. Probably less embarrassing too. Have you ever accidentally opened one of those horny cheerleader e-mail messages when there were people around? Yikes!

I fired up Ximian Evolution, pulled down my mail, and had 15 new messages. No spam. None. Sweet bliss.

It’s just version 0.05 and the author considers it beta, but I love it already.

Unix’s power allows you to string simple tools together to make powerful ones. Here are some suggestions.

You can e-mail the log to yourself with these commands:

mail -s disspam [your_address] < ~/disspam/disspam.log
rm ~/disspam/disspam.log

If you want the computer to do all the work for you, here’s the command sequence:

crontab -e

Then add these entries:

0 0 * * * mail -s disspam [your_address] < ~/disspam/disspam.log; rm ~/disspam/disspam.log
0 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log

If you read your mail on the same machine that runs disspam, you can substitute your user account name for your e-mail address and save your ISP a little traffic.

You’ll have to provide explicit paths for disspam.pl and disspam.conf.

The first entry causes it to mail the log at midnight, then delete the original. The second entry filters your inbox(es) on the hour, every hour. To filter more frequently you can add more lines:


10 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log
20 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log
30 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log
40 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log
50 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log

This program shouldn't be necessary for very long. It's short and simple (4.5K worth of Perl), so there's no reason mail clients shouldn't start incorporating similar code. Until they do, you run the risk of disspam and your mail client getting out of sync and some spam slipping through. If you read your mail on a Linux box with an mbox-compliant client like Sylpheed or Balsa or KMail, you can bring fetchmail into the equation. Create a .fetchmailrc file in your home directory. Here's the format of .fetchmailrc:

poll SERVERNAME protocol PROTOCOL username NAME password PASSWORD

So here’s an example that would work for me:

poll mail.swbell.net protocol pop3 username dfarq password censored

Next, set your mail client to no longer check for mail automatically, then type crontab -e and edit your disspam lines so they read like this:

0 * * * * ~/disspam/disspam.pl ~/disspam/disspam.conf >> ~/disspam/disspam.log ; fetchmail (your server name)

In case you’re interested, the semicolon tells Unix not to execute the second command until the first one is complete. If you have more than one mail account, add another fetchmail line.
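For what it's worth, a two-account .fetchmailrc would look something like this (the second server is invented for illustration):

poll mail.swbell.net protocol pop3 username dfarq password censored
poll mail.example.net protocol pop3 username dfarq password censored

And one shortcut: run with no arguments, fetchmail polls every server listed in .fetchmailrc, so you can end each cron line with a plain ; fetchmail instead of naming servers.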

As an aside, Evolution seems to use the mbox file format but it doesn’t store its file where fetchmail will find it. I think you could symlink /var/spool/mail/yourusername to ~/evolution/local/Inbox/mbox and it would work. I haven’t tried that little trick yet.
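If you want to try it before I do, the commands would look something like this (untested, as I said; run them as root, with Evolution shut down, and adjust the paths for your username):

mv /var/spool/mail/yourusername /var/spool/mail/yourusername.bak (save the old spool file, just in case)
ln -s /home/yourusername/evolution/local/Inbox/mbox /var/spool/mail/yourusername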

But even if you’re not ambitious enough to make it run automatically and integrate with all that other stuff, it’s still a killer utility you can run manually. And for that matter, if you can get Perl running on NT or even on a Mac, this ought to run on them as well.

Check it out. It’ll save you time and aggravation. And since it only reads the headers to decide what’s spam and what’s not, it’ll save bandwidth and, ultimately, it’ll save your ISP a little cash. Not tons, but every little bit can help. You can’t expect them to pass their savings on to you, but they’ll certainly pass their increased expenses on to you. So you might as well do a little something to lower those expenses if you can. Sometimes goodwill comes back around.

Upgrading a P2-300

Case study: Revitalizing a PII-300
It took me three and a half hours one night to squeeze another year or two of useful life out of a PII-300.

A fellow member of the Board of Directors at my church, Rick, approached me one night. “Would you reinstall the OS on my computer?” he asked. He had a PII-300, not a barn burner by any modern measure, but not a slouch of a computer either. But as a performer it had been very much an underachiever of late. I had walked him through reinstalling the operating system over the phone back around Christmas, and it had solved some problems, but not everything. It appeared his computer needed a clean start.

When I looked at it, I agreed. It wasn’t particularly stable and it definitely wasn’t fast. He had a Castlewood Orb drive to facilitate quick backups, so I had him copy his data directories (named Documents and My Documents), along with his AOL directory, over to the Orb. I also spotted a directory called Drv. As an afterthought, I grabbed that one too.

I proceeded to boot off a CD-ROM-enabled boot floppy. Gingerly, I typed the magic words format c: at the command prompt. Quickly I noticed a problem: the words “Saving current bad sector map” on the screen. As the drive formatted, Rick asked the magic question. “What do you think of partitioning?”

Dirty secret #1: Any time you see bad sectors, you should absolutely FDISK the drive. Bad clusters can be caused by physical problems on the disk, but they can also be caused by corruption of the FAT. No disk utility that I've ever seen (not Scandisk, not Disk Doctor, not even SpinRite) fixes that. The only way to fix it (verified by a technician I talked to at Gibson Research, the makers of SpinRite) is to fdisk and format the drive.

Dirty secret #2: FAT16 is much faster than FAT32. Since Rick wasn’t opposed to partitioning the drive, I created a 2GB FAT16 partition. You do this by answering No when fdisk asks if you want to enable large disk support. This partition holds the operating system.

I exited FDISK, ran it again, and this time answered Y when it asked the cryptic large-disk question. I created a partition that spanned the rest of the drive. Then I rebooted, typed format c: then format d:, and watched for bad clusters. There were none. Excellent.

End result: I had a 2-gig FAT16 C drive and a 6-gig FAT32 D drive.
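Condensed, the whole dance from the boot floppy looks like this:

fdisk (answer N to large disk support; create a 2GB primary DOS partition)
fdisk (run it again; answer Y this time; create a partition spanning the rest of the drive)
format c:
format d: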

Dirty secret #3: Never, ever, ever, ever, ever (unless someone’s holding a gun to your head) install Windows as an upgrade. You have a Windows 95 CD and a Windows 98 upgrade CD? So what. Install Windows 98 on the bare drive. Setup will find no Windows installation present and ask for your Windows 95 CD. You insert your Win95 CD, it investigates it to make sure it’s not a blank CD with win.com on it somewhere, then asks for your Win98 CD back. End result: a clean install. Even if you install Win95 immediately followed by Win98, you get extra garbage you don’t need. And it takes twice as long.

Windows took about 30 minutes to install. I tackled his applications. When I installed MS Office, I did a complete install with one exception. I drilled down into Office Tools, found Find Fast, and unchecked it. Find Fast is a resource hog and doesn’t do anything useful.

I installed Office to drive D.

He’d bought Norton Systemworks on sale one weekend, hoping it would help his performance. It didn’t. I showed him a trick. Rather than install Systemworks directly, I explored the CD, drilled into the Norton Utilities directory, and ran Setup from there. I intentionally left out almost everything. Speed Disk and Disk Doctor are the two superstars. I also kept the Optimization Wizard. I left out most of the rest, because the other stuff doesn’t do anything useful but it sure slows down your system. When it asked about running Disk Doctor at startup, I said no. It just slows down startup and doesn’t do anything useful. I did let it replace Scandisk with Disk Doctor. That way if you get an improper shutdown, Disk Doctor can clean up the mess before Windows starts and makes a bigger mess. But Disk Doctor should run when you need it. Not all the time.

Then I drilled down into the Norton Antivirus directory and installed it. Then I did the same for Ghost. I needn’t have done that. Just copying the Ghostpe.exe file out of that directory onto a boot floppy suffices. More on Ghost later.

I installed this stuff to drive D.

Next, I installed his scanner software, Lotus SmartSuite, and his DVD decoder.

I copied the data back over from his Orb disk, noticed his modem wasn’t working, and installed the device driver I found in the Drv directory I’d copied over to the Orb as an afterthought. (I’d much rather back up too much stuff than not enough.) Then I copied his AOL directory over to drive D and installed AOL 5.0 over the top of it. It picked up all his settings.

I cleaned up c:\msdos.sys and rebooted, watching the time. It booted in about 45 seconds, including POST. I was happy. Rick was very happy.
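The MSDOS.SYS cleanup, for the curious, amounts to a few lines in the file's [Options] section. Here's a sketch of my usual starting point; pick and choose for your own machine:

[Options]
;skip the splash screen and the boot delay
Logo=0
BootDelay=0
;don't load the disk compression driver we'll never use
DrvSpace=0
DblSpace=0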

I did the other standard Windows optimizations outlined in chapter 2 of Optimizing Windows. I cleared out his root directory on C. Then I ran Norton Speed Disk. I had it do the full file reordering and directory sorting bit (also described in Optimizing Windows). Clearing out the root directory makes disk access much more efficient, but only after Speed Disk discards the now-empty directory entries. Directory sorting makes disk access more efficient by putting the important files early in the list so Windows finds them faster. The results are marvelous.

Finally, I ran Ghost. I copied the Ghost executable to a boot floppy that contained the Castlewood device driver internal.sys, then booted from it and Ghosted his drive to the Orb drive. Fifteen minutes later, he had an image of his system, so he can return to this state any time he wants.

End result: Rick’s P2-300 with an 8-gig Quantum Bigfoot drive (a notoriously slow hard drive) and 288 MB RAM received a new lease on life. Despite its slow processor and hard drive, it performs better than a lot of consumer-level PCs available today.

That was a good investment of 3 1/2 hours.

Another entry from the Clueless Dept.

Someone else who needs to buy a clue. I normally don’t have a problem with John Dvorak, and frequently I actually like his stuff. He’s not as clueless as some people make him out to be. Dvorak’s not as smart as he thinks he is, but one thing I’ve noticed about his critics is that they usually aren’t as smart as they think they are either.
Dvorak’s most recent Modest Proposal is that we fire all the technology ignorami out there and then, essentially, throw away corporate standards, let end-users run anything they bloody well want, and basically make them administrators of their own machines.

I've got a real problem with that. Case in point: One of my employer's executives recently brought in his home PC and insisted we get it running with remote access. Only one problem with that: He has Windows XP Home. XP Home's networking is deliberately crippled, so businesses don't try to save money by buying it. A sleazy move, but a reality we have to live with. We got it to work somewhat, but not to his satisfaction. He's mad, but mostly because he doesn't have any idea what changes went on under the hood in XP and doesn't know he's asking the impossible. But he's perfectly competent using Word, Excel, PowerPoint and Outlook. He's also very comfortable ripping his CDs to MP3 format–he's got one of the largest MP3 collections in the company. He's competent technologically. But he has no business with admin rights on his computer.

The same goes for a lot of our users. The record I've found for the most spyware-related files installed on a work PC is 87. These aren't the technical ignorami who are installing this garbage. It's the people who know how to use their stuff, but they love shareware and freeware. Maybe some of it helps them get their work done. But these people are the first to complain when their system crashes inexplicably. And I'm expected to keep not only the corporate standard apps like M$ Office running, but I'm also expected to support RealPlayer, Webshots, Go!Zilla, Gator, WinAmp, RealJukebox, AOL, and other programs that run roughshod all over the system and frequently break one another (or the apps I'm supposed to support).

If the users were completely responsible for keeping their systems running, that would be one thing. But install all that stuff on one computer and try to keep it running. You won’t have enough time to do your job.

Dvorak argues that people like me should solely be concerned with keeping the network working. That’s fine, but what about when some Luddite decides to ditch all modern apps and bring in an IBM PS/2 running DOS 5.0 and compatible versions of Lotus 1-2-3 and WordPerfect and dBASE? Unless there’s already an Ethernet card in that machine, I won’t be able to network it. And the person who decides a Macintosh SE/30 running System 6.0.8 is where it’s at will have a very difficult time getting on the network and won’t be able to exchange data with anyone else either.

Those scenarios are a bit ridiculous, but I’ve had users who would have done that if they could have. And someone wanting to run XP Home absolutely is not ridiculous, nor uncommon. If my job is to network every known operating system and make those users able to work together in this anarchy, my job has just become impossible.

As much as I would love for people to use Linux in my workplace and something other than Word and Outlook, the anarchy Dvorak is proposing is completely unworkable. It’s many orders of magnitude worse than the current situation.

This is just wrong too. Yes, New Englanders, I know about heartbreak. I’m from Kansas City. At least your Red Sox have posted more than one winning record in the past 10 years.

Anyway, not only are the Royals’ glory years over, they’ve forgotten where their glory years came from. They’ve once again denied Mark Gubicza entry into their Hall of Fame. Who? In the late 1980s, Mark Gubicza was the Royals’ second-best pitcher, behind Bret Saberhagen. Injuries did him in the same as Saberhagen (only a little sooner) but he’s still among their career leaders in wins and strikeouts.

After he'd spent 13 seasons in a Royals uniform, the Royals had a chance to trade Gubicza for hard-hitting DH Chili Davis. But you don't trade a guy who's poured his heart and soul into the team for 13 years and stayed completely and totally loyal to it no matter how much it hurt, right? Gubicza thought otherwise. He went to the GM and told him that if he could make the Royals a better team by trading him, to trade him.

Chili Davis hit 30 home runs for the Royals in 1997. Then he bolted for the Yankees.

Meanwhile, Gubicza pitched two games for the Angels, blew out his arm for good, and was released.

It takes a great man to tell the team he loves that the best thing he can do for them is to get traded for someone who can help the team more. That was Mark Gubicza. They don’t make ’em like him anymore.

But even more importantly, the immortal Charley Lau was once again denied entry. Who’s he? He was a journeyman catcher who spent his entire career as a backup and whose career batting average was .255, but that was because he had about zero natural ability. He was a genius with the bat, which was how he managed to hit .255. More importantly, Lau was the Royals’ hitting coach in the early 1970s. He spotted some skinny guy who was playing third base because Paul Schaal couldn’t play third base on artificial turf and their first choice to replace him, Frank White, couldn’t play third base at all. This skinny blond fielded just fine, but he was hitting terribly. Lau asked him what he was doing over the All-Star break. The kid said he was going fishing with Buck Martinez. Lau put his foot down. He told him he was going to stay in Kansas City and learn how to hit.

“He changed my stance. I had been standing up there like Carl Yastrzemski, but the next thing I knew I looked like Joe Rudi,” the kid recalled. But he started hitting. By the end of the year, he’d pulled his average up to a very respectable .282.

Soon Lau had every player on the Royals standing at the plate like Joe Rudi, and taking the top hand off the bat after contact with the ball. And the Royals created a mini-dynasty in the American League Western Division.

What was the name of that kid, anyway?

George Brett.

If it hadn’t been for Charley Lau, George Brett would have been nothing. The Royals probably would have never won anything. And they probably wouldn’t be in Kansas City anymore either. Who puts up with 30 years of losing, besides Cubs fans?

Charley Lau belongs in their Hall of Fame. Even if nobody besides George Brett and me remembers who he was.

Much ado about nothing and other stuff

Much ado about nothing. The most recent report I read indicates that AOL/Time Warner and Red Hat are talking, but not about an acquisition. Sanity has entered the building…
Good thing User Friendly got a chance to get its two cents’ worth in. I got a couple bucks’ worth of laughter from it.
Much ado about something. On Sunday, Gentoo Linux developer Daniel Robbins announced that an obscure AMD Athlon bug slipped past Linux kernel developers, resulting in serious problems with Athlon- and Duron-based systems with AGP cards. This confirms some suspicions I’ve heard–one of the Linux mailing lists I subscribe to occasionally has rumblings about obscure and difficult-to-track-down Athlon problems.

The result was that Gentoo’s site was slashdotted into oblivion for a while, but hopefully it also resulted in some extra exposure for the distribution. Gentoo is another source-based distro. Lately I’ve been resigned to just using Debian to build my Linux boxes, but I’m still awfully fond of the idea of compiling your own stuff. As CPUs get faster and faster, I expect that to become more commonplace.

But I digress. The bug involves the CPU's paging function. Older x86 CPUs used 4K pages. Starting with the Pentium, CPUs began allowing 4MB pages. But a bug in the Athlon's implementation of this extended paging causes memory corruption when it's used in conjunction with an AGP card.
Alan Cox is working on a workaround. I’m a bit surprised a patch isn’t already out there.
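In the meantime, one stopgap I've seen floated (secondhand, so verify it on the kernel list before betting your data on it) is to keep the kernel from using 4MB pages by passing it mem=nopentium at boot. In lilo.conf, that looks something like this:

image=/boot/vmlinuz
  label=linux
  read-only
  append="mem=nopentium"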

CPU bugs are discovered all the time, but it's fairly rare for them to be serious. If you ever run across a Pentium-60 or Pentium-66 system, boot up Linux on it sometime and run the command dmesg. You'll find workarounds for at least two serious bugs. A TI engineer named Robert Collins gained a fair bit of notoriety in the last decade by researching, collecting, and investigating CPU bugs. Part of it was probably due to his irreverent attitude towards Intel. (As you can see from this Wayback machine entry.) Sadly, I can't find the story on the site anymore, since he was bought out by Dr. Dobb's.
Catching up. I haven't been making my rounds lately, and the reason why is fairly obvious. I used my day off yesterday to have lunch with someone from my small group. When I got home, I read the e-mail I absolutely had to read, responded to the messages that absolutely had to get responses, answered a couple of voice messages, wrote and sent out a couple of others, looked up, and it was 5 p.m.

“Alright God,” I muttered. “I just gave the day to Your people. Time to go spend some time with You.” So I whipped out my handy-dandy Today’s Light Bible and read about Moses. Seemed appropriate. The inadequacy and jumping the gun and making excuses, that is. The Biblical “superheroes” were human just like us, and the book doesn’t gloss over that. Today’s Light is designed to divide the Bible into pieces so you can read the whole thing in two years. I can’t decide if I want to get through it in a year or in six months. A few years ago I read it in its entirety in four months, but that pace is a bit much. If you’re willing to spend as much time reading the Bible every day as the average person does watching TV, you can make it through in a few months. But it’s not exactly light reading, and I’m not sure I recommend that pace. If you’re willing to dedicate that kind of time to Bible study you’re probably better served by learning Greek so you can read the New Testament in the original. Then if you’ve still got your sanity you can think about tackling Hebrew.

I finally got around to reading Charlie Sebold’s entries for the last few days. One especially poignant observation: “I continue to be surprised at how much I remember about computers, and how much I forget about everything else (including far more important things).”

I sure can relate. I wish I could trade everything I remember about IBM PS/2s and Microchannel for something more useful. But I remember goofy baseball statistics too–I can recite the starting lineup and pitching rotation of the 1980 Kansas City Royals (I’ll spare you). But I can’t tell you the names of all seven people I met Sunday night.

What on earth is going on?

AOL-Time Warner in talks to buy Red Hat? I found this this morning. It's intriguing, but I can't decide if a buyout would be a good thing or a bad thing. After all, Netscape was in decline when AOL bought it. It nosedived afterward. Obviously, the problem was twofold. When AOL acquired Netscape, they didn't acquire all of its mindshare. Some of the most talented people got fed up and left. You can take Jim Barksdale or you can leave him. The loss of Marc Andreessen and Jamie Zawinski, though, was substantial.
The second problem was that AOL wasn’t serious about competing. They bought a browser technology and basically sat on it. Netscape 4.x was fundamentally flawed, as even Zawinski acknowledges, although I would argue it was no more fundamentally flawed than IE 4.x. The Gecko engine, on which Netscape 6.x is based, is solid technology, even though it took longer to get to market than anyone had hoped. Although Netscape 6.x won’t bowl anyone over, other browsers based on the technology, such as Galeon, are absolutely fantastic. But AOL chose to release a half-hearted browser with the Netscape name on it and continued to use the IE engine in its flagship product even after the favorable agreement with Microsoft that prompted AOL to do so in the first place expired.

That begs the question of what AOL would do with Red Hat if it owned it. Red Hat is still the big-name player in the Linux field, but Red Hat is concentrating on the server market. You can still buy Red Hat at retail, but on the desktop, Red Hat is arguably #3 in popularity now behind France’s Mandrake and Germany’s SuSE. Red Hat is the only Linux company that’s making money, but that’s largely by selling consulting. That’s not AOL’s core business. At this point, AOL is more of a media company than a technology company. Software just gives AOL more outlets to sell its media content. Consulting doesn’t do that.

The best possible scenario for a Red Hat buyout would be for AOL to, as Microsoft puts it, “eat its own dog food,” that is, rip out the infrastructure it bought from other companies and replace it with the technology it just developed or acquired. Since AOL is largely powered by Sun servers, it wouldn't be terribly difficult to migrate the infrastructure to Red Hat running on Intel. Then AOL could give a big boost to its newly-acquired services division by saying, “We did it and we can help you do it too.” They can also cite Amazon's recent successes in moving its infrastructure to Red Hat Linux. There is precedent for that; after AOL bought Time Warner, the entire company started using AOL for e-mail, a move widely questioned by anyone who's used anything other than AOL for mail.

Of course, it would be expected that AOL would port its online service to Linux, which would create the truly odd couple of the computing field. AOL, meet sed and awk. Red Hat would certainly lose its purity and much of its credibility among the Linux die-hards. AOL would bank on making up the loss by gaining users closer to the mainstream. AOL could potentially put some Linux on its corporate desktops, but being a media company, an all-out migration to Linux everywhere within is very far-fetched.

To really make this work, AOL would either have to enter the hardware business and sell PCs at retail using its newly acquired Red Hat distribution and newly ported AOL for Linux and possibly an AOL-branded office suite based on OpenOffice, or it would have to partner with a hardware company. Partnering with a big name seems unlikely–a Compaq or an HP or an IBM wouldn't do it for fear of retaliation from Microsoft. Sun has never expressed any interest in entering the retail computer business, and even though Sun loves to take opportunities to harm Microsoft, Sun probably wouldn't cooperate with AOL if AOL replaced its Sun infrastructure with Red Hat Linux. Struggling eMachines might be the best bet, since it's strictly a consumer brand, has a large presence, but hasn't consistently turned a profit. But AOL could just as easily follow eMachines' example, buying and re-branding low-end Far East clones and selling them at retail as loss-leaders, taking advantage of its lack of need for Windows (which accounts for roughly $75 of the cost of a retail PC) and making its profit off new subscribers to its dialup and broadband services. A $349 PC sold at retail with a flashy GUI, decent productivity software and AOL is all the computer many consumers need.

The advantage to this scenario for everyone else is that AOL would probably dump more development into either the KDE or GNOME projects in order to give itself more and higher-quality software to offer. The official trees can either take these changes or leave them. Undoubtedly, some of the changes would be awful, and the official trees would opt to leave them. But with its 18 years’ worth of experience developing GUIs, some of the changes would likely be a good thing as well.

The more likely scenario: AOL will buy out Red Hat, not have a clue what to do with it, and Red Hat Linux will languish just like Netscape.

The even more likely scenario: AOL will come to its senses, realize that Red Hat Linux has nothing to do with its core business, and the two companies will go their separate ways.

Building 98 boxes

I knuckled down yesterday at work and started building a new laptop image for some deployed users. What they're using now isn't stable, it isn't fast, and much of the software is dated. So rather than patch yet again, we're starting over. I built a 98 install, leaving out anything I could, such as Drive Converter (we're already using FAT32, over my protests) and Disk Compression (it isn't compatible with FAT32, and I just know it's only a matter of time before some end user decides he's too short on disk space and runs it, only to be greeted by a PC that won't boot).
Law #1: The more you install, the slower the system runs, and no amount of disk or registry optimization will completely make up for that.

After I got a decent 98 install down, I did some cleanup. All the .txt files in the Windows directory? Gone. All the BMP files? See ya. Channel Screen Saver? B'bye. I got the C:\Windows directory down under 150 entries without losing any functionality. There are probably some GIF and JPEG files in there, and possibly some WAVs, that can also go. I'll have to check. And of course I did my standard MSDOS.SYS tweaks.

Then I defragmented the drive, mostly to get the directories compressed, rebooted, and timed it. 18 seconds. Not bad for a P2-300.

Next, I installed Office 2000. Once I got all that in place, Windows’ boot time ballooned to 32 seconds, which just goes to show how Microsoft apps screw around on the OS’s turf entirely too much–Office makes more changes to the OS than Internet Explorer–but the boot time is still well below what we’ve come to expect from a P2-300.

One of my coworkers had the nerve to say, “Don’t forget to run Cacheman!” Cacheman my ass. I can put vcache entries in system.ini myself, thank you very much. And I can change the file and path cache in the Registry myself, without having to use some lame program to do it. And cleaning up the directories makes a much bigger difference than those hacks do. It just doesn’t make you feel l33t or anything. Heaven forbid we should ever do anything simple and effective to improve system performance.
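For the record, that amounts to a few lines in system.ini. The [vcache] section keys are standard; the values (in kilobytes) are my judgment call for a 128 MB machine, so treat them as a starting point:

[vcache]
MinFileCache=4096
MaxFileCache=16384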

Law #2: Most of the tweaks floating around there on the ‘Net do little more than let you feel like you’ve done something. I condensed the useful tricks into a single book chapter. And I also told you what those tricks really do, and the side effects they have, unlike a certain multi-megabyte Web site hosted on AOL… You can do the majority of the things you need to do by practicing restraint and judiciously using just a small number of software tools.

I know how to make a fast Win98 PC. It’s not like I wrote a book about that or anything…

Oh, but how am I ensuring stability? I’m forcing the issue. Yes, I see that list of 47 software packages they have to have. Here’s Windows and Office 2000 and ACT!. Now they have to test it. Does it crash? OK. Now we’ll add the remaining 44 things, one at a time and see which one is breaking stuff. If it’s unstable by the time all of that’s done, it’s because the end users who were testing were sloppy with their testing.

Craig Mundie’s infamous speech

I haven’t said anything about Microsoft Executive Craig Mundie’s speech yet. Everyone’s heard of it, of course, and the typical response has been something along the lines of “Now we know Microsoft’s stance on Open Source.”

No, we’ve always known Microsoft’s stance on that. They’re scared of it. Remember the stereotype of open-source programmers: college students and college dropouts writing software in their basements that a lot of people are using, with the goal of toppling an industry giant. Seem far-fetched? Friends, that’s the story of Microsoft itself. Microsoft became an underground sensation in the late 1970s with Microsoft Basic, a programming language for the Altair and other kit computers and later for CP/M. And while we’ll probably never know the entire story of how and why this happened, when IBM decided to outsource the operating system for the IBM PC, they went to Microsoft and got both an OS and the must-have Microsoft Basic. Ten years later, IBM was just another hardware maker–really big, but getting squeezed. Today, 20 years later, IBM’s still a huge force in the computing industry, but in the PC industry, aside from selling ThinkPads, IBM’s a nobody. There may be hardware enthusiasts out there who’d be surprised to hear IBM makes and sells more than just hard drives.

Ironically, Microsoft’s response to this new threat is to act more and more like the giant it toppled. Shared Source isn’t a new idea. IBM was doing that in the 1960s. If you were big enough, you could see the source code. DEC did it too. At work, we have the source code to most of the big VMS applications we depend on day-to-day. Most big operations insist on having that kind of access, so their programmers can add features and fix bugs quickly. If Windows 2000 is ever going to get beyond the small server space, they really have no choice. But they do it with strings attached and without going far enough. An operation the size of the one I work for can’t get the source and fix bugs or optimize the code for a particular application. You’re only permitted to use the source code to help you develop drivers or applications. Meet the new Microsoft: same as the old Microsoft.

Some people have read this speech and concluded that Microsoft believes open-source software killed the dot-com boom. That’s ludicrous, and I don’t see that in the text. OSS was very good for the dot-com boom. OSS lowered the cost of entry: Operating systems such as FreeBSD and Linux ran on cheap PCs, rather than proprietary hardware. The OSs themselves were free, and there was lots of great free software available, such as the Apache Web server, and scripting languages like Python and Perl. You could do all this cool stuff, the same cool stuff you could do with a Sun or SGI server, for the price of a PC. And not only was it cheaper than everybody else, it was also really reliable.

The way I read it, Microsoft didn’t blame OSS for the dot-com bust. Microsoft blamed the advertising model, valuing market share over revenue, and giving stuff away now and then trying to get people to pay later.

I agree. The dot-com boom died because companies couldn't find ways to make money. But I'm not convinced the dot-com boom was a big mistake. It put the Internet on the map. Before 1995, when the first banner ad ran, there wasn't much to the Internet. I remember those early days. As a college student in 1993, I found the Internet a bonanza, even though I wasn't using it to the extent a lot of my peers were. For me, the Internet was FTP and Gopher and e-mail. I mostly ignored Usenet and IRC. That was pretty much the extent of the Internet. You had to be really determined or really bored or really geeky to get much of anything out of it. The World Wide Web existed, but that was a great mystery to most of us. The SGI workstations on campus had Web browsers. We knew that Mosaic had been ported to Windows, but no one in the crowd I ran in knew how to get it working. When we finally got it running on some of our PCs in 1994, what we found was mostly personal homepages. “Hi, my name is Darren and this is my homepage. Here are some pictures of my cat. Here's a listing of all the CDs I own. Here are links to all my friends who have homepages.” The running joke then was that there were only 12 pages on the Web, and the main attraction of the 12 was links to the other 11.

By 1995, we had the first signs of business. Banner ads appeared, and graduating students (or dropouts) started trying to build companies around their ideas. The big attraction of the Web was that there was all this information out there, and it was mostly free. Online newspapers and magazines sprung up. Then vendors sprung up, offering huge selections and low prices. You could go to Amazon.com and find any book in print, and you’d pay less for it than you would at Barnes & Noble. CDNow.com did the same thing for music. And their ads supported places that were giving information away. So people started buying computers so they could be part of the show. People flocked from closed services like CompuServe and Prodigy to plain-old Internet, which offered so much more and was cheaper.

Now the party’s ending as dot-coms close up shop, often with their content gone forever. To me, that’s a loss only slightly greater than the loss of the Great Library. There’s some comfort for me: Five years from now, most of that information would be obsolete anyway. But its historical value would remain. But setting sentiment aside, that bonanza of freebies was absolutely necessary. When I was selling computers in 1994, people frequently asked me what a computer was good for. In 1995, it was an easier sell. Some still asked that question, but a lot of people came in wanting “whatever I need to get to be able to get on the Internet.” Our best-selling software package, besides Myst, was Internet In A Box, which bundled dialup software, a Web browser, and access to some nationwide provider. I imagine sales were easier still in 1996 and beyond, but I was out of retail by then. Suddenly, you could buy this $2,000 computer and get all this stuff for free. A lot of companies made a lot of money off that business model. Microsoft made a killing. Dell and Gateway became behemoths. Compaq made enough to buy DEC. AOL made enough to buy Time Warner. Companies like Oracle and Cisco, who sold infrastructure, had licenses to print money. Now the party’s mostly over and these companies have massive hangovers, but what’s the answer to the Ronald Reagan question? Hangover or no hangover, yes, they’re a whole heck of a lot better off than they were four years ago.

I’m shocked that Microsoft thinks the dot-com phenomenon was a bad thing.

If, in 1995, the Web came into its own but every site had been subscription-based, this stuff wouldn’t have happened. It was hard enough to swallow $2,000 for a new PC, plus 20 bucks a month for Internet. Now I have to pay $9.95 a month to read a magazine? I could just subscribe to the paper edition and save $2,500!

The new Internet would have been the same as the old Internet, only you’d have to be more than just bored, determined, and geeky to make it happen. You’d also have to have a pretty big pile of cash.

The dot-com boom put the Internet on the map, made it the hot ticket. The dot-com bust hurt. Now that sites are dropping out of the sky or at least scaling operations way back, more than half of the Web sites I read regularly are Weblogs–today’s new and improved personal home page. People just like me. The biggest difference between 1994 and 2001? The personal home pages are better. Yeah, the pictures of the cat are still there sometimes, but at least there’s wit and wisdom and insight added. When I click on those links to the left, I usually learn something.

But there is another difference. Now we know why it would make sense to pay for a magazine on the Internet instead of paper. Information that takes a month to make it into print goes online in minutes. It’s much easier and faster to type a word into a search engine than to leaf through a magazine. We can hear any baseball game we want, whether a local radio station carries our favorite team or not. The world’s a lot smaller and faster now, and we’ve found we like it.

The pump is primed. Now we have to figure out how to make this profitable. The free ride is pretty much over. But now that we’ve seen what’s possible, we’re willing to start thinking about whipping out the credit cards again and signing up, provided the cost isn’t outrageous.

The only thing in Mundie’s speech that I can see that Linus Torvalds and Alan Cox and Dan Gillmor should take offense to is Microsoft’s suspicion of anyone giving something away for free. Sure, Microsoft gives lots of stuff away, but always with ulterior motives. Internet Explorer is free because Microsoft was afraid of Netscape. Outlook 98 was free for a while to hurt Lotus Notes. Microsoft Money was free for a while so Microsoft could get some share from Quicken. It stopped being free when Microsoft signed a deal with Intuit to bundle Internet Explorer with Quicken instead of Netscape. And there are other examples.

Microsoft knows that you can give stuff away with strings attached and make money off the residuals. What Microsoft hasn’t learned is that you can give stuff away without the strings attached and still make money off the residuals. The dot-com bust only proves that you can’t necessarily make as much as you may have thought, and that you’d better spend what you do make very wisely.

The Internet needs to be remade, yes, and it needs to find some sustainable business models (one size doesn’t fit all). But if Mundie thinks the world is chomping at the bit to have Microsoft remake the Internet their way, he’s in for a rude awakening.


01/19/2001

Software of the day: SecurePC, from www.citadel.com. I spent most of yesterday evaluating it. The biggest thing it does that system policies won't do is prevent the installation of software–in other words, it makes NT live up to the hype it's had forever. I tried installing about 20 or so programs, using different methods to try to get around it, and I couldn't. The setup programs either gave bogus error messages, told me installing software had been disabled, died outright, or crashed. In one instance, the setup program started, asked some questions, then told me installing software had been disabled. Nice.

The only things it won’t block are standalone programs, such as Steve Gibson’s self-contained gems, that don’t require any installation. But I’m not so concerned about those. For one, they’re rare. For two, they usually don’t conflict with anything because they don’t venture outside themselves. Their only danger is that they might be virus-infected, but that’s why we install always-on virus protection and push virus definitions.

The goal is to be able to set up PCs for use in the field, get them working right, then lock them down to keep people from breaking them by installing AOL and Webshots and every piece of beta software under the sun.

SecurePC will do a few things system policies will as well, and its user interface is much nicer than Microsoft’s Poledit. Poledit will allow finer control of the control panels, so SecurePC doesn’t totally replace it, but the combination of the two will let you really lock a machine down. And frankly, even Windows 95 is pretty reliable as long as it’s running on good hardware and the user doesn’t mess with it.

But SecurePC is obviously targeting companies used to paying someone $100 an hour or more to fix PCs, because it runs $99. A 10-pack of the network version is $550. That's a bargain for a company, but this software would also be incredibly useful in public computer labs in schools, libraries and churches, which frequently can't afford that. It's a shame. Hey, if it were priced lower, I'll bet some people would even buy it for home use. I have one friend who could really use it–it'd keep his 20-year-old brother from messing up his PC.

Tyrannical Security. This kind of software is a draconian measure, but what people all too often forget is that when a PC is sitting on a desk at work, it ceases to be a PC. It’s a CC–corporate computer, not personal computer. It’s a corporate asset, set up the way the corporation dictates. If the corporation says no screen savers, no Webshots, no stupid Yahoo news ticker, no RealAudio, then that’s law. Problem is, that’s impossible to enforce with the tools that come with Windows. But a third-party product to enforce them is a Godsend. Computer toys eat memory and CPU cycles, slowing it down and thus hurting productivity, and many of these toys are so poorly written as to make Microsoft look like a model of stability. Personally, I can’t wait for the day when Real Networks goes out of business. So these programs go in, break stuff, and then there’s lost productivity while waiting for the tech to arrive, then still more while an overworked tech tries to fix it. If we were to buy 1,000 copies of some security program that works and roll it out to everyone on our network, I’d be willing to bet it would pay for itself in three months.

The number of the day: 146. I use the Al Gore method of taking IQ tests. I keep taking them over and over again until I like the results. They say the 135-145 range looks like a genius to most people; the 145-165 range is a true genius. I’m accused of being a genius frequently enough that I’m probably at least a 135.

So since I climbed 22 points in a day, I can assume I’ll climb another 22 points today if I take another one, which will put me at 168–high genius level. Then I can take another one tomorrow, gain another 22 points, and apply for Mensa membership.

Or I can forget about it and get on with life. I think I like that idea better.

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like Mac OS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax's track record is pretty darn good. I've downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner; the machine was a B&W G3 with a wide SCSI controller (why, I don't know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line.
I can't think of a problem I've ever had with a Umax that they didn't resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here's something I just found interesting. Maybe I'm the only one. But in reading the mail on Jerry Pournelle's site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can't be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though it's the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or was willing to pay for the hardware necessary.

But I see Jerry Pournelle’s point as well.

It's probably not the soundest business practice to advertise that you're running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft's done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows it. Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay blame at the CPU's feet if it's anything but a recent Pentium III. And, well, if you're running off a shiny new IBM Netfinity, it's very easy to get it fixed or, if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PCs, but most people would be surprised to hear you can still get parts at all.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos' situation is the business model's dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there's such a thing as too much importance. I don't let Apple-certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I'll be doing the same thing next year and the year after that. It's good to be able to say, “Call somebody else.” But that's his problem, not his company's or his customers'.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren't my specialty and I don't think I've ever seen a Canon BJC-210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge, while others make it a separate component), try replacing them. That was the problem with the only Canon printer I've ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can sometimes have precisely the effect you describe. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend those programs for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
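If you want to check whether MTU really is the culprit before changing anything, you can probe it from a DOS prompt with ping’s don’t-fragment switch. The address here is just an example; substitute your gateway or any server that reliably answers pings:

ping -f -l 1472 192.168.0.1 (1472 bytes of data plus 28 bytes of headers makes a full 1500-byte packet)
ping -f -l 1400 192.168.0.1 (if the first one complains, step the size down until it goes through)

If the big ping comes back with “Packet needs to be fragmented but DF set” while smaller sizes work, something along the path wants a smaller MTU. And if one of those optimizer programs did change the setting, it should show up, if memory serves, as a MaxMTU string value under HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\000n in the registry (the n varies from machine to machine); deleting the value puts Windows 98 back at its default of 1500.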
 
Short of that, I’d suggest ripping all of your networking protocols and adapters out of the Network control panel, then adding back TCP/IP and only the other things you absolutely need (a rough sketch of the procedure follows). This’ll keep Windows from getting confused and trying to use the wrong transport, and it rules out a corrupted TCP/IP stack. Both causes are remote, but possible. Though your reinstall should have eliminated the corruption possibility…
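Here’s roughly how I’d do the teardown on a Windows 98 box. The exact component names vary with what’s installed, so treat this as a sketch rather than gospel:

Control Panel, Network: write down your current TCP/IP settings (IP address, gateway, DNS) before touching anything
Remove every protocol and client you don’t need (NetBEUI and IPX/SPX in particular, if nothing uses them)
Remove TCP/IP, then the adapter itself, and reboot when prompted
Let Windows redetect the card, add TCP/IP back if it doesn’t reappear on its own, re-enter your settings, and reboot again
Run winipcfg afterward to confirm the adapter picked up the right address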
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple of weeks ago, so I don’t rule them out). Also try swapping a different cable or NIC into your wife’s machine. Cables go bad more frequently than NICs, of course, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any Ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a long shot, but… I’m wondering if maybe your DNS settings are off, or if your browser is set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is the Web browser itself. Whichever browser you’re using, have you by any chance tried installing the other one to see if it has the same problem?
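Two quick ways to test those theories from a DOS prompt (the site and address below are just examples; use any site you like and an IP address you know is good):

ping www.yahoo.com (watch how long the name takes to resolve before the replies start; a long pause points at DNS)
ping 204.71.200.68 (pinging a raw IP address skips DNS entirely; if this is fast while the name is slow, DNS is your problem)

As for the proxy theory, in IE look under Tools, Internet Options, Connections, LAN Settings and make sure nothing is checked that shouldn’t be; in Netscape it’s Edit, Preferences, Advanced, Proxies.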
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, and when it does perform well, it seems to do so by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

Abandoned intellectual property

Abandoned Intellectual Property. I read a piece on this subject at OSOpinion over the weekend, and I’ve been thinking about it ever since. There are, of course, a lot of people calling for the abolition of copyright, or for radical changes to it. This piece was, believe it or not, one of the tamer proposals I’ve read.

I’m definitely of two minds on this one. Take my first-ever paid publication, in 1991. Compute Magazine, before Bob Guccione managed to totally ram it into the ground, opted to buy a spring break project I collaborated on with a friend. We were writing a video game for the Commodore 64 and 128, and we were getting tired of trying to draw the title screen manually with graphics commands (bad enough on the 128, which had Basic commands for such things, but on the 64 you were talking peeks and pokes all over the place–someone really should have written this thing back in 1982!), so we wrote a program to do the work for us. You loaded the sprites, moved ’em around, hit a key, and it gave you the Basic code to re-create the screen, suitable for inclusion in your program. We never finished the game, but we got a cool $350 and international recognition (OK, so it was a dwindling audience, but how many high school kids can say they’re published authors at age 16?).

Now, the problem. General Media whittled Compute down until it was basically just another PC mag, abandoning the multiplatform support that made it so great (I read about my beloved Commie 8-bits but still got the opportunity to learn about Macs, Amigas and PCs–what could be better?). Market share continued to dwindle, and eventually Guccione and GM sold out to Ziff-Davis, who fulfilled the remaining subscriptions with a choice of other magazines (I remember opting for PC/Computing). So the copyright went to Ziff-Davis, who never did anything with the old Compute material. A few years later, Ziff-Davis fell on hard times and eventually hacked itself up into multiple pieces. Who owns the old Compute content now? I have no idea. The copyrights are still valid and enforceable. I seriously doubt anyone cares anymore whether you own the Nov. 1991 issue of Compute when you run MOB Mover on your 64/128 or an emulator, but where do you go for permission?

The same goes for a lot of old software. Sure, it’s obsolete, but it’s useful to someone. A 68020-based Mac would be useful to somebody if they could get software for it. But unless the original owner still has his or her copies of WriteNow, Aldus SuperPaint and Aldus Persuasion (to name a few desirable but no-longer-marketable abandoned titles) to give you, you’re out of luck. Maybe you can get lucky and find some 1995-era software to run on it, but it’ll still be a dog of a computer.

But do we have an inalienable right to abandoned intellectual property, free of charge? Sure, I want the recordings Ric Ocasek made with his bands before The Cars. A lot of people want to get their hands on that stuff, but Ocasek’s not comfortable with that work. Having published some things that I regret, I can sympathize with the guy. I like how copyright law condemns that stuff to obscurity for a time. (Hopefully it’d remain obscure in the public domain too, because it’s not very good, but limiting the number of copies that can exist clinches it.)

Obscurity doesn’t mean no one is exploited when the work is copied anyway. I can’t put it any better than Jerry Pournelle did.

I don’t like my inability to walk into record stores and buy Seven Red Seven’s Shelter or Pale Divine’s Straight to Goodbye or The Caulfields’ Whirligig, but I couldn’t easily buy them in 1991 when they were still in print either. But things like that aren’t impossible to obtain: That’s what eBay and Half.com are for.

For the majority of the United States’ existence, copyright ran 28 years, renewable for another 28. That seems to me a reasonable compromise. Those who produce content can still make a living, and if a work is no longer commercially viable 28 years later, it becomes freely available. If it’s still viable, the author gets another 28-year ride. And Congress could sweeten the deal by offering tax write-offs for the premature release of copyrighted material into the public domain, which would offer a neat solution to the “But by 2019, nobody would want WriteNow anymore!” problem. Reverting to this older, simpler law would also blunt the “work for hire” arrangements that exploit musicians and some authors, since under it the renewal term reverted to the author rather than the publisher.

All around, this scenario is certainly more desirable for a greater number of people than the present one.
