
I feel this sudden urge to prove I really exist…

Do one thing every day that scares you.
Sing.
Don’t be reckless with other people’s hearts.
Don’t put up with people who are reckless with yours.
–Mary Schmich, “Everybody’s Free to Wear Sunscreen”

I want to prove I really exist, and I’m trying to figure out how I can do it. What are the tell-tale signs of a hoax? Lack of pictures and a claim of hating to have your picture taken. Well, I hate having my picture taken. Gatermann’s got an album full of pictures of me holding my hands in front of my face. He collects ’em or something. I know of four pictures of me floating around on the Web, total, and two of them were scans off newsprint.

Another sign: Lots of people claiming to have talked to me via e-mail or even over the phone, but not in person. Dan Bowman and I have talked a lot, and I consider him a close friend. Other Daynoters or Webloggers? Tom Syroid and I used to talk on the phone. But that’s it. I’ve had conversations over e-mail with Doc Jim, and with JHR, and with Matt Beland, and with Brian Bilbrey. But who’s seen me in person? Well, Steve DeLassus and Tom Gatermann, both of whom I claim to have known for more than 10 years, but I could have fabricated them too.

Debilitating problem? Well, carpal tunnel syndrome is very small potatoes compared to leukemia, but it is a death sentence for a writer. I disappeared for about six months over it.

Really, it’s pretty hard to prove I’m not a hoax. I can link to my old writings from college that are online, circa 1996 (I published under “Dave Farquhar” in those days), and of course there’s that O’Reilly book and those Computer Shopper UK articles. Those will establish a consistency of writing style. The relatives I mention don’t Weblog, and their writing styles are pretty distinct from mine; both my mom and sister are pretty good writers, but I’ve got a lot of quirks they don’t. And neither has made many appearances on these pages.

I’m going to hold back a lot of personal details, because someone I hadn’t spoken to in about 10 months freaked me out back in January and, after reading my weblogs in their entirety, recited to me virtually every detail of my life based on what I’d written and a few educated guesses. Some of the details were wrong, but not enough of them were.

But if anyone really wants to check, I was born in Kansas City, Mo. I lived a lot of places, but most notably in Farmington, Mo., from 1983 to 1988, and in Fenton, Mo., from 1988 to 1993 (and I continued to call Fenton my home through 1996 when I was in college). I graduated from Lutheran High School South, St. Louis, in 1993. I graduated from the University of Missouri-Columbia, with a degree in journalism (no minor) in 1997. I was employed by the University of Missouri in 1997 and 1998, so I’m even listed in the 1998 issue of the Official Manual of the State of Missouri. All of this should be pretty easily verifiable.

Or you can just take me at my word. It comes down to honesty, and futility. Why would anyone hoax a 20-something systems administrator? And why would they publish a book and a bunch of magazine articles under my name? It would be pointless. A pile of computer tips isn’t a compelling enough story to fake.

So what is compelling? A struggle. This past weekend’s struggle with a system upgrade showed I was human and don’t really care if people think I’m a computer genius or not. I guess that’s kind of compelling, because most of us can’t get our computers working quite right. Netscape cofounder Marc Andreessen endeared himself to thousands when he admitted in a magazine interview that his home PC crashes a lot and he never did get his printer working right. But an underdog is better. Noah Grey is a whole lot more compelling than me, because we’ve all felt a little shy sometimes, so his agoraphobia is something we can somewhat relate to. He can reach out to the world and we can share a little in his struggle and root for him. And Kaycee Nicole Swenson, well, she was just too good to be true–a 19-year-old who was wise and mature well beyond her years, a great writer, insightful, broken-hearted, sincere… Every male over 35 wanted her to be his daughter. As for the males under 35, she’d have made a great kid sister. But I suspect a good percentage of them would have wanted to date her, or someone just like her.

I don’t remember if this was exactly how she put it, but an old classmate once observed that the Internet allows us to safely pick our friends from a pool of millions, and usually we can find people who at least seem to be a whole lot more interesting (or better matches for us) than the people we can meet face-to-face, and we can quickly and painlessly get new ones and dispose of them on a whim. She wrote those words in 1997, but aren’t they a perfect description of Kaycee and the rest of the Weblogging phenomenon?

Steve DeLassus raised an interesting point this afternoon. He asked why a 19-year-old dying of leukemia or complications from leukemia would weblog at all. Wouldn’t she have better things to do? That’s an honest question, but I know if something like that were happening to me, I’d weblog. It’s cathartic, for one thing. When I was struggling with depression, I wrote about it in my newspaper column. I found it a whole lot easier to just pour my heart and soul into my word processor than to talk to someone about what I was feeling. I needed to get it out of my system, but you never know how people are going to react. When you can detach yourself from the words, it doesn’t matter. Some will scoff, but you won’t know. Some will totally understand, and you won’t know. Others will totally get it, and they’ll reach out to you, and then it’s all totally worth it. You know there’s something to them, because they had to make an effort to find your words, probably, and then they had to make an effort to communicate with you. You find special people that way.

Yeah, it’s kinda selfish. But it’s safe, and when you’re vulnerable, you need safe.

I’ve given zero enlightenment into the whole Kaycee Nicole hoax. I know a lot of people are hurting. I never got attached to her, because I only read her a couple of times a month. Over the weekend, I went back to Week 1 and started reading from there, to see what I missed. I guess I figured catching the reruns was better than missing it entirely. And I started to understand her appeal a bit more. And now I understand the hurt. It’s not nice to play with people’s hearts.

And some people will probably put up their walls and vow never to be hurt that way again. It’d be hard not to blame them.

But I hope they don’t. Because the only thing worse than the feeling after someone played with your heart is the feeling of being alone.


Craig Mundie’s infamous speech

I haven’t said anything about Microsoft executive Craig Mundie’s speech yet. Everyone’s heard of it, of course, and the typical response has been something along the lines of “Now we know Microsoft’s stance on Open Source.”

No, we’ve always known Microsoft’s stance on that. They’re scared of it. Remember the stereotype of open-source programmers: college students and college dropouts writing software in their basements, software a lot of people are using, with the goal of toppling an industry giant. Seem far-fetched? Friends, that’s the story of Microsoft itself. Microsoft became an underground sensation in the late 1970s with Microsoft BASIC, a programming language for the Altair and other kit computers and later for CP/M. And while we’ll probably never know the entire story of how and why this happened, when IBM decided to outsource the operating system for the IBM PC, they went to Microsoft and got both an OS and the must-have Microsoft BASIC. Ten years later, IBM was just another hardware maker–really big, but getting squeezed. Today, 20 years later, IBM’s still a huge force in the computing industry, but in the PC industry, aside from selling ThinkPads, IBM’s a nobody. There may be hardware enthusiasts out there who’d be surprised to hear IBM makes and sells more than just hard drives.

Ironically, Microsoft’s response to this new threat is to act more and more like the giant it toppled. Shared Source isn’t a new idea. IBM was doing that in the 1960s. If you were big enough, you could see the source code. DEC did it too. At work, we have the source code to most of the big VMS applications we depend on day-to-day. Most big operations insist on having that kind of access, so their programmers can add features and fix bugs quickly. If Windows 2000 is ever going to get beyond the small server space, they really have no choice. But they do it with strings attached and without going far enough. An operation the size of the one I work for can’t get the source and fix bugs or optimize the code for a particular application. You’re only permitted to use the source code to help you develop drivers or applications. Meet the new Microsoft: same as the old Microsoft.

Some people have read this speech and concluded that Microsoft believes open-source software killed the dot-com boom. That’s ludicrous, and I don’t see that in the text. OSS was very good for the dot-com boom. OSS lowered the cost of entry: Operating systems such as FreeBSD and Linux ran on cheap PCs, rather than proprietary hardware. The OSs themselves were free, and there was lots of great free software available, such as the Apache Web server, and scripting languages like Python and Perl. You could do all this cool stuff, the same cool stuff you could do with a Sun or SGI server, for the price of a PC. And not only was it cheaper than everybody else, it was also really reliable.

The way I read it, Microsoft didn’t blame OSS for the dot-com bust. Microsoft blamed the advertising model, valuing market share over revenue, and giving stuff away now and then trying to get people to pay later.

I agree. The dot-com boom died because companies couldn’t find ways to make money. But I’m not convinced the dot-com boom was a big mistake. It put the Internet on the map. Before 1995, when the first banner ad ran, there wasn’t much to the Internet. I remember those early days. When I was a college student in 1993, the Internet was a bonanza to me, even though I wasn’t using it to the extent a lot of my peers were. For me, the Internet was FTP and Gopher and e-mail. I mostly ignored Usenet and IRC. That was pretty much the extent of the Internet. You had to be really determined or really bored or really geeky to get much of anything out of it. The World Wide Web existed, but it was a great mystery to most of us. The SGI workstations on campus had Web browsers. We knew that Mosaic had been ported to Windows, but no one in the crowd I ran in knew how to get it working. When we finally got it running on some of our PCs in 1994, what we found was mostly personal homepages. “Hi, my name is Darren and this is my homepage. Here are some pictures of my cat. Here’s a listing of all the CDs I own. Here are links to all my friends who have homepages.” The running joke then was that there were only 12 pages on the Web, and the main attraction of those 12 was links to the other 11.

By 1995, we had the first signs of business. Banner ads appeared, and graduating students (or dropouts) started trying to build companies around their ideas. The big attraction of the Web was that there was all this information out there, and it was mostly free. Online newspapers and magazines sprung up. Then vendors sprung up, offering huge selections and low prices. You could go to Amazon.com and find any book in print, and you’d pay less for it than you would at Barnes & Noble. CDNow.com did the same thing for music. And their ads supported places that were giving information away. So people started buying computers so they could be part of the show. People flocked from closed services like CompuServe and Prodigy to plain-old Internet, which offered so much more and was cheaper.

Now the party’s ending as dot-coms close up shop, often with their content gone forever. To me, that’s a loss only slightly greater than the loss of the Great Library. There’s some comfort for me: Five years from now, most of that information would be obsolete anyway. But its historical value would remain. But setting sentiment aside, that bonanza of freebies was absolutely necessary. When I was selling computers in 1994, people frequently asked me what a computer was good for. In 1995, it was an easier sell. Some still asked that question, but a lot of people came in wanting “whatever I need to get to be able to get on the Internet.” Our best-selling software package, besides Myst, was Internet In A Box, which bundled dialup software, a Web browser, and access to some nationwide provider. I imagine sales were easier still in 1996 and beyond, but I was out of retail by then. Suddenly, you could buy this $2,000 computer and get all this stuff for free. A lot of companies made a lot of money off that business model. Microsoft made a killing. Dell and Gateway became behemoths. Compaq made enough to buy DEC. AOL made enough to buy Time Warner. Companies like Oracle and Cisco, who sold infrastructure, had licenses to print money. Now the party’s mostly over and these companies have massive hangovers, but what’s the answer to the Ronald Reagan question? Hangover or no hangover, yes, they’re a whole heck of a lot better off than they were four years ago.

I’m shocked that Microsoft thinks the dot-com phenomenon was a bad thing.

If the Web had come into its own in 1995 but every site had been subscription-based, this stuff wouldn’t have happened. It was hard enough to swallow $2,000 for a new PC, plus 20 bucks a month for Internet access. Now I have to pay $9.95 a month to read a magazine? I could just subscribe to the paper edition and save $2,500!

The new Internet would have been the same as the old Internet, only you’d have to be more than just bored, determined, and geeky to make it happen. You’d also have to have a pretty big pile of cash.

The dot-com boom put the Internet on the map, made it the hot ticket. The dot-com bust hurt. Now that sites are dropping out of the sky or at least scaling operations way back, more than half of the Web sites I read regularly are Weblogs–today’s new and improved personal home page. People just like me. The biggest difference between 1994 and 2001? The personal home pages are better. Yeah, the pictures of the cat are still there sometimes, but at least there’s wit and wisdom and insight added. When I click on those links to the left, I usually learn something.

But there is another difference. Now we know why it would make sense to pay for a magazine on the Internet instead of paper. Information that takes a month to make it into print goes online in minutes. It’s much easier and faster to type a word into a search engine than to leaf through a magazine. We can hear any baseball game we want, whether a local radio station carries our favorite team or not. The world’s a lot smaller and faster now, and we’ve found we like it.

The pump is primed. Now we have to figure out how to make this profitable. The free ride is pretty much over. But now that we’ve seen what’s possible, we’re willing to start thinking about whipping out the credit cards again and signing up, provided the cost isn’t outrageous.

The only thing in Mundie’s speech that I can see that Linus Torvalds and Alan Cox and Dan Gillmor should take offense to is Microsoft’s suspicion of anyone giving something away for free. Sure, Microsoft gives lots of stuff away, but always with ulterior motives. Internet Explorer is free because Microsoft was afraid of Netscape. Outlook 98 was free for a while to hurt Lotus Notes. Microsoft Money was free for a while so Microsoft could get some share from Quicken. It stopped being free when Microsoft signed a deal with Intuit to bundle Internet Explorer with Quicken instead of Netscape. And there are other examples.

Microsoft knows that you can give stuff away with strings attached and make money off the residuals. What Microsoft hasn’t learned is that you can give stuff away without the strings attached and still make money off the residuals. The dot-com bust only proves that you can’t necessarily make as much as you may have thought, and that you’d better spend what you do make very wisely.

The Internet needs to be remade, yes, and it needs to find some sustainable business models (one size doesn’t fit all). But if Mundie thinks the world is chomping at the bit to have Microsoft remake the Internet their way, he’s in for a rude awakening.


Optimizing BIOSes and optimizing DOS

Optimizing the BIOS. Dustin Cook sent in a link to Adrian’s Rojak Pot, at www.adriansrojakpot.com, which includes a BIOS tweaking guide. It’s an absolute must-read. I have a few minor quibbles with a couple of the things he says, particularly about shadowing and caching your ROMs with Windows 9x. He says you shouldn’t do it. He’s right. But he says you shouldn’t do it because Microsoft says not to do it with Windows NT, and Windows 9x “shares the same Win32 architecture.” It does and it doesn’t, and that’s flawed logic anyway. Shadowing ROMs isn’t always a bad thing; on some systems it eats up usable memory and on others it doesn’t, depending on the chipset and BIOS. But it’s pointless, because Windows doesn’t use the BIOS for anything unless you’re in safe mode. Caching ROMs makes even less sense: there’s only so much caching bandwidth to go around, so you should spend it on memory that’s actually being used for something productive. Architecture has nothing to do with it. Don’t cache or shadow your ROMs, because Windows will ignore them either way, and those facilities are better spent elsewhere. The same is true of Linux.

Still, in spite of this minor flaw I found in a couple of different spots, this is an invaluable guide. Perfect BIOS settings won’t make a Pentium-90 run like a Pentium III, but poor BIOS settings certainly can make a Pentium III run more like a 386DX-40. Chances are your BIOS settings aren’t that bad, but they can probably use some improvement. So if you want the best possible performance from your modern PC, visit Adrian’s. If you want to optimize your 386 or 486 or low-end Pentium, visit the site I mentioned yesterday.

Actually, it wouldn’t be a half-bad idea to take the downloadable versions of both guides, print them, and stick them in a binder for future reference. You never know when you might want to take them with you.

Optimizing DOS again. An awful lot of system speed is psychological. I’d say maybe 75% of it is pure psychology. It doesn’t matter so much whether the system really is fast, just as long as it feels fast. Yesterday I mentioned keyboard and screen accelerators. Keyboard accelerators are great for people like me who spend a lot of time in long text files, because you can scroll through them so much faster. A keyboard accelerator makes a big difference in how an old DOS system feels, and it can improve the responsiveness of some DOS games. (Now I’ve got your attention, I’m sure.)

Screen accelerators are a bit more of a stretch. Screen accelerators intercept the BIOS calls that write to the screen and replace them with faster, more efficient code. I’d estimate the speedup is anywhere from 10 to 50 percent, depending on how inefficient the PC’s BIOS is and whether it’s shadowing the BIOS into RAM. They don’t speed up graphics at all, just text mode, and then, only those programs that are using the BIOS–some programs already have their own high-speed text routines they use instead. Software compatibility is potentially an issue, but PC power users have been using these things since at least 1985, if not longer, so most of the compatibility issues have long since been fixed.

They only take a couple of kilobytes of memory, and they provide enough of a boost for programs that use the BIOS that they’re more than worth it. With keyboard and screen accelerators loaded in autoexec.bat, that old DEC 386SX/20 feels an awful lot faster. If I had a copy of a DOS version of Microsoft Word, I could use it for writing and it wouldn’t cramp my style much.

Troubleshooting Mac extensions

Troubleshooting Macintosh extensions. An extensions conflict is where you lose your innocence with fixing a Mac. Not all extensions and control panels get along, and certain combinations can have disastrous results.

Here’s my method. Create a folder on the desktop. Drag exactly half the extensions out of System Folder:Extensions and drop them in the folder. Select all the extensions in that new folder and give them a label, so they stand out (it makes them a different color). Now reboot and see if the problem goes away. If it doesn’t, create another folder, move the remaining extensions into it and give them a label. Move the first batch back into the extensions folder and reboot.

Now, add half your extensions back from the folder on the desktop to the extensions folder. If the problem comes back, move that half back into the second folder on the desktop and move the now-known good half into the extensions folder. After each test, remove the labels from the extensions in the extensions folder. Just keep swapping halves until you narrow it down to one bad extension, using labels to keep yourself from getting lost.
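The procedure above is just a binary search done with reboots. A quick simulation makes it obvious why it converges so fast (the extension names are made up, and the “reboot” is simulated by checking whether a hypothetical bad extension is in the currently loaded set):

```shell
#!/bin/sh
# Simulate the halving method: keep splitting the suspect set in two
# and "rebooting" with only one half loaded until one extension remains.
extensions="ATIDriver QTPowerPlug OpenTpt AppleShare BadActor Sherlock SpeechMgr ColorSync"
bad="BadActor"      # pretend this is the one causing the conflict

active="$extensions"
reboots=0
while [ "$(echo $active | wc -w)" -gt 1 ]; do
    count=$(echo $active | wc -w)
    half=$((count / 2))
    first=$(echo $active | tr ' ' '\n' | head -n "$half" | tr '\n' ' ')
    rest=$(echo $active | tr ' ' '\n' | tail -n +$((half + 1)) | tr '\n' ' ')
    reboots=$((reboots + 1))
    # "Reboot" with only the first half loaded: if the problem appears,
    # the culprit is in that half; otherwise it's in the other half.
    case " $first " in
        *" $bad "*) active="$first" ;;
        *)          active="$rest" ;;
    esac
done
echo "culprit: $(echo $active), found in $reboots reboots"
```

Eight extensions take three reboots; even 128 extensions would take only seven, which is why halving beats pulling extensions out one at a time.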

I don’t recommend Conflict Catcher because all it does is move the extensions around for you–it’s no easier than this method, and this method doesn’t cost $50.

This is how we build ’em in St. Louis. Neither Gatermann nor I are really in the habit of naming our PCs unless a name is just painfully obvious. In the case of his Linux gateway, the name was painfully obvious. One name and one name only fits: Mir.

This is how we build computers in St. Louis. This is Tom Gatermann’s Linux gateway: a Micronics P75 board with a Cirrus Logic PCI SVGA card, a Kingston PCI NE2000 clone connecting to the Internet, and a Bay Netgear 310TX PCI 10/100 (DEC Tulip chipset) connecting to the local LAN. Yes, that AT case was as cheap as it looks. Maybe cheaper.

Inside the case, there’s an IMES 8X IDE CD-ROM, an ancient 1.44 MB floppy drive of unknown origin, and a 1.2 GB Quantum Bigfoot HD, of which about 1.5 MB is used (booting’s much faster off the HD than off the floppy).


Mir is made from, well, a pile of junk. A Micronics P75 board. A Cirrus Logic PCI SVGA card. Whatever 72-pin SIMMs we had lying around. A Quantum Bigfoot 1.2-gig HD. A really trashed 3.5″ floppy drive. The cheapest-looking AT case ever. But we did skip the Linksys NICs. The NICs are a Kingston PCI NE2000 clone and a Bay Netgear 10/100 based on the DEC Tulip chipset.

We assembled it outside the case because we had so much trouble getting it going correctly–it’s much easier to swap components when they’re accessible. Once we got it going, we never bothered to put everything back inside the case. Maybe we’re trend-setters and this is the next fad in computing. After all, what’s the logical next step after translucency?

Early experiments in building gateways

Gateways. I worked with Gatermann last night after I got back from church (three Macs and an NT server died yesterday–I needed it last night) on trying to get his Linux gateway running under FloppyFW. We were finally able to get it working with dual NICs, able to ping both inside and outside his LAN (I finally found an old Pentium-75 board that didn’t have compatibility issues). But we weren’t able to actually get his Web browsers working.

I suspect something about the IP masquerading configuration just isn’t right, but it’s been so long since I wrote one of those by hand (and even then I was really just copying an existing configuration) that, since I have working Linux boxes at home, I finally just gave up, downloaded the shell script version of Coyote Linux, and ran it. It’s not foolproof, because you have to know what kernel module your Ethernet cards use (make it easy on yourself: get a pair of Netgear 10/100 cards, which use the Tulip module). That’s a two-edged sword. It makes Coyote a little harder to configure, but it means it’ll work with a much wider variety of cards. If Linux supports a card, so does Coyote, whereas a lot of the other single-floppy distributions just support the three most common types (NE2000, 3Com 3c509, and DEC Tulip). So an old DEC Etherworks3 card will work just fine with Coyote, while getting it to work with some of the others can be a challenge.

I’m disappointed that Coyote doesn’t include the option to act as a caching DNS, because you can fit caching DNS on the disk, and it’s based on the Linux Router Project, for which a BIND tarball is certainly available. I’ll have to figure out how to add BIND in and document that, because there’s nothing cooler than a caching nameserver.
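For what it’s worth, a caching-only BIND configuration is tiny. A generic named.conf sketch looks something like this (the directory and file names are assumptions; whatever BIND tarball the Linux Router Project uses may lay things out differently):

```
options {
        directory "/var/named";   // assumed working directory
};

// Root hints: tells named where the root nameservers live.
zone "." {
        type hint;
        file "root.hints";
};

// Reverse lookups for localhost, standard for a caching-only setup.
zone "0.0.127.in-addr.arpa" {
        type master;
        file "127.0.0.zone";
};
```

Point /etc/resolv.conf at the box itself and repeat lookups start coming out of the cache instead of going over the modem.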

I was messing around briefly with PicoBSD, a microdistribution of FreeBSD, but the configuration is just different enough that I wasn’t comfortable with it. FreeBSD would be ideal for applications like this, though, because its networking is slightly faster than Linux’s. But either Linux or FreeBSD will outperform Windows ICS by a wide margin, and the system requirements are far lower: a 386, 8 megs of RAM, a floppy drive, and two NICs. Can’t beat that.

Rarely used trivia department: Using Linux to create disk images. To create an image of a floppy under Unix, use this command: dd if=/dev/fd0 of=filename.img bs=10k. There’s no reason why this command couldn’t also be used to clone other disks, making a single-floppy Linux or FreeBSD distribution an alternative to DriveImage or Ghost, so long as the disks you’re cloning have the same geometry.

Test this before you rely on it, but the command to clone disk-to-disk should be dd if=/dev/hda of=/dev/hdb while the command to clone disk-to-image should be dd if=/dev/hda of=filename.img and image-to-disk should be dd if=filename.img of=/dev/hda .
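Here’s that round trip sketched against an ordinary file rather than a real drive, so it’s safe to test anywhere (the filenames are made up):

```shell
# Simulate a 100 KB "disk" with an ordinary file, then clone it
# disk -> image -> disk and verify the copies are identical.
dd if=/dev/urandom of=disk.bin bs=1k count=100 2>/dev/null   # stand-in disk
dd if=disk.bin of=disk.img bs=10k 2>/dev/null                # disk to image
dd if=disk.img of=restore.bin bs=10k 2>/dev/null             # image to disk
cmp -s disk.bin restore.bin && echo "round trip OK"
```

Swap the filenames for /dev/fd0 or /dev/hda and the commands are exactly the ones above. Just be sure if= and of= point the direction you intend, because dd will cheerfully overwrite a disk.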

And yesterday. While the computers (and I’ll use that phrase loosely when referring to those Macs) were going down all around me at work, the mail was pouring in. Needless to say, some people agree and others don’t. We’ll revisit it tomorrow. I’ve gotta go to work.

01/26/2001

Hey hey! It works! The server was down all day yesterday, which was a shame. I wanted to try a new experiment. So I’ll try it today.

On Wednesday morning I saw Storage Review taking criticism for critiquing other hardware sites’ reviews. I disagree with that criticism; many of the reviews out there are atrocities, with poor methodology, hearsay, reviewer ignorance, and other shortcomings. Sometimes these reviews are more misleading than the information in the products’ advertising or packaging! I believe Storage Review is well within professional bounds to point out these shortcomings when they find them.

The mainstream media does this all the time. Columnists and editors will criticize the reporting done in other publications. Most newspapers also employ one person, known as the ombudsman, whose job it is to criticize and/or defend, as appropriate, the publication’s own work.

Seeing as the hardware sites out there often do very sloppy work, even compared to the mainstream media, some policing of it is a very good thing.

Then, over lunch, the idea hit me. Why not do some critiquing myself? I’m trained in editorial writing and editing. I have some experience as a reviewer. And I’ve published a fair bit of my own work in the arena of technology journalism–newspaper columns, a book, individual magazine articles, a series… So I’m qualified to do it, even though I’m not the biggest name out there. And that kind of content is certainly more useful than the “this is how my day went” stuff I’ve been posting way too often.

I’m not so arrogant as to assume that the webmasters of these large sites are in my readership and would take my advice. I don’t expect to change them directly. What I do expect to do is to raise people’s expectations a little. By pointing out what’s good and what’s not so good, hopefully I can raise the public consciousness a little, and indirectly influence some of these sites. If not, then at least my readers are better informed than they otherwise would be, and that’s definitely a good thing.

KT-133A roundup (Tom’s Hardware Guide)

This is a roundup of six VIA KT133A boards. Good review overall. It doesn’t get bogged down in three pages of history that tend to look like a cut-and-paste job from the last similar review, unlike some sites, but it gives just enough history for proper perspective. It would have been nice to mention that it took EDO and SDRAM some time to show their advantages as well; DDR is no more a failure than the technologies that came before it. Unusual for Tom’s, this review isn’t obsessed with overclocking either. Lots of useful information, such as the memory modules tested successfully with each board. Inclusion of the DFI AK74-AC, which will never be released, is questionable. I can see including a reference design, but a cancelled commercial board doesn’t seem to make much sense. You can get an idea from its scores why it got the axe; it was consistently one of the bottom two boards in the roundup.

Emphasis was on performance, not stability, but Pabst and Schmid noted they had no compatibility or stability problems with these boards. Stability in benchmarks doesn’t guarantee stability in the real world, but it’s usually a good indication. As tight as the race is between these boards, stability is more important than speed anyway, and since the majority of people don’t overclock, the attempt to at least mention compatibility and stability is refreshing.

Socket 7 Upgrade Advice (AnandTech)

This is a collection of upgrade advice for Socket 7 owners. This review, too, doesn’t get too bogged down in history, but the mention of fake cache is noteworthy. This was a PC Chips dirty trick, dating back to 1995 or so, before the K6 series. It wasn’t a very common practice and didn’t last very long–certainly not as long as the article suggests.

Lots of good upgrade advice, including a short compatibility list and pitfalls you can expect. Also included are some benchmarks, but it would have been nice if they’d included more vintage chips. The oldest chip included was the K6-2/450, and AMD sold plenty of slower chips. You can’t extrapolate the performance of a K6-2/300 under the same conditions based on the 450’s score.

Also, the rest of the hardware used is hardly vintage–you’re not likely to find an IBM 75GXP drive and a GeForce 2 video card in an old Socket 7 system. Using vintage hardware would have given more useful results, plus it would have given the opportunity to show what difference upgrading the video card and/or CPU makes, which no doubt some Socket 7 owners are wondering about. Testing these chips with a GeForce does demonstrate that a more modern architecture will give better performance–it exposes the weaknesses of the CPU–but an indication of how much a new CPU would improve a three-year-old PC would be more useful to most people. Few people have the delusion that a K6-3+ is going to challenge an Athlon or P3. They just want to know the best way to spend their money.

No deceptive graphics or lack of knowledge here; what’s in this article is good stuff, and well written. It’s just too bad the testing didn’t more closely resemble the real world, which would have made it infinitely more useful.

Memory Tweaking Guide (Sharky Extreme)

This is a nice introduction to the art of memory tweaking, and it explains all those weird acronyms we hear about all the time but rarely see explained. Good advice on how to tweak, and good advice on how to spend your memory money wisely. They disclosed their testbed and included the disclaimer that your results will vary from theirs–their benchmarks are for examples only. The only real gripe I have is that the benchmark graphs, like all too many on the Web, don’t start at zero. From looking at the graph, it would seem that Quake 3 runs six times as fast at 640x480x16 as at 1600x1200x16, when in reality it runs about twice as fast. Graphing this way, as any statistics professor will tell you, is a no-no because it exaggerates the differences way too much.
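The distortion is easy to quantify. Take two hypothetical frame rates, 90 fps and 45 fps (a real 2x difference), and a y-axis that starts at 40 instead of 0:

```shell
# Bars on a truncated axis are drawn proportional to (value - axis_start),
# so the visual ratio becomes (90-40):(45-40) = 50:5 instead of 90:45.
# All numbers here are hypothetical.
real_ratio=$((90 / 45))
apparent_ratio=$(( (90 - 40) / (45 - 40) ))
echo "real difference: ${real_ratio}x, apparent difference: ${apparent_ratio}x"
```

A truncated axis turns a genuine 2x gap into an apparent 10x gap, which is exactly the exaggeration the statistics professors warn about.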

Asus CUSL2C Review (Trainwrecker)

This is a review of the Asus CUSL2C, an i815-based board intended for the average user. This review has lots of good sources for further information, but unfortunately it also has a little too much hearsay and speculation. Some examples:

“Of course, Asus won’t support this [cable] mod and we’re pretty sure that doing it will void your warranty.” Of course modifying the cable on an Asus product, or any other manufacturer’s product, will void your warranty. So will overclocking, which they didn’t mention. Overclockers are either unaware of this or apathetic about it. In matters like this, assertiveness is your friend–it gives a review credibility. One who is assertive and wrong is more believable than one who is wishy-washy and right.

“Arguably, Asus provides the best BIOS support in the business. We believe Asus develops their BIOS’s at their facility in Germany.” Indeed, Asus claims to have re-written over half the code in their BIOSes, which is one reason why Asus boards perform well historically. Most motherboard manufacturers make at least minor modifications to the Award, AMI, or Phoenix BIOS when they license it, but Asus generally makes more changes than most. This claim is fairly well known.

I was also disappointed to see a section heading labeled “Windows 2000,” which simply consisted of a statement that they didn’t have time to test under Windows 2000, followed by lots of hearsay, but at least they included workarounds for the alleged problems. Including hearsay is fine, and some would say even beneficial, as long as you test the claims yourself. This review would have been much more useful if they had delayed it a day and tested some of the claims they’d heard.

There’s some good information here, particularly the links to additional resources for this board, but this review is definitely not up to par with the typical reviews on the better-known sites.

DDR Analysis (RealWorldTech)

Good perspective here, in that DDR is an incremental upgrade, just like PC133, PC100, PC66 SDRAM, and EDO DRAM were before it. But I don’t like the assertion that faster clock speeds would make DDR stand out. Why not actually test it with higher-speed processors to show how each of the technologies scale? Testing each chipset at least at 1 GHz in addition to 800 MHz would have been nice; you can’t get a P3 faster than 1 GHz but testing the Athlon chipsets at 1.2 would add to the enlightenment. Why settle for assertions alone when you can have hard numbers?

Also, the assertion “And don’t forget, even though things like DDR, AGP, ATA/100 and other advancements don’t amount to a significant gain all on their own, using all of latest technology may add up to a significant gain,” is interesting, but it’s better if backed up with an example. It’s possible to build two otherwise similar systems, one utilizing AGP, ATA-100 and DDR and another utilizing a PCI version of the same video card, a UDMA-33 controller, and PC133 SDRAM, and see the difference. Unfortunately you can’t totally isolate the chipsets, so minor differences in the two motherboards will keep this from being totally scientific, but they’ll suffice for demonstrating the trend. Ideally, you’d use two boards from the same manufacturer, using chipsets of like vintage from the same manufacturer. That pretty much limits us to the VIA Apollo Pro series and a Pentium III CPU.

And if you’re ambitious, you can test each possible combination of parts. It’s a nice theory that the whole may be greater than the sum of the parts, and chances are a lot of people will buy it at face value. Why not test it?
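The test matrix for that experiment is smaller than it sounds. A sketch of the combinations involved (the component names are just the ones mentioned above, chosen for illustration):

```python
from itertools import product

# One axis per upgrade the article mentions; each benchmark system
# picks one option from each axis.
video = ["AGP GeForce", "PCI GeForce"]
disk = ["ATA-100", "UDMA-33"]
ram = ["DDR", "PC133 SDRAM"]

systems = list(product(video, disk, ram))
print(len(systems))  # 8 configurations to benchmark
for v, d, r in systems:
    print(v, d, r)
```

Eight systems is a long weekend of benchmarking, not an impossible study, and it would settle whether the whole really is greater than the sum of the parts.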

This reminds me of a quote from Don Tapscott, in a Communication World interview from Dec. 1999, where he spelled out a sort of communication pecking order. He said, “If you provide structure to data, you get information. And if you provide context to information, you get knowledge. And if you provide human judgment and trans-historical insights, perhaps we can get wisdom.”

This analysis has good human judgment and trans-historical insights. It has context. It has structure. The problem is it doesn’t have enough data, and that’s what keeps this from being a landmark piece. Built on a stronger foundation, this had the potential to be quoted for years to come.

01/13/2001

Have I been brainwashed by Redmond? In the wake of MacWorld, Al Hawkins wrote a piece that suggested maybe so. My post from Thursday doesn’t suggest otherwise.

So let’s talk about what’s wrong with the PC industry. There are problems there as well–problems across the entire computer industry, really. The biggest difference, I think, is that the big guns in the PC industry are better prepared to weather the storm.

IBM’s PC business has been so bad for so long, they’ve considered pulling out of the very market they created. They seem to be turning it around, but it may only be temporary, and their profits are coming at the expense of market share. They retreated out of retail and eliminated product lines. Sound familiar? Temporary turnarounds aren’t unheard of in this industry. IBM as a whole is healthy now, but the day when they were known as Big Black & Blue isn’t so distant as to be forgotten. But IBM’s making their money these days by selling big Unix servers, disk drives, PowerPC CPUs and other semiconductors, software, and most of all, second-to-none service. The PC line can be a loss leader, if need be, to introduce companies to the other things IBM has to offer.

Compaq is a mess. That’s why they got a new CEO last year. But Compaq is a pretty diverse company. They have DEC’s old mini/mainframe biz, they have DEC’s OpenVMS and Digital Unix (now Tru64 Unix) OSs, they have DEC’s Alpha CPU architecture, and DEC’s widely acclaimed service division, which was the main thing that kept DEC afloat and independent in its day. Compaq also has its thriving server business, a successful line of consumer PCs and a couple of lines of business PCs. The combined Compaq/DEC was supposed to challenge IBM as the 800-pound gorilla of the industry, and that hasn’t happened. Compaq’s a big disappointment and they’re having growing pains. They should survive.

HP’s not exactly in the best of shape either. They’ve made a lot of lunkhead decisions that have cost them a lot of customers, most notably by not releasing drivers for their widely popular printers and scanners for newer Microsoft operating systems. While developing these drivers costs money, this will cost them customers in the long run so it was probably a very short-sighted decision. But HP’s inkjet printers are a license to print money, with the cartridges being almost pure profit, and HP and Compaq are the two remaining big dogs in retail. Plus they have profitable mainframe, Unix, and software divisions as well. They’ve got a number of ways to return to profitability.

The holidays weren’t kind to Gateway. They actually had to resort to selling some of their surplus inventory in retail stores, rather than using the stores as a front for their build-to-order business as intended.

Dell’s not happy with last year’s results either, so they’re looking to diversify and give themselves less dependence on desktop PCs. They’re growing up, in other words. They’re killing IBM and Compaq in PCs, and those companies are still surviving. Dell wants a piece of that action.

Intel botched a number of launches this year. They had to do everything wrong and AMD had to do everything right in order for AMD to continue to exist. That happened. AMD’s past problems may have been growing pains, and maybe they’re beyond it now. We shall see. Intel can afford to have a few bad quarters.

As for their chips, we pay a certain price for backward compatibility. But, despite the arguments of the Apple crowd, x86 chips as a rule don’t melt routinely or require refrigerants unless you overclock. All of my x86 chips have simple fans on them, along with smaller heatsinks than a G4 uses. I’ve seen many a Pentium III run on just a heatsink. The necessity of a CPU fan depends mostly on case design. Put a G4 in a cheap case with poor airflow and it’ll cook itself too.

Yes, you could fry an egg on the original Pentium-60 and -66. Later revisions fixed this. Yet I still saw these original Pentiums run on heat sinks smaller than the sinks used on a G4. The Athlon is a real cooker, so that argument holds, but as AMD migrates to ever-smaller trace widths, that should improve. Plus AMD CPUs are cheap as dirt and perform well. The Athlon gives G4-like performance and high clock speeds at a G3 price, so its customers are willing to live with some heat.

And Microsoft… There are few Microsoft zealots left today, and they’re getting rarer. Microsoft hasn’t given us anything, yet we continue to buy MS Office, just like Mac users. We curse Microsoft and yet send millions and billions their way, just like Mac users. We just happen to buy the OS from them too. And while we curse Microsoft bugs and many of us make a living deploying Windows-based PCs (but the dozen or so Macs I’m responsible for keep me busier than the couple of hundred PCs I’m responsible for), for the most part Windows works. Mac owners talk about daily blue screens of death, but I don’t know when I last got one. I probably get one or two a year. I currently have eight applications running on my Windows 98 box. OS/2 was a far better system than Windows, but alas, it lost the war.

I can’t stand Microsoft’s imperialism and I don’t like them fighting their wars on my hardware. They can pay for their own battlefield. So I run Linux on some of my boxes. But sometimes I appreciate Windows’ backward compatibility.

I always look for the best combination of price, performance, and reliability. That means I change platforms a lot. I flirted with the Mac in 1991, but it was a loveless relationship. The PCs of that era were wannabes. I chose Amiga without having used one, because I knew it couldn’t possibly be as bad as Windows 3.0 or System 7.0. I was right. By 1994, Commodore had self-destructed and the Amiga was perpetually on the auction block, so I jumped ship and bought a Compaq. Windows 3.1 was the sorriest excuse I’d seen for a multitasking environment since System 7.0 and Windows 3.0. I could crash it routinely. So I switched to OS/2 and was happy again. I reluctantly switched to Windows 95 in 1996. I took a job that involved a lot of Macs in 1998, but Mac OS 8.5 failed to impress me. It was prettier than System 7 and if you were lucky you could use it all day without a horrible crash, but with poor memory management and multitasking, switching to it on an everyday basis would have been like setting myself back 12 years, so the second date wasn’t any better than the first.

Linux is very interesting, and I’ve got some full-time Linux PCs. If I weren’t committed to writing so much about Windows 9x (that’s where the money is), Linux would probably be my everyday OS. Microsoft is right to consider Linux a threat, because it’s cheaper and more reliable. Kind of like Windows is cheaper and more reliable than Mac OS. Might history repeat itself? I think it could.

The computer industry as a whole isn’t as healthy this year as it was last year. The companies with the most resources will survive, and some of the companies with fewer will fold or be acquired. The reason the industry press is harder on Apple than on the others is that Apple is less diversified than the others, and thus far more vulnerable.

01/02/2001

Mailbag:

IE shortcut; Optimizing WinME; Partition; 10/100 NIC; Mobos

Trimming down Windows 2000. Someone else observed last week that, among other things, Windows’ included games are now critical system components. That’s messed up. Fortunately, it’s fixable.

Open the file C:\Winnt\Inf\sysoc.inf in your favorite text editor, after making a backup copy of course. Search for the string “HIDE,” (without quotes, but including the comma). Delete all references to this string. Save the file. Reboot. Now open Control Panel, Add/Remove Programs, and go down to Windows System Components. You can now cleanly uninstall the Windows components that may not be useful to you, such as the Space Cadet Pinball game, or the Accessibility Options. I’m in the habit of just banging on the shift key several times to turn off my screen blanker. Why shift? Because it won’t send weird keystrokes to whatever application I left running in the foreground. Unfortunately, hitting shift five times usually pops up the Accessibility options, much to my annoyance. So I was very glad to finally be able to uninstall that feature.
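If you’d rather script the edit than do it by hand, here’s a rough sketch in Python. The path assumes a default Windows 2000 install; blanking the HIDE field rather than deleting it outright keeps each line’s comma-separated field count intact, which is the safer interpretation of the tip above:

```python
# Strip the HIDE flag from sysoc.inf entries so hidden components
# show up under Add/Remove Programs. Both cases appear in practice.
def unhide(inf_text):
    return inf_text.replace(",HIDE,", ",,").replace(",hide,", ",,")

# An example line of the sort found in a typical sysoc.inf:
line = "Pinball=ocgen.dll,OcEntry,pinball.inf,HIDE,7"
print(unhide(line))  # Pinball=ocgen.dll,OcEntry,pinball.inf,,7

# To apply it for real (make the backup first, as above):
# path = r"C:\Winnt\Inf\sysoc.inf"
# text = open(path).read()
# open(path + ".bak", "w").write(text)   # keep a backup copy
# open(path, "w").write(unhide(text))
```

Same edit, same reboot afterward; the script just saves you the search-and-delete pass in the editor.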

And a bargain NIC. This week only, Circuit City is selling the D-Link DFE-530TX+ 10/100 NIC for $14.99 with a $9.99 mail-in rebate. While I prefer the DEC Tulip chipset for inexpensive 10/100 NICs, the Realtek chipset in this D-Link works with Linux and Windows, and that’s an absolute giveaway price. I mean, come on, most of us spend that much every week on soda.

I’ve got a D-Link lying around as a spare, but I had a Circuit City gift card with about $7 left on it, so I picked one up. Besides, I needed a stereo miniplug-to-dual-RCA cable, so suddenly I had two semi-compelling reasons to go to the shark-infested cave. It’s good to have some spare parts, and the D-Links have much better compatibility than the NDC card with the obscure Macronix 98715 chipset I still have in at least one of my systems.

I’ve seen some ludicrous claims that D-Link gives you 3Com and Intel quality at a Linksys price. I don’t buy it for a minute. But for a small home-based network, why pay $40-$60 for a NIC if you don’t have to?

And somehow I managed to avoid the sharks as well. I guess I just didn’t have Pentium 4 tattooed across my forehead.

Amazon now seems to be selling Optimizing Windows at its full retail price of $24.95. Obviously sales are slower now than when it was selling at (sometimes deeply) discounted prices, but still much better than November levels. If you’ve bought it, my heartfelt thanks go out to you. If you’ve posted a review, another thank you.

If you’ve read it and like it and feel like writing a review, either at Amazon or another online bookseller such as Barnes & Noble, Borders, Bookpool or Fatbrain, please feel free to do so. I appreciate it greatly. And if you have comments or questions on the book, feel free to e-mail me.

If you’re wanting to do a price compare on Optimizing Windows, visit www2.bestbookbuys.com/cgi-bin/bbb.cgi?ISBN=1565926773.


12/23/2000

The presidency again. The story that won’t die. I thought it was over! When will it end? This is the most ridiculous recount story I’ve heard yet.

New adventures in Linux. I was trying last night to make a Linux gateway out of a single-floppy distribution for the first time. I looked at a number of distributions and finally settled on floppyfw. Why that one in particular, I never decided completely.

Gatermann and I put a minimalist system together: a vintage 1994 Socket 5 Pentium mobo, a P75, 24 MB of 72-pin SIMMs, a floppy drive, a 2 MB PCI Trident video card, and two Bay Netgear 310TX NICs in a beat-up case. Neither of us normally names our computers, but looking at it, we decided this computer’s name was most definitely going to be Mir.

It booted up and seemed to detect the two cards, most of the time. Once it told me eth0 was sitting at IRQ 149 and had a MAC address of FF FF FF FF FF FF, which disturbed me greatly for obvious reasons. Fortunately, this board’s AMI BIOS allows you to manually assign resources to the PCI slots, so I went in and did that: PCI slot #1 got IRQ 9, up through PCI slot #4, which got IRQ 12. That gave me some consistency, but I never did get it to successfully ping any address except 127.0.0.1, the loopback address.

We may be dealing with a hardware problem. We’ll tackle it again soon, possibly with a more complete distribution. I have no shortage of small hard drives. I also have no shortage of other parts.

These projects never go smoothly but I always get them running eventually.

Picking a single-floppy distribution. The big thing is finding one that supports the hardware you have. There’s not enough room on a floppy disk to support every kitchen sink and hairdryer that you might want to use in a Linux box, so any old distribution might not work with your hardware. When Steve DeLassus and I were making a gateway out of his 486SX, we couldn’t find any distribution that didn’t require a math coprocessor, for instance. (There are some now.) If you’re using NICs based on the DEC Tulip chipset or NE2000 clones, you shouldn’t have any trouble, but if you’ve got exotic NICs, not every distribution will support them.

Plus, some of these projects have to be built under Linux. Gatermann doesn’t have a working Linux box at the moment. Others build on any old PC running DOS or Windows. Each distro has its own specialty, so you just have to find one that matches your hardware.

This search over at Freshmeat can give you a headstart if you’re interested in this kind of thing.

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like MacOS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner on a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line. That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though it’s the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or were willing to pay for it.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft’s done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows this. And Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay the blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PC parts, but most people would be surprised to hear you can still get parts at all.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple-certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge while others make them a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can have precisely the effect you describe at times. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend them for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
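A bit of arithmetic shows why a dial-up-tuned MTU has nothing to offer on a LAN: a smaller MTU just means more packets, each paying the same fixed header tax, for the same data. A rough sketch (the 40-byte figure is the usual IP-plus-TCP header overhead; the transfer size is arbitrary):

```python
# Packets needed to move a file at a given MTU, ignoring retransmits.
def packets_needed(nbytes, mtu, headers=40):   # 20 IP + 20 TCP bytes
    payload = mtu - headers                    # data carried per packet
    return -(-nbytes // payload)               # ceiling division

print(packets_needed(1_000_000, 1500))  # 685 packets at Ethernet's MTU
print(packets_needed(1_000_000, 576))   # 1866 at a modem-era MTU
```

Nearly triple the packets for the same megabyte. The worse failure mode, and the likelier cause of the crawl described here, is fragmentation or black-holed oversized packets when the tweaked value doesn't match the rest of the LAN, which is why 1500 is the safe setting.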
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel and add back in TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and eliminate the corrupted TCP/IP possibility. These are remote, but possible. Though your reinstall should have eliminated that possibility…
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple weeks ago so I don’t rule them out). Also try swapping in a different cable or NIC in your wife’s machine. Cables of course go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a longshot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is your Web browser itself. Whichever browser you’re using, have you by any chance tried installing and testing the other one to see if it has the same problems?
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, or when it does perform well, it seems to be by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~