Disappointment… Plus Linux vs. The World

It was looking like I’d get to call a l337 h4x0r on the carpet and lay some smackdown at work, but unfortunately I had a prior commitment. Too many things to do, not enough Daves to go around. It’s the story of my life.

And I see InfoWorld’s Bob Lewis is recommending companies do more than give Linux a long, hard look–he’s saying they should consider it on the desktop.

He’s got a point. Let’s face it: none of the contenders get it right. So-called “classic” Mac OS isn’t a modern OS–it has no protected memory, no pre-emptive multitasking, and only limited threading support. It’s got all the disadvantages of Windows 3.1 save being built atop the crumbling foundation of MS-DOS. I could run Windows 3.1 for an afternoon without a crash. I can run Windows 95 for a week or two. I can usually coax about 3-4 days out of Mac OS. Mac users sometimes seem to define “crash” differently, so I’ll define what I mean here. By a crash, I mean an application dying with a Type 1, Type 2, or Type 10 error. Or the system freezing and not letting you do anything. Or a program quitting unexpectedly.

But I digress. Mac OS X has usability problems, it’s slow, and it has compatibility problems. It has promise, but it’s been thrust into duty it’s not necessarily ready for. Like System 7 of the early ’90s, it’s a radical change from the past, and it’s going to take time to get it ready for general use. Since compilers and debuggers are much faster now, I don’t think it’ll necessarily take as long, but I don’t expect Mac OS X’s day to arrive this year. Developers also have to jump on the bandwagon, which hasn’t happened yet.

Windows XP… It’s slow, it’s way too cutesy, and only time will tell if it will actually succeed at displacing both 9x and NT/2000. With Product Activation being an upgrader’s nightmare, Microsoft may shoot themselves in the foot with it. Even if XP is twice as good as people say it’s going to be, a lot of people are going to stay away from it. Users don’t like Microsoft policing what they do with their computers, and that’s the perception that Product Activation gives. So what if it’s quick and easy? We don’t like picking up the phone and explaining ourselves.

Linux… It hasn’t lived up to its hype. But when I’ve got business users who insist on using Microsoft Works because they find Office too complicated, I have a hard time buying the argument that Linux can’t make it in the business environment without Office. Besides, you can run Office on Linux with Win4Lin or VMware. But alternatives exist. WordPerfect Office gets the job done on both platforms–and I know law offices are starting to consider the move. All a lawyer or a lawyer’s secretary typically needs to be happy is a familiar word processor, a Web browser, and a mail client. The accountant needs a spreadsheet, and maybe another financial package. Linux has at least as many Web browsers as Windows does, and plenty of capable mail clients; WP Office includes Quattro Pro, which is good enough that I’ve got a group of users who absolutely refuse to migrate away from it. I don’t know if I could run a business on GnuCash. But I’m not an accountant. The increased stability and decreased cost make Linux a sensible fit for a law firm, though.

And in the businesses I count as clients, anywhere from 75-90% of the users could do their jobs in Linux just as productively. Yes, the initial setup would be more work than Windows’ initial setup, but the same system cloning tricks work, which mitigates that. So even if it takes 12 hours to build a Linux image as opposed to 6 hours to build a Windows image, the decreased cost and decreased maintenance will pay for it.

I think Linux is going to get there. As far as Linux looking and acting like Windows, I’ve moved enough users between platforms that I don’t buy the common argument that that’s necessary. Most users save their documents wherever the program defaults to. Linux defaults to your home directory, which can be local or on a server somewhere. The user doesn’t know or care. Most users I support call someone for help when it comes time to save something on a floppy (or do anything remotely complicated, for that matter), then they write down the steps required and robotically repeat them. When they change platforms, they complain about having to learn something new. Then they open up their notebooks, rip out the old page they’d been blindly following for months or years, write down the new steps, and follow those instead.

It amuses me that most of the problems I have with Linux are with recent distributions that try to layer Microsoft-like Plug and Play onto it. Linux, unlike Windows, is pretty tolerant of major changes. I can install TurboLinux 6.0 on a 386SX, then take out the hard drive and put it in a Pentium 4 and it’ll boot. I’ll have to reconfigure XFree86 to take full advantage of the new architecture, but that’s no more difficult than changing a video driver in Windows–and that’s been true since about 1997, with the advent of Xconfigurator. Linux needs to look out for changes to sound cards, video cards and, sometimes, network cards. The Linux kernel can handle changes to just about anything else without a hiccup. Once Red Hat and Mandrake realize that, they’ll be able to develop a Plug and Play that puts Windows to shame.

The biggest thing that Linux lacks is applications, and they’re coming. I’m not worried about Linux’s future.

What can I say about Tuesday…?

Photography. Tom sent me links to the pictures he took on the roof of Gentry’s Landing a couple of weeks ago. He’s got a shot of downtown, the dome, and the warehouse district, flanked by I-70 on the west and the Mississippi River on the east.

I’m tired. I spent yesterday fighting Mac OS X for a couple of hours. It still feels like beta software. I installed it on a new dual-processor G4/533 with 384 MB RAM, and it took four installation attempts to get one that worked right. Two attempts just flat-out failed, and the installation said so. A third attempt appeared successful, but it felt like Windows 95 on a 16-MHz 386SX with 4 megs of RAM. We’re talking a boot time measured in minutes here. The final attempt was successful and it booted in a reasonable time frame–not as fast as Windows 2000 on similar hardware and nowhere near the 22 seconds I can make Win9x boot in, but faster, I think, than OS 9.1 would boot on the same hardware–and the software ran, but it was sluggish. All the eye candy certainly wasn’t helping. Scrolling around was really fast, but window-resizing was really clunky, and the zooming windows and the menus that literally did drop down from somewhere really got on my nerves.

All told, I’m pretty sure my dual Celeron-500 running Linux would feel faster. Well, I know it’d be faster because I’d put a minimalist GUI on it and I’d run a lot of text apps. But I suspect even if I used a hog of a user interface like Enlightenment, it would still fare reasonably well in comparison.

I will grant that the onscreen display is gorgeous. I’m not talking the eye candy and transparency effects, I’m talking the fonts. They’re all exceptionally crisp, like you’d expect on paper. Windows, even with font smoothing, can’t match it. I haven’t seen Linux with font smoothing. But Linux’s font handling up until recently was hideous.

It’s promising, but definitely not ready for prime time. There are few enough native apps for it that it probably doesn’t matter much anyway.

Admittedly, I had low expectations. About a year ago, someone said something to me about OS X, half in jest, and I muttered back, “If anyone can ruin Unix, it’s Apple.” Well, “ruin” is an awfully harsh word, because it does work, but I suspect a lot of people won’t have the patience to stick with it long enough to get it working, and they may not be willing to take the extreme measures I ultimately took, which was to completely reformat the drive to give it a totally clean slate to work from.

OS X may prove yet to be worth the wait, but anyone who thinks the long wait is over is smoking crack.

Frankly, I don’t know why they didn’t just compile NeXTStep on PowerPC, slap in a Mac OS classic emulation layer, leave the user interface alone (what they have now is an odd hybrid of the NeXT and Mac interfaces that just feels really weird, even to someone like me who’s spent a fair amount of time using both), and release it three years ago.

But there are a lot of things I don’t know.

I spent the rest of the day fighting Linux boot disks. I wanted the Linux equivalent of a DOS boot disk with Ghost on it. Creating one from scratch proved almost impossible for me, so I opted instead to modify an existing one. The disks provided at partimage.org were adequate except they lacked sfdisk for dumping and recreating partition tables. (See Friday if you don’t have the foggiest idea what I’m talking about right about now, funk soul brother.)

I dumped the root filesystem to the HD by booting off the two-disk set, mounting the hard drive (mount -t ext2 /dev/hda1 /mnt) and copying each directory (cp -a [directory name] [destination]). Then I made modifications. But nothing would fit, until I discovered the -a switch. The vanilla cp command had been expanding out all the symlinks, bloating the filesystem to a wretched 10 megs. It should have been closer to 4 uncompressed, 1.4 megs compressed.

Finally I got what I needed in there and copied it to a ramdisk in preparation for dumping it to a floppy. (You’ve gotta compress it first and make sure it’ll fit.) I think the command was dd if=/dev/ram0 bs=1k | gzip -v9 > [temporary file]. The size was 1.41 MB. Excellent. Dump it to floppy: dd if=[same temporary file from before] of=/dev/fd0 bs=1k.
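The -a switch matters because plain cp dereferences symbolic links, writing out a full copy of each link’s target, while -a preserves the links themselves (along with permissions and timestamps). Here’s a little demonstration you can run in a scratch directory–the paths and file sizes are arbitrary, and on a boot-floppy root filesystem, where lots of entries are symlinks, the effect multiplies:

```shell
#!/bin/sh
# Build a source tree containing a 100K file and a symlink to it.
demo=/tmp/cp-demo
rm -rf "$demo" && mkdir -p "$demo/src"
dd if=/dev/zero of="$demo/src/big.bin" bs=1024 count=100 2>/dev/null
ln -s big.bin "$demo/src/link.bin"    # the symlink itself takes almost no space

# -L forces dereferencing: link.bin becomes a second full 100K file.
cp -rL "$demo/src" "$demo/deref"

# -a preserves the symlink, so the copy stays small.
cp -a "$demo/src" "$demo/preserved"

du -sk "$demo/deref" "$demo/preserved"   # the dereferenced copy is roughly twice the size
```

Run that and the du output makes the bloat obvious–exactly the 10-megs-versus-4 problem, in miniature.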

And that’s why my mind feels fried right now. Hours of keeping weird commands like that straight will do it to you. I understand the principles, but the important thing is getting the specifics right.

03/23/2001

I’m looking for inspiration and having a terrible time finding any. That’s what happens when you only do one or two things all week. I’ve beaten Squid to death. There seems to be no incantation I can recite to make Office 4.2.1 run under Mac OS 9. And that pretty much sums up my week.

AMD released 1.3 and 1.33 GHz Athlons this week. They’re priced at around $320 and $350. For software development they’d be great. For video editing they’d be great. For emulating Amigas at wicked speed they’d be great. But what else would you do with that kind of processing power?

For me, the best thing about this chip is it means fewer people will want 800 MHz CPUs, so I’ll be able to get an 800 cheaper. That’s still insanely fast.

Reactions on the hardware sites are mostly predictable. The biggest surprise I saw was that Tom Pabst over at Tom’s Hardware, once the most outspoken critic of the P4, is now calling it “certainly no bad product whatsoever.” Last year he made it sound like the spawn of Satan. But he still likes the 1.33 GHz Athlon better.

One nice thing about the hardware sites: when they overclock, you get a nice preview of what future CPU speeds will give you. The Athlon at 1.466 GHz soundly outperforms the 1.5 GHz P4, not that most people will be able to tell a difference.

Apple releases OS X tomorrow. It will get mixed reviews. Finally there’s an Apple OS that has a prayer of being stable. Software compatibility is likely to be lousy. There are capabilities missing, such as DVD support–and wasn’t Apple the company that’s been saying all along that DVD is so important that they made it next to impossible to buy a Mac without a DVD drive?

Chances are I’ll end up running it on a machine at work, and I’m sure I’ll like it better than OS 9. Whether I’ll like it better than Windows NT or Linux, I have no idea.

Historically, it’s always been better to wait for Apple’s dot-one releases. System 7 was an atrocity, while 7.1 was actually a decent OS for its day. OS 8 was promising but buggy, while 8.1 is probably the best version of the old-style Mac OS ever released. I never found anything to like about OS 9. I don’t have a whole lot of experience with 9.1 yet–we’ve still got a lot of machines running 8.6 at work because there wasn’t ever any reason to move them, and once I managed to get 9 working decently the last thing I wanted to do was go back in and change things.

I suspect OS X won’t come into its own until the dot-one release, or possibly even dot-five. This is a much, much bigger change than System 7 or OS 8 were.

02/20/2001

Windows Me Too? I’ve read the allegations that Microsoft aped Mac OS X with the upcoming Windows XP. Maybe I’m dense, but I don’t see much resemblance beyond the resemblance between two cars made by different manufacturers. The Start menu has a new neon look, which is probably Apple-inspired to some degree. The Windows taskbar has had Dock-like functionality for several years now–it was added with IE4. The biggest change seems to be the Start menu–they’ve taken the Windows 2000 initiative, where only commonly used stuff is shown, to an extreme, and now the Start menu, at least in some screenshots, looks bigger. I don’t know if it really is or not–I saw another 1024×768 screenshot in which the Start menu actually takes up a little less real estate than the one on my current box at the same resolution. And they’ve re-drawn some icons.

As a whole there’s a more textured look now, but some of the Unixish Window managers have been doing that stuff since 1997. The login screen bears a definite resemblance to some of the Unixish login screens I’ve seen of late.

Microsoft is claiming this is the most significant user interface change since Windows 95. That’s true, but it’s not the big step that Windows 95 was from Windows 3.x. It’s an evolutionary step, and one that should have been expected, given that the Windows 9x Explorer interface is now older than the Program Manager interface was when it was replaced. Had 24-bit displays been common in 1995, Microsoft probably would have gone with a textured look then–they’ve always liked such superficialities.

Stress tests. New hardware, or suspect hardware, should always be stress-tested to make sure it’s up to snuff. Methods are difficult to find, however, especially under Windows. Running a benchmark repeatedly can be a good way to test a system–overclockers frequently complain that their newly overclocked systems can’t finish benchmark suites–but is it enough? And when the system can’t finish, the problem can be an OS or driver issue as well.

Stress testing with Linux would seem to be a good solution. Linux is pretty demanding anyway; run it hard and it’ll generally expose a system’s weaknesses. So I did some looking around. I found a stress test employed by VA Linux at http://sourceforge.net/projects/va-ctcs/ that looked OK. And I found another approach at http://www.eskimo.com/~pygmy/stress.txt that describes stress testing by repeatedly compiling the Linux kernel, which gives the entire system (except for the video card) a really good workout.
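The kernel-compile method boils down to a simple loop: run a heavy build over and over, and treat the first failure as a red flag, because on marginal hardware the compiler tends to segfault or die in a different place each pass. Here’s a minimal sketch–the function name and pass count are mine, not anything from those pages:

```shell
#!/bin/sh
# stress_loop: run a command repeatedly, stopping at the first failure.
# A healthy machine finishes every pass identically; a flaky one doesn't.
stress_loop() {
    passes=$1
    shift
    i=1
    while [ "$i" -le "$passes" ]; do
        if ! "$@" > /dev/null 2>&1; then
            echo "FAILED on pass $i"
            return 1
        fi
        i=$((i + 1))
    done
    echo "OK: $passes passes"
}

# Illustrative use -- twenty clean kernel rebuilds is a solid burn-in:
#   cd /usr/src/linux
#   stress_loop 20 make bzImage
```

Redirecting the build output to /dev/null keeps the loop quiet; in practice you’d probably send each pass to a log file instead, so you can see where it died.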

And the unbelievable… Someone at work mentioned an online Presidents’ Day poll asking who the best president was. Several obvious candidates are up on Mt. Rushmore: Washington, Lincoln, Jefferson, Teddy Roosevelt. Most people would add FDR and possibly Harry Truman and Woodrow Wilson to that list. I was talking with a good friend the other day about just this issue, and I argued in favor of Lincoln. Washington had a tough job of setting a standard, and he was great, but Lincoln had an even tougher job of holding a bitterly divided country together. So if I had to rank them, I’d probably say Lincoln, Washington, Jefferson, Teddy Roosevelt, and then we have a mess. I don’t agree with their politics, but FDR and Woodrow Wilson probably belong in there. James Madison and James Monroe belong in there; the question is where. Then it starts to get really tough. Was Harry Truman in those guys’ league? Not really, but he’s worlds better than Warren G. Harding and Bill Clinton. Fine, pencil him in at 9. Now who gets #10? Some would give it to Ronald Reagan. It seems to me that Reagan is at once overappreciated and underappreciated. A lot of people put him at the very bottom, which I think is unfair. But then there was this poll that put him at the very top, by a very wide margin. When I looked, Reagan had 44% of the vote, followed by George Washington at 29% and Abraham Lincoln a distant third at 14%.

When I speak of the hard right in the media, that’s what I’m referring to: blind allegiance to an icon, however flawed. Don’t get me wrong, Reagan was no Warren G. Harding–he did win the Cold War, after all. Conservatives say his economic policies saved the country, while liberals say they very nearly wrecked it. All I can tell you is my college economics professor taught that Reagan at the very least had the right idea–the big problem with the theory behind Reagan’s policies is the impossibility of knowing whether you’d gone too far or not far enough. Fine. FDR played a similar game. Both are revered by their parties and hated by the other party. But as president, neither Ronald Reagan nor FDR is in the Washington and Lincoln league. As a man, FDR probably was in that league, and if he was not the last, he was very close to it. But with the truly great presidents, there is very little doubt about them–and in the cases of Lincoln and Jefferson, their greatest critics were the voices inside their own heads.

Great people just don’t run for president anymore, and they rarely run for political office, period. It’s easy to see why. Anyone truly qualified to be President of the United States is also qualified to be an executive at a large multinational corporation, and that’s a far more profitable and less frustrating job. And the truly great generally aren’t willing to compromise as much as a politician must in order to get the job.

Early on, we had no shortage whatsoever of great minds in politics: Washington, Jefferson, Madison and Monroe certainly. Plus men who never were president, like Benjamin Franklin and Alexander Hamilton. We had, in effect, from Washington to Monroe, a string of men who met Socrates’ qualifications to be Philosopher-King. (Yes, John Adams was single-term, but he was a cut above most of those who were to follow.)

But as our country developed, so many better things for a great mind to do sprang up. Today you can be an executive at a large company, or you can be a researcher, or a pundit, or the president of a large and prestigious university. In 1789, there weren’t as many things to aspire to.

If we’ve got any Benjamin Franklins and Thomas Jeffersons and George Washingtons and Abraham Lincolns out there today (and I believe we do), they’ve got better things to do than waste time in Washington, D.C.

No, our greatest president wasn’t Ronald Reagan, just as it wasn’t Dwight Eisenhower or John Kennedy. That’s nostalgia talking.

01/11/2001

Mailbag:

My docs; Apple; Lost cd rom drive

It’s that time of year again. MacWorld time. I work with Macs way too much, so of course I have opinions. If you expect me to withhold them, you don’t know me very well.

Let’s face it: Apple’s in serious trouble. Serious trouble. They can’t move inventory. The Cube is a bust–unexpandable, defect-ridden, and overpriced. The low-end G4 tower costs less than the Cube but offers better expandability.  Buying a Cube is like marrying a gorgeous airhead. After the looks fade in a few years, you’re permanently attached to an airhead. So people buy a G4 tower, which has better expandability, or they get an iMac, which costs less.

Unfortunately, that gorgeous airhead metaphor goes a long way with Apple. The Mac’s current product line is more about aesthetics than anything else. So they’ve got glitzy, glamorous cases (not everyone’s cup of tea, but hey, I hear some people lust after Britney Spears too), but they’re saddled with underpowered processors dragged down by an operating system less sophisticated under the hood than the OS Commodore shipped with the first Amiga in 1985. I don’t care if your PowerPC is more efficient than an equivalently-clocked Pentium IV (so’s a VIA Cyrix III but no one’s talking about it), because if your OS can’t keep that CPU fed with a steady stream of tasks, it just lost its real-world advantage.

But let’s set technical merit aside. Let’s just look at pure practicalities. You can buy an iMac for $799. Or, if you’re content with a low-end computer, for the same amount of money you can buy a low-end eMachine and pair it up with a 19-inch NEC monitor and still have a hundred bucks left over to put towards your printer. Yeah, so the eMachine doesn’t have the iMac’s glitzy looks. I’ll trade glitz for a 19-inch monitor. Try working with a 19-inch and then switch to a 15-inch like the iMac has. You’ll notice a difference.

So the eMachine will be obsolete in a year? So will the iMac. You can spend $399 for an accelerator board for your iMac. Or you can spend $399 for a replacement eMachine (the 19-inch monitor will still be nice for several years) and get a hard drive and memory upgrade while you’re at it.

On the high end, you’ve got the PowerMac G4 tower. For $3499, you get a 733 MHz CPU, 256 MB RAM, 60 GB HD, a DVD-R/CD-R combo drive, internal 56K modem, gigabit Ethernet you won’t use, and an nVidia GeForce 2 MX card. And no monitor. Software? Just the OS and iMovie, which is a fun toy. You can order one of these glitzy new Macs today, but Apple won’t ship it for a couple of months.

Still, nice specs. For thirty-five hundred bucks they’d better be nice! Gimme thirty-five hundred smackers and I can build you something fantabulous.

But I’m not in the PC biz, so let’s see what Micron might give me for $3500. For $3514, I configured a Micron ClientPro DX5000. It has dual 800 MHz Pentium III CPUs (and an operating system that actually uses both CPUs!), 256 MB of RDRAM, a 7200 RPM 60 GB hard drive, a DVD-ROM and CD-RW (Micron doesn’t offer DVD-R, but you can get it third-party if you must have one), a fabulous Sound Blaster Live! card, a 64 MB nVidia GeForce 2 MX, and in keeping with Apple tradition, no monitor. I skipped the modem because Micron lets me do that. If you must have a modem and stay under budget, you can throttle back to dual 766 MHz CPUs and add a 56K modem for $79. The computer also includes Intel 10/100 Ethernet, Windows 2000, and Office 2000.

And you can have it next week, if not sooner.

I went back to try to configure a 1.2 GHz AMD Athlon-based system, and I couldn’t get it over $2500. So just figure you can get a machine with about the same specs, plus a 19-inch monitor and a bunch more memory.

Cut-throat competition in PC land means you get a whole lot more bang for your buck with a PC. And PC upgrades are cheap. A Mac upgrade typically costs $400. With PCs you can often just replace a CPU for one or two hundred bucks down the road. And switching out a motherboard is no ordeal–they’re pretty much standardized at this point, and PC motherboards are cheap. No matter what you want, you’re looking at $100-$150. Apple makes it really hard to get motherboard upgrades before the machines are obsolete.

It’s no surprise at all to me that the Mac OS is now the third most-common OS on the desktop (fourth if you count Windows 9x and Windows NT/2000 as separate platforms), behind Microsoft’s offerings and Linux. The hardware is more powerful (don’t talk to me about the Pentium 4–we all know it’s a dog, that’s why only one percent of us are buying it), if only by brute force, and it’s cheaper to buy and far cheaper to maintain.

Apple’s just gonna have to abandon the glitz and get their prices down. Or go back to multiple product lines–one glitzy line for people who like that kind of thing, and one back-to-basics line that uses standard ATX cases and costs $100 less off the top just because of it. Apple will never get its motherboard price down to Intel’s range, unless they can get Motorola to license the Alpha processor bus so they can use the same chipsets AMD uses. I seriously doubt they’ll do any of those things.

OS X will finally start to address the technical deficiencies, but an awful lot of Mac veterans aren’t happy with X.

Frankly, it’s going to take a lot to turn Apple around and make it the force it once was. I don’t think Steve Jobs has it in him, and I’m not sure the rest of the company does either, even if they were to get new leadership overnight. (There’s pressure to bring back the legendary Steve Wozniak, the mastermind behind the Apple II who made Apple great in the 1970s and 1980s.)

I don’t think they’ll turn around because I don’t think they care. They’ll probably always exist as a niche player, selling high-priced overdesigned machines to people who like that sort of thing, just as Jaguar exists as a niche player, selling high-priced swanky cars to people who like that sort of thing. And I think the company as a whole realizes that and is content with it. But Jaguar’s not an independent company anymore, nor is it a dominant force in the auto industry. I think the same fate is waiting for Apple.

Mailbag:

My docs; Apple; Lost cd rom drive

Plextor bargains, and Year 2000 in review

A bargain Plextor CD-RW. I just spotted this great tip in a link to a link to a link in the StorageReview forums. The Iomega ZipCD 12x10x32 appears to be a relabeled Plextor drive, and it sometimes sells for around $100. So if you’re looking for the best CD-R on the market at a great price, go get it.

Details are at www.roundsparrow.com/comp/iomega1 if you want to have a look-see.

The $99 price seems to be a CompUSA special sale. Check local availability at www.compusa.com/products/product_info.asp?product_code=280095 if you’re interested.

Incidentally, the IDE 12x10x32 drives from TDK and Creative are also reported to be re-branded Plextors. Regular retail price on these four “twin” drives is similar, around $300. The TDK and Creative drives come with Nero Burning ROM, however, making them more desirable than the Plextor model. Iomega bundles Adaptec’s CD suite.

Happy New Year. An ancient Chinese curse says, “May you live in interesting times.” Well, 2000 certainly was interesting. So, my toast to you this year is this: May 2001 be less interesting than 2000. Boring isn’t always bad. Just usually.

Linux 2.4 almost made it. Yesterday, Linus Torvalds released linux2.4-prerelease and vowed there won’t be a prerelease1, prerelease2, etc.–this is it. Bugs get fixed in this one, then the final 2.4 comes out (to be immediately followed by linux2.4ac1, no doubt–Alan Cox always releases a patched kernel swatting a couple of bugs within hours of Linus releasing the new kernel. It happened with 2.0 and with 2.2, and history repeats itself).

Anyway, the 2.2 prerelease turned into a series in spite of Linus’ vows, so Linus isn’t always right, but I expect 2.4 will be out this month, if not this week.

Linux 2.4 will increase performance, especially on high-memory and SMP machines, but I ran a 2.3 series kernel (basically the Linux equivalent of an alpha release of 2.4) on my P120 for a long time and found it to be faster than 2.2, even on a machine that humble. I also found it to be more stable than Microsoft’s final releases, but hey.

I ought to download 2.4prerelease and put it on my dual Celeron box to see how far it’s come, but I doubt I’ll get around to it today.

Other lowlights of 2000. Windows 2000 flopped. It’s not a total disaster, but sales aren’t meeting Microsoft’s expectations. PC sales flopped, and that was a disaster. The Pentium 4 was released to awful reviews. Nvidia bought the mortal remains of 3dfx for a song. Similarly, Aureal departed from this mortal coil, purchased by longtime archrival Creative Labs after bankruptcy. (In a former incarnation, before bankruptcy and being run into the ground, Aureal was known as MediaVision. PC veterans probably remember them.) A federal judge ordered the breakup of Microsoft, but the appeals process promises to at least delay it, if not prevent it. We’ll hear a lot about that in 2001, but 2001 probably won’t bring any closure.

Hmm, other highlights. Apple failed to release OS X this year, and saw its new product line flop. Dotcom after dotcom shuttered its doors, much to Wall Street’s dismay. Linux companies didn’t topple MS, much to Wall Street’s dismay. And speaking of Wall Street, Larry Ellison (Oracle) and Bill Gates (Microsoft) flip-flopped in the rankings of richest man in the world several times.

And two of my favorite pundits, Bob Metcalfe and G. Burgess Alison, called it quits last year. They are sorely missed.

And once again, 2000 wasn’t the year of the NC.

I know I missed a few. But those were the highlights, as I see them.

01/01/2001

Mailbag:

Partition; IDE/SCSI; Lost CD ROM; Optimizing ME; Win 98/ME

Mailbag:

Partition; IDE/SCSI; Lost CD ROM; Optimizing ME; Win 98/ME

Mac mice, PC data recovery

A two-button Mac mouse!? Frank McPherson asked what I would think of the multibutton/scroll wheel support in Mac OS X. Third-party multibutton mice have been supported via extensions for several years, but not officially from Ye Olde Apple. So what do I think? About stinkin’ time!

I use 3-button mice on my Windows boxes. The middle button double-clicks. Cuts down on clicks. I like it. On Unix, where the middle button brings up menus, I’d prefer a fourth button for double-clicking. Scroll wheels I don’t care about. The page up/down keys have performed that function just fine for 20 years. But some people like them; no harm done.

Data recovery. One of my users had a disk yesterday that wouldn’t read. Scandisk wouldn’t fix it. Norton Utilities 2000 wouldn’t fix it. I called in Norton Utilities 8. Its disktool.exe includes an option to revive a disk, essentially by doing a low-level format in place (presumably it reads the data, formats the cylinder, then writes the data back). That did the trick wonderfully. Run Disktool, then run NDD, then copy the contents to a fresh disk immediately.

So, if you ever run across an old DOS version of the Norton Utilities (version 7 or 8 certainly; earlier versions may be useful too), keep them! It’s something you’ll maybe need once a year. But when you need them, you need them badly. (Or someone you support does, since those in the know never rely on floppies for long-term data storage.) Recent versions of Norton Utilities for Win32 don’t include all of the old command-line utilities.

Hey, who was the genius who decided it was a good idea to cut, copy and paste files from the desktop? One of the nicest people in the world slipped up today copying a file. She hit cut instead of copy, then when she went to paste the file to the destination, she got an error message. Bye-bye file. Cut/copy-paste works fine for small files, but this was a 30-meg PowerPoint presentation. My colleague who supports her department couldn’t get the file back. I ride in on my white horse, Norton Utilities 4.0 for Windows in hand, and run Unerase off the CD. I get the file back, or so it appears. The undeleted copy won’t open. On a hunch, I hit paste. Another copy comes up. PowerPoint chokes on it too.

I tried everything. I ran PC Magazine’s Unfrag on it, which sometimes fixes problematic Office documents. No dice. I downloaded a PowerPoint recovery program. The document crashed the program. Thanks guys. Robyn never did you any harm. Now she’s out a presentation. Not that Microsoft cares, seeing as they already have the money.

I walked away wondering what would have happened if Amiga had won…

And there’s more to life than computers. There’s songwriting. After services tonight, the music director, John Scheusner, walks up and points at me. “Don’t go anywhere.” His girlfriend, Jennifer, in earshot, asks what we’re plotting. “I’m gonna play Dave the song that he wrote. You’re more than welcome to join us.”

Actually, it’s the song John and I wrote. I wrote some lyrics. John rearranged them a little (the way I wrote it, the song was too fast–imagine that, something too fast from someone used to writing punk rock) and wrote music.

I wrote the song hearing it sung like The Cars (along the lines of “Magic,” if you’re familiar with their work), but what John wrote and played sounded more like Joe Jackson. Jazzy. I thought it was great. Jennifer thought it was really great.

Then John tells me they’re playing it Sunday. They’re what!? That will be WEIRD. And after the service will be weird too, seeing as everybody knows me and nobody’s ever seen me take a lick of interest in worship music before.

I like it now, but the lyrics are nothing special, so I don’t know if I’ll like it in six months. We’ll see. Some people will think it’s the greatest thing there ever was, just because two people they know wrote it. Others will call it a crappy worship song, but hopefully they’ll give us a little credit: At least we’re producing our own crappy worship songs instead of playing someone else’s.

Then John turns to me on the way out. “Hey, you’re a writer. How do we go about copyrighting this thing?” Besides writing “Copyright 2000 by John Scheusner and Dave Farquhar” on every copy, there’s this.  That’s what the Web is for, friends.

~~~~~~~~~~

Note: I post this letter without comment, since it’s a response to a letter I wrote. My stuff is in italics. I’m not sure I totally agree with all of it, but it certainly made me think a lot and I can’t fault the logic.

From: John Klos
Subject: Re: Your letter on Jerry Pournelle’s site

Hello, Dave,

I found both your writeup and this letter interesting. Especially interesting is both your reaction and Jerry’s reaction to my initial letter, which had little to do with my server. To restate my feelings, I was disturbed about Jerry’s column because it sounded so damned unscientific, and I felt that he had a responsibility to do better.
His conclusion sounded like something a salesperson would say, and in fact did sound like things I have heard from salespeople and self-promoted, wannabe geeks. I’ve heard all sorts of tales from people like this, such as the claim that computers get slower with age because the RAM wears out…

Mentioning my Amiga was simply meant to point out that not only was I talking about something that bothered me, but I am running systems that “conventional wisdom” would say are underpowered. However, based upon what both you and Jerry have replied, I suppose I should’ve explained more about my Amiga.

I have about 50 users on erika (named after a dear friend). At any one moment, there are anywhere from half a dozen to a dozen people logged on. Now, I don’t claim to know what a Microsoft Terminal Server is, nor what it does, but it sounds something like an ’80s way of Microsoft subverting telnet.

My users actually telnet in (technically, they all use ssh; telnet is off), they actually do tons of work in a shell, and they actually use pine for email and links (a lynx successor) for browsing. I have a number of developers who do most of their development work in any of a number of languages on erika (Perl, C, C++, PHP, Python, even Fortran!).

Most of my users can be separated into two groups: geeks and novices. Novices usually want simple email or want to host their domain with a minimum of fuss; most of them actually welcome the simplicity, speed, and consistency of pine as compared to slow and buggy webmail. Who has used webmail and never typed a long letter only to have an error destroy the entire thing?

The geeks are why sixgirls.org got started. We all had a need for a place to call home, as we all have experienced the nomadic life of being a geek on the Internet with no server of our own. We drifted from ISP to ISP looking for a place where our Unix was nice, where our sysadmins listened, and where corporate interests weren’t going to yank stuff out from underneath us at any moment. Over the years, many ISPs have stopped offering shell access and generally have gotten too big for the comfort of geeks.

If Jerry were replying to this now, I could see him saying that shells are old school and that erika is perhaps not much more than a home for orphans and die-hard Unix fans. I used to think so, too, but the more novice users I add, the more convinced I am that people who have had no shell experience at all prefer the ease, speed, and consistency of the shell over a web browser type interface. They’re amazed at the speed. They’re surprised over the ability to instantly interact with others using talk and ytalk.

The point is that this is neither a stopgap nor a dead end; this IS the
future. I read your message to Jerry and it got me thinking a lot. An awful
lot. First on the wisdom of using something other than what Intel calls a server, then on the wisdom of using something other than a Wintel box as a server. I probably wouldn’t shout it from the mountaintops if I were doing it, but I’ve done it myself. As an Amiga veteran (I once published an article in Amazing Computing), I smiled when I saw what you were doing with your A4000. And some people no doubt are very interested in that. I wrote some about that on my Weblogs site (address below if you’re interested).

I am a Unix Systems Administrator, and I’ve set up lots of servers. I made
my decision to run everything on my Amiga based upon several
criteria:
One, x86 hardware is low quality. I stress test all of the servers I
build, and most x86 hardware is flawed in one way or another. Even if
those flaws are so insignificant that they never affect the running of a
server, I cannot help but wonder why my stress testing code will run just
fine on one computer for months and will run fine on another computer for
a week, but then dump a core or stop with an error. But this is quite
commonplace with x86 hardware.

For example, my girlfriend’s IBM brand FreeBSD computer can run the stress testing software indefinitely while she is running the GIMP, Netscape, and all sorts of other things. This is one of the few PCs that never has any problems with this stress testing software. But most of the other servers I set up, from PIIIs, dual processor PIIIs and dual Celerons, to Cyrix 6×86 and MII, end up having a problem with my software after anywhere from a few days to a few weeks. But they all have remarkable uptimes, and none crash for any reason other than human error (like kicking the cord).

However, my Amigas and my PowerMacs can run this software indefinitely.

So although I work with x86 extensively, it’s not my ideal choice. So what
else is there? There’s SPARC, MIPS, m68k, PowerPC, Alpha, StrongARM… plenty of choices.

I have a few PowerMacs and a dual processor Amiga (68060 and 200 mhz PPC 604e); however, NetBSD for PowerMacs is not yet as mature as I need it to be. For one, there is no port of MIT pthreads, which is required for MySQL. Several of my users depend on MySQL, so until that is fixed, I can’t consider using my PowerMac. Also, because of the need to boot using Open Firmware, I cannot set up my PowerMac to boot unattended. Since my machine is colocated, I would have to be able to run down to the colocation facility if anything ever happened to it. That’s
fine if I’m in the city, but what happens when I’m travelling in Europe?

SPARC is nice, but expensive. If I could afford a nice UltraSPARC, I
would. However, this project started as a way to have a home for geeks; coming up with a minimum of $3000 for something I didn’t even plan to charge for wasn’t an option.

Alpha seems too much like PC hardware, but I’d certainly be willing to give it a try should someone send me an old Alpha box.

With MIPS, again, the issue is price. I’ve always respected the quality of
SGI hardware, so I’d definitely set one up if one were donated.

StrongARM is decent. I even researched this a bit; I can get an ATX
motherboard from the UK with a 233 mhz StrongARM for about 310 quid. Not too bad.

But short of all of that, I had a nice Amiga 4000 with a 66 mhz 68060, 64
bit ram, and wide ultra SCSI on board. Now what impresses me about this
hardware is that I’ve run it constantly. When I went to New Orleans last
year during the summer, I left it in the apartment, running, while the
temperatures were up around 100 degrees. When I came back, it was
fine. Not a complaint.

That’s the way it’s always been with all of my Amigas. I plug them in,
they run; when I’m done, I turn off the monitor. So when I was considering
what computer to use as a server when I’d be paying for a burstable 10
Mbps colocation, I wanted something that would be stable and consistent.

Hence Amiga.

One of my users, after reading your letter (and, I guess, Jerry’s),
thought that I should mention the load average of the server; I assume
this is because of the indirectly stated assumption that a 66 mhz 68060 is
just squeaking by. To clarify that, a 66 mhz 68060 is faster per mhz than
any Pentium by a measurable margin when using either optimised code (such as a distributed.net client) or straight compiled code (such as LAME). We get about 25,000 hits a day, for a total of about 200 megs a day, which accounts for one eighth of one percent of the CPU time. We run as a Stratum 2 time server for several hundred computers, we run POP and IMAP services, sendmail, and we’re the primary nameserver for perhaps a hundred machines. With a distributed.net client running, our load average hovers around 1.18, which means that without the dnet client, we’d be idle most of the time.
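John’s numbers are easy to sanity-check. A distributed.net client runs at idle priority and soaks up all spare CPU, so by itself it contributes roughly 1.0 to the load average. Here’s a back-of-the-envelope sketch; the figures are John’s, but the arithmetic and the 1.0-for-dnet assumption are mine:

```python
# Rough sanity check of erika's stated workload.
# Assumption: the distributed.net client, at idle priority,
# accounts for roughly 1.0 of the load average on its own.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

hits_per_day = 25_000
megs_per_day = 200
load_with_dnet = 1.18
dnet_contribution = 1.0  # assumed: one always-busy idle-priority process

hits_per_second = hits_per_day / SECONDS_PER_DAY
kbytes_per_second = megs_per_day * 1024 / SECONDS_PER_DAY
load_without_dnet = load_with_dnet - dnet_contribution

print(f"{hits_per_second:.2f} hits/sec")          # well under one request a second
print(f"{kbytes_per_second:.1f} KB/sec served")   # a trickle, even for 1996-era disks
print(f"~{load_without_dnet:.2f} load without dnet")
```

Works out to roughly a third of a hit per second and a couple of kilobytes per second of traffic, which is why even a 66 MHz 68060 is idle most of the time.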

If that weren’t good enough, NetBSD 1.5 (we’re running 1.4.2) has a much
improved virtual memory system (UVM), improvements and speedups in the TCP stack (and complete IPv6 support), scheduler enhancements, good softdep support in the filesystem (as if two 10k rpm 18 gig IBM wide ultra drives aren’t fast enough), and more.

In other words, things are only going to get better.

The other question you raise (sort of) is why Linux gets so much more
attention than the BSD flavors. I’m still trying to figure that one
out. Part of it is probably due to the existence of Red Hat and
Caldera and others. FreeBSD gets some promotion from Walnut
Creek/BSDi, but one only has to look at the success of Slackware to
see how that compares.

It’s all hype; people love buzz words, and so a cycle begins: people talk
about Linux, companies spring up to provide Linux stuff, and people hear
more and talk more about Linux.

It’s not a bad thing; anything that moves the mainstream away from
Microsoft is good. However, the current trend in Linux is not good. Red
Hat (the company), arguably the biggest force in popularising Linux in the
US, is becoming less and less like Linux and more and more like a software company. They’re releasing unstable release after unstable release with no apologies. Something I said a little while ago, and someone has been using as his quote in his email:
In the Linux world, all of the major distributions have become
companies. How much revenue would Red Hat generate if their product was flawless? How much support would they sell?

I summarise this by saying that it is no longer in their best interest to
have the best product. It appears to be sufficient to have a working
product they can use to “ride the wave” of popularity of Linux.

I used Linux for a long time, but ultimately I was always frustrated with
the (sometimes significant) differences between the distributions, and
sometimes the differences between versions of the same distribution. Why
was it that an Amiga running AmigaDOS was more consistent with Apache and Samba docs than any particular Linux? Where was Linux sticking all of
these config files, and why wasn’t there documentation saying where the
stuff was and why?

When I first started using BSD, I fell in love with its consistency, its
no bull attitude towards ports and packages, and its professional and clean feel. Needless to say, I don’t do much Linux anymore.

It may well be due to the people involved. Linus Torvalds is a
likeable guy, a smart guy, easily identifiable by a largely computer
illiterate press as an anti-Gates. And he looks the part. Bob Young is
loud and flamboyant. Caldera’s the company that sued Microsoft and probably would have won if it hadn’t settled out of court. Richard
Stallman torques a lot of people off, but he’s very good at getting
himself heard, and the GPL seems designed at least in part to attract
attention. The BSD license is more free than the GPL, but while
freedom is one of Stallman’s goals, clearly getting attention for his
movement is another, and in that regard Stallman succeeds much more than the BSD camp. The BSD license may be too free for its own good.

Yes, there aren’t many “figureheads” for BSD; most of the ones I know of
don’t complain about Linux, whereas Linux people often do complain about the BSD folks (the major complaint being the license).

I know Jerry pays more attention to Linux than the BSDs partly because Linux has a bigger audience, but he certainly knows more about Linux than about any other Unix. Very soon after he launched his website, a couple of Linux gurus (most notably Moshe Bar, himself now a Byte columnist) started corresponding with him regularly, and they’ve made Linux a reasonably comfortable place for him, answering his questions and getting him up and going.

So then it should be their responsibility, as Linux advocates, to give
Jerry a slightly more complete story, in my opinion.

As for the rest of the press, most of them pay attention to Linux only because of the aforementioned talking heads. I have a degree in journalism from supposedly the best journalism school in the free world, which gives me some insight into how the press works (or doesn’t, as is usually the case). There are computer journalists who get it, but a good deal of them are writing about computers for no reason in particular, and their previous job and their next job are likely to be writing about something else. In journalism, if three sources corroborate something, you can treat it as fact. Microsoft-sympathetic sources are rampant, wherever you are. The journalist probably has a Mac sympathy since there’s a decent chance that’s what he uses. If he uses a Windows PC, he may or may not realize it. He’s probably heard of Unix, but his chances of having three local Unix-sympathetic sources to use consistently are fairly slim. His chances of having three Unix-sympathetic sources who agree enough for him to treat what they say as fact (especially if one of his Microsofties contradicts it) are probably even more slim.

Which furthers my previous point: Jerry’s Linux friends should be more
complete in their advocacy.

Media often seems to desire to cater to the lowest common denominator, but it is refreshing to see what happens when it doesn’t; I can’t stand US
news on TV, but I’ll willingly watch BBC news, and will often learn more
about US news than if I had watched a US news program.

But I think that part of the problem, which is compounded by the above, is
that there are too many journalists writing about computers,
rather than computer people writing about computers.

After all, which is more presumptuous: a journalist who thinks that he/she
can enter the technical world of computing and write authoritatively about
it, or a computer person who attempts to be a part time journalist? I’d
prefer the latter, even if it doesn’t include all of the accoutrements
that come from the writings of a real journalist.

And looking at the movement as a whole, keep in mind that journalists look for stories. Let’s face it: A college student from Finland writing an operating system and giving it away and millions of people thinking it’s better than Windows is a big story. And let’s face it, RMS running
around looking like John the Baptist extolling the virtues of something called Free Software is another really good story, though he’d get a lot more press if he’d talk more candidly about the rest of his life, since that might be the hook that gets the story. Can’t you see this one now?

Yes. Both of those stories would seem much more interesting than, “It’s
been over three years and counting since a remote hole was found in
OpenBSD”, because it’s not sensationalistic, nor is it interesting, nor
can someone explain how you might end up running OpenBSD on your
appliances (well, you might, but the fact that it’s secure means that it’d
be as boring as telling you why your bathtub hasn’t collapsed yet).

Richard Stallman used to keep a bed in his office at the MIT Artificial Intelligence Lab.

He slept there. He used the shower down the hall. He didn’t have a home outside the office. It would have distracted him from his cause: Giving away software.

Stallman founded the Free Software movement in 1983. Regarded by many as the prophet of his movement (and looking the part, thanks to his long, unkempt hair and beard), Stallman is both one of its most highly regarded programmers and perhaps its most outspoken activist, speaking at various functions around the world.

Linux was newsworthy, thanks to the people behind it, way back in 1993 when hardly anyone was using it. Back then, they were the story. Now, they can still be the story, depending on the writer’s approach.

If there are similar stories in the BSD camp, I’m not aware of them. (I can tell you the philosophical differences between OpenBSD, NetBSD and FreeBSD and I know a little about the BSD directory structure, but that’s where my knowledge runs up against its limits. I’d say I’m more familiar with BSD than the average computer user, but that’s not saying much.) But I can tell you my editor would have absolutely eaten this up. After he or she confirmed it wasn’t fiction.

The history is a little dry; the only “juicy” part is where Berkeley had
to deal with a lawsuit from AT&T (or Bell Labs; I’m not doing my research
here) before they could make their source free.

Nowadays, people are interested because a major layer of Mac OS X is BSD, taken from the FreeBSD and NetBSD source trees. Therefore, millions of people who otherwise know nothing about BSD or its history will end up running it when Mac OS X Final comes out in January; lots of people are already running Mac OS X Beta, but chances are good that the people who bought the Beta know it’s running on BSD.

And it’s certainly arguable that BSD is much more powerful and robust than Windows 2000. So there’s a story for you. Does that answer any of your question?

Yes; I hope I’ve clarified my issues, too.

Neat site! I’ll have to keep up on it.

Thanks,
John Klos

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like Mac OS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner on a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line. That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites as proof his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server. Though it’s the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or was willing to pay for the hardware necessary.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft has done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows it. Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PCs, but most people would be surprised to hear you can still get parts at all.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC-210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge, while others make it a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? They can have precisely the effect you describe. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend those programs for dialup connections, MTU tweaking and a LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
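For what it’s worth, the arithmetic those tweaking utilities play with is simple. Here’s a rough sketch in Python–the 40-byte constant is the standard IP-plus-TCP header size, and the helper names are mine, not anything from a real tool:

```python
# Rough sketch of the arithmetic behind Win9x MTU tweaking.
# 40 bytes = 20-byte IP header + 20-byte TCP header (no options).

IP_TCP_HEADERS = 40

def mss(mtu):
    """Maximum segment size: the payload left after IP/TCP headers."""
    return mtu - IP_TCP_HEADERS

def header_overhead(mtu):
    """Fraction of each full-sized packet spent on headers."""
    return IP_TCP_HEADERS / mtu

# Ethernet's native MTU vs. a typical dial-up tweak:
print(mss(1500), header_overhead(1500))   # 1460-byte payload, ~2.7% overhead
print(mss(576), header_overhead(576))     # 536-byte payload, ~6.9% overhead
```

A dial-up-style MTU like 576 means roughly two and a half times the per-packet overhead on a LAN that happily carries 1500-byte frames, which is why the first thing to check is whether the setting ever got changed from 1500.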
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel, then adding back TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and it rules out a corrupted TCP/IP stack. Both are remote possibilities, and your reinstall should have eliminated the latter, but they’re worth checking.
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I saw one just a couple of weeks ago, so I don’t rule them out). Also try swapping a different cable or NIC into your wife’s machine. Cables go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any Ethernet NIC other than a Bay Netgear, 3Com, or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double-checked the interrupts in the control panel; there do not appear to be any conflicts, and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a long shot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is the Web browser itself. Whichever browser you’re using, have you by any chance tried installing the other one to see if it has the same problems?
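One way to separate DNS trouble from proxy or transfer trouble is to time each stage of a connection on its own. A hypothetical sketch in modern Python–the function names and the two-second threshold are my own invention, not any tool from the era:

```python
import socket
import time

def time_stage(fn):
    """Run fn and return how long it took, in seconds."""
    start = time.monotonic()
    fn()
    return time.monotonic() - start

def slow_stages(timings, threshold=2.0):
    """Names of the stages that took longer than the threshold."""
    return [name for name, secs in timings.items() if secs > threshold]

def probe(host, port=80):
    """Time DNS resolution and a bare TCP connect separately."""
    timings = {}
    timings["dns"] = time_stage(lambda: socket.gethostbyname(host))
    addr = socket.gethostbyname(host)  # already resolved; connect by IP
                                       # so DNS isn't measured twice

    def connect():
        with socket.create_connection((addr, port), timeout=10):
            pass

    timings["connect"] = time_stage(connect)
    return timings
```

If `slow_stages(probe("example.com"))` comes back as `["dns"]`, name resolution is the bottleneck; if the DNS and connect stages are both quick but pages still crawl, suspicion shifts to a bogus proxy setting or the browser itself.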
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, and when it does perform well, it seems to do so by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

10/26/2000

That Mac problem again. I snuck up to that Mac server I was referring to earlier this week, you know, the one that was beyond my capability to fix? I continue to maintain that no Mac is beyond any experienced tech’s capability to fix (just like no PC is beyond an experienced tech’s ability to fix). However, a Mac is a high-maintenance machine. Normal, typical use of a Mac will cause numerous filesystem errors to build up over time. I’ve heard of people running the full battery of utility suites once a week to keep their Macs happy. Once a month is probably adequate for most.

Anyway, on with the story. The whole department was out for a long lunch, which gave me an hour and a half. That’s long enough to run DiskWarrior and TechTool Pro (this machine has seven drives on it, and some of them are in excess of seven years old, so it takes a while). They marveled that afternoon about how their problem had corrected itself. Can a PC do that?

Grrr…. Sure it can. Especially if a tech can get some time alone with it.

Ray, Ray, Ray… Why aren’t you running your site on Lotus Domino and Solaris or AIX? I try to connect to Groovenetworks.com, and what do I get?

Error Occurred While Processing Request
Error Diagnostic Information
An error occurred while attempting to establish a connection to the service.

The most likely cause of this problem is that the service is not currently running. You can use the ‘Services’ Control Panel to verify that the service is running and to restart it if necessary.

Windows NT error number 2 occurred.

On the other hand, an up-and-down Web server is a very good indication that people want your product.

Now that I’ve seen some screenshots and read some more interviews to get into Ray Ozzie’s mind a little deeper, I’m starting to get Groove. This isn’t so much warmed-over Notes, I think, as it is instant messaging done right. It’s instant messaging. It’s file transfer. It’s got a sketchpad for drawing ideas for sharing. Of course there’s a Web browser window. It’s infinitely extensible (like Notes). And it’ll let you talk voice if you prefer.

I did finally manage to download it, but I haven’t had a chance to install and play with it yet. I can already think of ways I’d use it, though. I’d even collaborate with myself: leave stuff I need frequently in my Groove space on my machines at work. So what if I’m in the wrong building, or worse yet, I’m at home? No problem, connect to myself and get it. (Don’t laugh; if you think instant messaging myself is bad, you’ll love the fact that I sometimes send packages to myself.) For people like me who are called upon to do three people’s jobs, this could be a godsend. And once those other two people get hired, we can still use Groove to work together.

Bigger than Netscape 1.0? I think those predictors have gotten sucked into the Cult of Ray a little too deep. Is it a big deal? Oh yeah. Bigger than Notes? Probably.

As for multiplatform support… Groove opted to support, for now, the 85-90 percent of the market that runs Windows. In interviews, Ray Ozzie has said Linux and Mac OS X ports will appear. Seeing as, three years ago, Linux looked poised to become a force in the server arena but not yet on the desktop, and the classic Mac OS looked likely to be replaced any day by what became Mac OS X, it looks to me like the company made a practical decision: go with what they knew would be around, and adjust course as needed. Notes eventually appeared on every major platform, which contributed to its success, and I expect Groove will follow. I’m certainly not going to hold it against him that the only version to have reached beta is Win32.