Much ado about nothing and other stuff

Much ado about nothing. The most recent report I read indicates that AOL/Time Warner and Red Hat are talking, but not about an acquisition. Sanity has entered the building…
Good thing User Friendly got a chance to get its two cents’ worth in. I got a couple bucks’ worth of laughter from it.
Much ado about something. On Sunday, Gentoo Linux developer Daniel Robbins announced that an obscure AMD Athlon bug slipped past Linux kernel developers, resulting in serious problems with Athlon- and Duron-based systems with AGP cards. This confirms some suspicions I’ve heard–one of the Linux mailing lists I subscribe to occasionally has rumblings about obscure and difficult-to-track-down Athlon problems.

The result was that Gentoo’s site was slashdotted into oblivion for a while, but hopefully it also resulted in some extra exposure for the distribution. Gentoo is another source-based distro. Lately I’ve been resigned to just using Debian to build my Linux boxes, but I’m still awfully fond of the idea of compiling your own stuff. As CPUs get faster and faster, I expect that to become more commonplace.

But I digress. The bug involves the CPU’s paging function. Older x86 CPUs used 4K pages. Starting with the Pentium, CPUs began allowing 4MB pages. But a bug in the Athlon’s implementation of this extended paging causes memory corruption when it’s used in conjunction with an AGP card.
Alan Cox is working on a workaround. I’m a bit surprised a patch isn’t already out there.
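The stopgap reportedly circulating at the time was simply to keep the kernel from using 4MB (PSE) pages at all, via a boot parameter. A minimal sketch, assuming LILO; the image path and label are placeholders, and `mem=nopentium` is the commonly cited option, not something named in this post:

```ini
# /etc/lilo.conf (excerpt; image path and label are hypothetical)
image=/boot/vmlinuz
    label=linux
    read-only
    # Tell the kernel not to use 4MB pages, sidestepping the
    # Athlon/AGP memory corruption until a proper fix lands.
    append="mem=nopentium"
```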

CPU bugs are discovered all the time, but it’s fairly rare for them to be serious. If you ever run across a Pentium-60 or Pentium-66 system, boot up Linux on it sometime and run the command dmesg. You’ll find workarounds for at least two serious bugs. A TI engineer named Robert Collins gained a fair bit of notoriety in the last decade by researching, collecting, and investigating CPU bugs. Part of it was probably due to his irreverent attitude toward Intel, as you can see from this Wayback Machine entry. Sadly, I can’t find the story on the site anymore, since he was bought out by Dr. Dobb’s.
Catching up. I haven’t been making my rounds lately. The reason why is fairly obvious. I used my day off yesterday to have lunch with someone from my small group, then when I got home I read the e-mail I absolutely had to read, responded to those that absolutely had to get responses, answered a couple of voice messages, wrote and sent out a couple of other messages, looked up, and it was 5 p.m.

“Alright God,” I muttered. “I just gave the day to Your people. Time to go spend some time with You.” So I whipped out my handy-dandy Today’s Light Bible and read about Moses. Seemed appropriate. The inadequacy and jumping the gun and making excuses, that is. The Biblical “superheroes” were human just like us, and the book doesn’t gloss over that. Today’s Light is designed to divide the Bible into pieces so you can read the whole thing in two years. I can’t decide if I want to get through it in a year or in six months. A few years ago I read it in its entirety in four months, but that pace is a bit much. If you’re willing to spend as much time reading the Bible every day as the average person does watching TV, you can make it through in a few months. But it’s not exactly light reading, and I’m not sure I recommend that pace. If you’re willing to dedicate that kind of time to Bible study you’re probably better served by learning Greek so you can read the New Testament in the original. Then if you’ve still got your sanity you can think about tackling Hebrew.

I finally got around to reading Charlie Sebold’s entries for the last few days. One especially poignant observation: “I continue to be surprised at how much I remember about computers, and how much I forget about everything else (including far more important things).”

I sure can relate. I wish I could trade everything I remember about IBM PS/2s and Microchannel for something more useful. But I remember goofy baseball statistics too–I can recite the starting lineup and pitching rotation of the 1980 Kansas City Royals (I’ll spare you). But I can’t tell you the names of all seven people I met Sunday night.

What on earth is going on?

AOL-Time Warner in talks to buy Red Hat? I found this this morning. It’s intriguing, but I can’t decide if a buyout would be a good thing or a bad thing. After all, Netscape was in decline when AOL bought it. It nosedived afterward. Obviously, the problem was twofold. When AOL acquired Netscape, they didn’t acquire all of its mindshare. Some of the most talented people got fed up and left. You can take Jim Barksdale or you can leave him. The loss of Marc Andreessen and Jamie Zawinski, though, was substantial.
The second problem was that AOL wasn’t serious about competing. They bought a browser technology and basically sat on it. Netscape 4.x was fundamentally flawed, as even Zawinski acknowledges, although I would argue it was no more fundamentally flawed than IE 4.x. The Gecko engine, on which Netscape 6.x is based, is solid technology, even though it took longer to get to market than anyone had hoped. Although Netscape 6.x won’t bowl anyone over, other browsers based on the technology, such as Galeon, are absolutely fantastic. But AOL chose to release a half-hearted browser with the Netscape name on it and continued to use the IE engine in its flagship product even after the favorable agreement with Microsoft that prompted AOL to do so in the first place expired.

That raises the question of what AOL would do with Red Hat if it owned it. Red Hat is still the big-name player in the Linux field, but Red Hat is concentrating on the server market. You can still buy Red Hat at retail, but on the desktop, Red Hat is arguably #3 in popularity now behind France’s Mandrake and Germany’s SuSE. Red Hat is the only Linux company that’s making money, but that’s largely by selling consulting. That’s not AOL’s core business. At this point, AOL is more of a media company than a technology company. Software just gives AOL more outlets to sell its media content. Consulting doesn’t do that.

The best possible scenario for a Red Hat buyout would be for AOL to, as Microsoft puts it, “eat its own dog food,” that is, rip out the infrastructure it bought from other companies and replace it with the technology it just developed or acquired. Since AOL is largely powered by Sun servers, it wouldn’t be terribly difficult to migrate the infrastructure to Red Hat running on Intel. Then AOL could give a big boost to its newly-acquired services division by saying, “We did it and we can help you do it too.” They can also cite Amazon’s recent successes in moving its infrastructure to Red Hat Linux. There is precedent for that; after AOL bought Time Warner, the entire company started using AOL for e-mail, a move widely questioned by anyone who’s used anything other than AOL for mail.

Of course, it would be expected that AOL would port its online service to Linux, which would create the truly odd couple of the computing field. AOL, meet sed and awk. Red Hat would certainly lose its purity and much of its credibility among the Linux die-hards. AOL would bank on making up the loss by gaining users closer to the mainstream. AOL could potentially put some Linux on its corporate desktops, but being a media company, an all-out migration to Linux everywhere within is very far-fetched.

To really make this work, AOL would either have to enter the hardware business and sell PCs at retail using its newly acquired Red Hat distribution and newly ported AOL for Linux and possibly an AOL-branded office suite based on OpenOffice, or it would have to partner with a hardware company. Partnering with a big name seems unlikely–a Compaq or an HP or an IBM wouldn’t do it for fear of retaliation from Microsoft. Sun has never expressed any interest in entering the retail computer business, and even though Sun loves to take opportunities to harm Microsoft, Sun probably wouldn’t cooperate with AOL if AOL replaced its Sun infrastructure with Red Hat Linux. Struggling eMachines might be the best bet, since it’s strictly a consumer brand, has a large presence, but hasn’t consistently turned a profit. But AOL could just as easily follow eMachines’ example, buying and re-branding low-end Far East clones and selling them at retail as loss-leaders, taking advantage of its lack of need for Windows (which accounts for roughly $75 of the cost of a retail PC) and making its profit off new subscribers to its dialup and broadband services. A $349 PC sold at retail with a flashy GUI, decent productivity software and AOL is all the computer many consumers need.

The advantage to this scenario for everyone else is that AOL would probably dump more development into either the KDE or GNOME projects in order to give itself more and higher-quality software to offer. The official trees can either take these changes or leave them. Undoubtedly, some of the changes would be awful, and the official trees would opt to leave them. But with its 18 years’ worth of experience developing GUIs, some of the changes would likely be a good thing as well.

The more likely scenario: AOL will buy out Red Hat, not have a clue what to do with it, and Red Hat Linux will languish just like Netscape.

The even more likely scenario: AOL will come to its senses, realize that Red Hat Linux has nothing to do with its core business, and the two companies will go their separate ways.

Building 98 boxes

I knuckled down yesterday at work and started building a new laptop image for some deployed users. What they’re using now isn’t stable and it isn’t fast, and much of the software is dated. So rather than patch yet again, we’re starting over. I built a 98 install, leaving out anything I could (such as Drive Converter, since we’re already using FAT32 over my protests, and Disk Compression, which isn’t compatible with FAT32 and I just know it’s only a matter of time before some end user decides he’s too short on disk space and runs it only to be greeted by a PC that won’t boot).
Law #1: The more you install, the slower the system runs, and no amount of disk or registry optimization will completely make up for that.

After I got a decent 98 install down, I did some cleanup. All the .txt files in the Windows directory? Gone. All the BMP files? See ya. Channel Screen Saver? B’bye. I got the C:\Windows directory down under 150 entries without losing any functionality. There are probably some GIF and JPEG files in there, and possibly some WAVs, that can also go. I’ll have to check. And of course I did my standard MSDOS.SYS tweaks.
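Those MSDOS.SYS tweaks aren’t spelled out here (they’re in his book), but the [Options] entries most commonly passed around for speeding up a Win98 boot look something like this; treat the exact set as illustrative rather than the author’s list:

```ini
; MSDOS.SYS [Options] excerpt -- illustrative, not the author's exact tweaks
[Options]
BootDelay=0   ; skip the "Starting Windows 98" pause
Logo=0        ; suppress the startup splash bitmap
BootMenu=0    ; don't stop at the boot menu
DrvSpace=0    ; don't load the disk-compression drivers
DblSpace=0
```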

Then I defragmented the drive, mostly to get the directories compressed, rebooted, and timed it. 18 seconds. Not bad for a P2-300.

Next, I installed Office 2000. Once I got all that in place, Windows’ boot time ballooned to 32 seconds, which just goes to show how Microsoft apps screw around on the OS’s turf entirely too much–Office makes more changes to the OS than Internet Explorer–but the boot time is still well below what we’ve come to expect from a P2-300.

One of my coworkers had the nerve to say, “Don’t forget to run Cacheman!” Cacheman my ass. I can put vcache entries in system.ini myself, thank you very much. And I can change the file and path cache in the Registry myself, without having to use some lame program to do it. And cleaning up the directories makes a much bigger difference than those hacks do. It just doesn’t make you feel l33t or anything. Heaven forbid we should ever do anything simple and effective to improve system performance.
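For reference, the hand edits described above amount to a couple of lines in system.ini plus values under the FileSystem registry key; the numbers below are examples sized for a machine of that era, not the author’s settings:

```ini
; system.ini excerpt -- example values; tune against installed RAM
[vcache]
MinFileCache=4096    ; disk cache floor, in KB
MaxFileCache=16384   ; cap so vcache can't crowd out applications

; The file/path caches live in the Registry under
; HKLM\System\CurrentControlSet\Control\FileSystem
; (NameCache and PathCache values, normally set indirectly by the
; File System control panel's "typical role" setting).
```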

Law #2: Most of the tweaks floating around on the ‘Net do little more than let you feel like you’ve done something. I condensed the useful tricks into a single book chapter. And I also told you what those tricks really do, and the side effects they have, unlike a certain multi-megabyte Web site hosted on AOL… You can do the majority of the things you need to do by practicing restraint and judiciously using just a small number of software tools.

I know how to make a fast Win98 PC. It’s not like I wrote a book about that or anything…

Oh, but how am I ensuring stability? I’m forcing the issue. Yes, I see that list of 47 software packages they have to have. Here’s Windows and Office 2000 and ACT!. Now they have to test it. Does it crash? OK. Now we’ll add the remaining 44 things, one at a time and see which one is breaking stuff. If it’s unstable by the time all of that’s done, it’s because the end users who were testing were sloppy with their testing.

Craig Mundie’s infamous speech

I haven’t said anything about Microsoft executive Craig Mundie’s speech yet. Everyone’s heard of it, of course, and the typical response has been something along the lines of “Now we know Microsoft’s stance on Open Source.”

No, we’ve always known Microsoft’s stance on that. They’re scared of it. Remember the stereotype of open-source programmers: college students and college dropouts writing software in their basements that a lot of people are using, with the goal of toppling an industry giant. Seem far-fetched? Friends, that’s the story of Microsoft itself. Microsoft became an underground sensation in the late 1970s with Microsoft Basic, a programming language for the Altair and other kit computers and later for CP/M. And while we’ll probably never know the entire story of how and why this happened, when IBM decided to outsource the operating system for the IBM PC, they went to Microsoft and got both an OS and the must-have Microsoft Basic. Ten years later, IBM was just another hardware maker–really big, but getting squeezed. Today, 20 years later, IBM’s still a huge force in the computing industry, but in the PC industry, aside from selling ThinkPads, IBM’s a nobody. There may be hardware enthusiasts out there who’d be surprised to hear IBM makes and sells more than just hard drives.

Ironically, Microsoft’s response to this new threat is to act more and more like the giant it toppled. Shared Source isn’t a new idea. IBM was doing that in the 1960s. If you were big enough, you could see the source code. DEC did it too. At work, we have the source code to most of the big VMS applications we depend on day-to-day. Most big operations insist on having that kind of access, so their programmers can add features and fix bugs quickly. If Windows 2000 is ever going to get beyond the small server space, they really have no choice. But they do it with strings attached and without going far enough. An operation the size of the one I work for can’t get the source and fix bugs or optimize the code for a particular application. You’re only permitted to use the source code to help you develop drivers or applications. Meet the new Microsoft: same as the old Microsoft.

Some people have read this speech and concluded that Microsoft believes open-source software killed the dot-com boom. That’s ludicrous, and I don’t see that in the text. OSS was very good for the dot-com boom. OSS lowered the cost of entry: Operating systems such as FreeBSD and Linux ran on cheap PCs, rather than proprietary hardware. The OSs themselves were free, and there was lots of great free software available, such as the Apache Web server, and scripting languages like Python and Perl. You could do all this cool stuff, the same cool stuff you could do with a Sun or SGI server, for the price of a PC. And not only was it cheaper than everybody else, it was also really reliable.

The way I read it, Microsoft didn’t blame OSS for the dot-com bust. Microsoft blamed the advertising model, valuing market share over revenue, and giving stuff away now, then trying to get people to pay later.

I agree. The dot-com boom died because companies couldn’t find ways to make money. But I’m not convinced the dot-com boom was a big mistake. It put the Internet on the map. Before 1995, when the first banner ad ran, there wasn’t much to the Internet. I remember those early days. When I was a college student in 1993, the Internet was a bonanza to me, even though I wasn’t using it to the extent a lot of my peers were. For me, the Internet was FTP and Gopher and e-mail. I mostly ignored Usenet and IRC. That was pretty much the extent of the Internet. You had to be really determined or really bored or really geeky to get much of anything out of it. The World Wide Web existed, but it was a great mystery to most of us. The SGI workstations on campus had Web browsers. We knew that Mosaic had been ported to Windows, but no one in the crowd I ran in knew how to get it working. When we finally got it running on some of our PCs in 1994, what we found was mostly personal homepages. “Hi, my name is Darren and this is my homepage. Here are some pictures of my cat. Here’s a listing of all the CDs I own. Here are links to all my friends who have homepages.” The running joke then was that there were only 12 pages on the Web, and the main attraction of each of the 12 was links to the other 11.

By 1995, we had the first signs of business. Banner ads appeared, and graduating students (or dropouts) started trying to build companies around their ideas. The big attraction of the Web was that there was all this information out there, and it was mostly free. Online newspapers and magazines sprung up. Then vendors sprung up, offering huge selections and low prices. You could go to Amazon.com and find any book in print, and you’d pay less for it than you would at Barnes & Noble. CDNow.com did the same thing for music. And their ads supported places that were giving information away. So people started buying computers so they could be part of the show. People flocked from closed services like CompuServe and Prodigy to plain-old Internet, which offered so much more and was cheaper.

Now the party’s ending as dot-coms close up shop, often with their content gone forever. To me, that’s a loss only slightly greater than the loss of the Great Library. There’s some comfort for me: Five years from now, most of that information would be obsolete anyway. But its historical value would remain. But setting sentiment aside, that bonanza of freebies was absolutely necessary. When I was selling computers in 1994, people frequently asked me what a computer was good for. In 1995, it was an easier sell. Some still asked that question, but a lot of people came in wanting “whatever I need to get to be able to get on the Internet.” Our best-selling software package, besides Myst, was Internet In A Box, which bundled dialup software, a Web browser, and access to some nationwide provider. I imagine sales were easier still in 1996 and beyond, but I was out of retail by then. Suddenly, you could buy this $2,000 computer and get all this stuff for free. A lot of companies made a lot of money off that business model. Microsoft made a killing. Dell and Gateway became behemoths. Compaq made enough to buy DEC. AOL made enough to buy Time Warner. Companies like Oracle and Cisco, who sold infrastructure, had licenses to print money. Now the party’s mostly over and these companies have massive hangovers, but what’s the answer to the Ronald Reagan question? Hangover or no hangover, yes, they’re a whole heck of a lot better off than they were four years ago.

I’m shocked that Microsoft thinks the dot-com phenomenon was a bad thing.

If, in 1995, the Web came into its own but every site had been subscription-based, this stuff wouldn’t have happened. It was hard enough to swallow $2,000 for a new PC, plus 20 bucks a month for Internet. Now I have to pay $9.95 a month to read a magazine? I could just subscribe to the paper edition and save $2,500!

The new Internet would have been the same as the old Internet, only you’d have to be more than just bored, determined, and geeky to make it happen. You’d also have to have a pretty big pile of cash.

The dot-com boom put the Internet on the map, made it the hot ticket. The dot-com bust hurt. Now that sites are dropping out of the sky or at least scaling operations way back, more than half of the Web sites I read regularly are Weblogs–today’s new and improved personal home page. People just like me. The biggest difference between 1994 and 2001? The personal home pages are better. Yeah, the pictures of the cat are still there sometimes, but at least there’s wit and wisdom and insight added. When I click on those links to the left, I usually learn something.

But there is another difference. Now we know why it would make sense to pay for a magazine on the Internet instead of paper. Information that takes a month to make it into print goes online in minutes. It’s much easier and faster to type a word into a search engine than to leaf through a magazine. We can hear any baseball game we want, whether a local radio station carries our favorite team or not. The world’s a lot smaller and faster now, and we’ve found we like it.

The pump is primed. Now we have to figure out how to make this profitable. The free ride is pretty much over. But now that we’ve seen what’s possible, we’re willing to start thinking about whipping out the credit cards again and signing up, provided the cost isn’t outrageous.

The only thing in Mundie’s speech that I can see that Linus Torvalds and Alan Cox and Dan Gillmor should take offense to is Microsoft’s suspicion of anyone giving something away for free. Sure, Microsoft gives lots of stuff away, but always with ulterior motives. Internet Explorer is free because Microsoft was afraid of Netscape. Outlook 98 was free for a while to hurt Lotus Notes. Microsoft Money was free for a while so Microsoft could get some share from Quicken. It stopped being free when Microsoft signed a deal with Intuit to bundle Internet Explorer with Quicken instead of Netscape. And there are other examples.

Microsoft knows that you can give stuff away with strings attached and make money off the residuals. What Microsoft hasn’t learned is that you can give stuff away without the strings attached and still make money off the residuals. The dot-com bust only proves that you can’t necessarily make as much as you may have thought, and that you’d better spend what you do make very wisely.

The Internet needs to be remade, yes, and it needs to find some sustainable business models (one size doesn’t fit all). But if Mundie thinks the world is chomping at the bit to have Microsoft remake the Internet their way, he’s in for a rude awakening.


01/19/2001

Software of the day: SecurePC, from www.citadel.com. I spent most of yesterday evaluating it. The biggest thing it does that system policies won’t do is prevent the installation of software–in other words, it makes NT live up to the hype it’s had forever. I tried installing about 20 or so programs, using different methods to try to get around it, and I couldn’t. The setup programs either gave bogus error messages, told me installing software had been disabled, died outright, or crashed. In one instance, the setup program started, asked some questions, then told me installing software had been disabled. Nice.

The only things it won’t block are standalone programs, such as Steve Gibson’s self-contained gems, that don’t require any installation. But I’m not so concerned about those. For one, they’re rare. For two, they usually don’t conflict with anything because they don’t venture outside themselves. Their only danger is that they might be virus-infected, but that’s why we install always-on virus protection and push virus definitions.

The goal is to be able to set up PCs for use in the field, get them working right, then lock them down to keep people from breaking them by installing AOL and Webshots and every piece of beta software under the sun.

SecurePC also does a few of the things system policies do, and its user interface is much nicer than Microsoft’s Poledit. Poledit allows finer control of the control panels, so SecurePC doesn’t totally replace it, but the combination of the two will let you really lock a machine down. And frankly, even Windows 95 is pretty reliable as long as it’s running on good hardware and the user doesn’t mess with it.

But SecurePC is obviously targeting companies used to paying someone $100 an hour or more to fix PCs, because it runs $99. A 10-pack of the network version is $550. That’s a bargain for a company, but this software would also be incredibly useful in public computer labs in schools, libraries and churches, which frequently can’t afford that. It’s a shame. Hey, if it were priced lower I’ll bet some people would even buy it for home use. I have one friend who could really use it–it’d keep his 20-year-old brother from messing up his PC.

Tyrannical Security. This kind of software is a draconian measure, but what people all too often forget is that when a PC is sitting on a desk at work, it ceases to be a PC. It’s a CC–corporate computer, not personal computer. It’s a corporate asset, set up the way the corporation dictates. If the corporation says no screen savers, no Webshots, no stupid Yahoo news ticker, no RealAudio, then that’s law. Problem is, that’s impossible to enforce with the tools that come with Windows. But a third-party product to enforce them is a godsend. Computer toys eat memory and CPU cycles, slowing the machine down and thus hurting productivity, and many of these toys are so poorly written as to make Microsoft look like a model of stability. Personally, I can’t wait for the day when Real Networks goes out of business. So these programs go in, break stuff, and then there’s lost productivity while waiting for the tech to arrive, then still more while an overworked tech tries to fix it. If we were to buy 1,000 copies of some security program that works and roll it out to everyone on our network, I’d be willing to bet it would pay for itself in three months.

The number of the day: 146. I use the Al Gore method of taking IQ tests. I keep taking them over and over again until I like the results. They say the 135-145 range looks like a genius to most people; the 145-165 range is a true genius. I’m accused of being a genius frequently enough that I’m probably at least a 135.

So since I climbed 22 points in a day, I can assume I’ll climb another 22 points today if I take another one, which will put me at 168–high genius level. Then I can take another one tomorrow, gain another 22 points, and apply for Mensa membership.

Or I can forget about it and get on with life. I think I like that idea better.

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like MacOS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner–it was a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line.
That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though it’s the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or was willing to pay for the hardware necessary.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft has done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows it. And Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay the blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PC parts, but most people would be surprised to hear you can still get parts at all.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC-210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge while others make it a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can have precisely the effect you describe at times. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend them for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
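To put rough numbers on why a dialup-tuned MTU hurts on a LAN (this is just back-of-the-envelope arithmetic, not anything specific to Bruce’s setup): each frame carries about the MTU minus 40 bytes of IP and TCP headers, so a small MTU multiplies the number of frames every transfer needs.

```python
def frames_needed(payload_bytes, mtu, header_overhead=40):
    """Rough frame count for a transfer: each frame carries
    (mtu - IP/TCP header bytes) of payload. Ignores retransmits,
    options, and link-layer overhead--illustration only."""
    mss = mtu - header_overhead
    return -(-payload_bytes // mss)  # ceiling division

# A 64 KB page at the Ethernet default vs. a dialup-tuned MTU of 576:
print(frames_needed(65536, 1500))  # 45 frames
print(frames_needed(65536, 576))   # 123 frames
```

Nearly triple the frames, and every one of them pays the same per-frame overhead, which is why a tweaked MTU can make a fast LAN connection crawl.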
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel and adding back TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and eliminate the possibility of a corrupted TCP/IP stack. Both causes are remote, but possible–though your reinstall should already have ruled out the latter…
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple weeks ago so I don’t rule them out). Also try swapping in a different cable or NIC in your wife’s machine. Cables of course go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a longshot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is your Web browser itself. Whichever browser you’re using, have you by any chance tried installing and testing the other one to see if it has the same problems?
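For what it’s worth, the process of elimination we’ve been running can be written down as a little decision table. This is my own sketch of the reasoning, nothing more:

```python
def diagnose(lan_fast, telnet_fast, http_fast):
    """Crude triage for 'one machine's web access is slow'--a decision
    table mirroring the elimination argument, for illustration only."""
    if not lan_fast:
        # local file transfer is slow too: physical layer is suspect
        return "suspect cable, NIC, or hub"
    if telnet_fast and not http_fast:
        # raw TCP out to the internet is fine but the browser path isn't
        return "suspect DNS settings, a phantom proxy, or the browser"
    if not telnet_fast:
        # everything past the LAN is slow
        return "suspect the gateway or MTU"
    return "no fault isolated"

# Bruce's symptoms: LAN transfers fast, telnet fast, HTTP crawling.
print(diagnose(True, True, False))
```

Which is why DNS, proxy settings, and the browser itself are what’s left on the list.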
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, or when it does perform well, it seems to be by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

Abandoned intellectual property

Abandoned Intellectual Property. I read a piece on this subject at OSOpinion over the weekend, and I’ve been thinking about it ever since. There are, of course, a lot of people calling for abolition of copyright or radical changes. This is, believe it or not, one of the tamer proposals I’ve read.

I’m definitely of two minds on this one. Take my first ever publication for money, in 1991. Compute Magazine, before Bob Guccione had managed to totally ram it into the ground, opted to buy my spring break project I collaborated on with a friend. We were writing a video game for the Commodore 64 and 128 and we were getting tired of trying to draw the title screen manually with graphics commands (bad enough on the 128 which had Basic commands to do such things, but on the 64 you were talking peeks and pokes all over the place–someone really should have written this thing back in 1982!) so we wrote a program to do the work for us. You loaded the sprites, moved ’em around, hit a key, and it gave you the Basic code to re-create the screen, suitable for inclusion in your program. We never finished the game, but we got a cool $350 and international recognition (OK, so it was a dwindling audience, but how many high school kids can say they’re published authors at age 16?).

Now, the problem. General Media whittled Compute down until it was basically just another PC mag, abandoning the multiplatform support that made it so great (I read about my beloved Commie 8-bits but still got the opportunity to learn about Macs, Amigas and PCs–what could be better?), market share continued to dwindle, and eventually Guccione and GM sold out to Ziff-Davis, who fulfilled your subscription with a choice of mags (I remember I opted for PC/Computing). So the copyright went to Ziff-Davis, who never did anything with the old Compute stuff. A few years later, Ziff-Davis fell on hard times and eventually hacked itself up into multiple pieces. Who owns the old Compute stuff now? I have no idea. The copyrights are still valid and enforceable. I seriously doubt if anyone cares anymore whether you have the Nov. 1991 issue of Compute if you’re running MOB Mover on your 64/128 or emulator, but where do you go for permission?

The same goes for a lot of old software. Sure, it’s obsolete but it’s useful to someone. A 68020-based Mac would be useful to someone if they could get software for it. But unless the original owner still has his/her copies of WriteNow, Aldus SuperPaint and Aldus Persuasion (just to name a few desirable but no-longer-marketable abandoned titles) to give you, you’re out of luck. Maybe you can get lucky and find some 1995 era software to run on it, but it’ll still be a dog of a computer.

But do we have an unalienable right to abandoned intellectual property, free of charge? Sure, I want the recordings Ric Ocasek made with his bands before The Cars. A lot of people want to get their hands on that stuff, but Ocasek’s not comfortable with that work. Having published some things that I regret, I can sympathize with the guy. I like how copyright law condemns that stuff to obscurity for a time. (Hopefully it’d be obscure in the public domain too because it’s not very good, but limiting the number of copies that can exist clinches it.)

Obscurity doesn’t mean no one is exploited by stealing it. I can’t put it any better than Jerry Pournelle did.

I don’t like my inability to walk into record stores and buy Seven Red Seven’s Shelter or Pale Divine’s Straight to Goodbye or The Caulfields’ Whirligig, but I couldn’t easily buy them in 1991 when they were still in print either. But things like that aren’t impossible to obtain: That’s what eBay and Half.com are for.

For the majority of the United States’ existence, the copyright term was 26 years, renewable for another 26. This seems to me a reasonable compromise. Those who produce content can still make a living, and if it’s no longer commercially viable 26 years later, it’s freely available. If it’s still viable, the author gets another 26-year ride. And Congress could sweeten the deal by offering tax write-offs for the premature release of copyrighted material into the public domain, which would offer a neat solution to the “But by 2019, nobody would want WriteNow anymore!” problem. Reverting to this older, simpler law also solves the “work for hire” problem that exploits musicians and some authors.

All around, this scenario is certainly more desirable for a greater number of people than the present one.


Computer ethics

Damsels in distress. Every time I turn around, there’s a girl who needs her computer fixed. Not that I’m complaining. I was having a beer the other night with the music director from my church and told him about it, to which he said, “That’s not a bad situation at all to be in.” He’s right.
So that’s what I was doing Thursday. I don’t exactly get it, because I always have great luck with the PCs I build myself, but when I build a PC for a friend, we always manage to get a bad power supply, or a bad video card, or something else–even though I use the same type of components in their systems as in mine. That’s why I’m not in the computer building business, and I may get out of the business of building them for my friends. I’ll find ’em a good deal if they want, and I’ll play hardball to get a good price and the best components for them, and I’ll gladly set it up for them, but when it comes to procuring all the parts and assembling them, it may be time to give it up.

But I got dinner out of it last night and got to meet some interesting people. That was good.

Computer Ethics. I found out last night that this friend once dated an IT professional I know. I don’t know him well–I didn’t put the name and the face together until she showed me a picture (he knows me better than I know him, apparently). She knew him about eight years ago.

Eight years ago, a typical date for them was him taking her to a weekly 2600 meeting. He evidently learned everything he knew by hacking. We’re not talking writing code here. We’re talking infiltration of systems illegally. At one point he had a notebook full of private phone numbers: people like the Pope and the Prime Minister of Canada. For kicks, he’d call the numbers and record the conversations. He also had her address and phone number in the notebook. One day he left the notebook on top of his car in a parking lot, then drove off. Someone found the notebook, couldn’t believe what was in it, and turned it over to the authorities. Since hers was the only non-VIP address in it, the Secret Service showed up on her doorstep. Her parents were less than amused.

I don’t really understand this. This guy isn’t the only “reformed” hacker I know who has a high-paying, high-security, high-integrity job. And that’s a real problem. If you didn’t have integrity at 18, you probably don’t have it at 25 or 26 either. You can’t count on eight years giving you any measurable amount of maturity, let alone integrity. If you have no respect for other people’s property at 18, you won’t have much of it a few years later. I don’t understand why anyone hires these kinds of people. You can sum up my run-ins with the law really quickly. I’ve been pulled over three times since the age of 16. I received two verbal warnings and a written warning. That’s the extent of it. But I’m not sure I’d trust myself in these people’s jobs.

———-

From: Paul S R Chisholm

There has been a series of excellent articles, written by Martin J. Furey and published at Byte.com, describing how sound cards and microphones can affect the success of using Dragon NaturallySpeaking. Rough summary: 128 MB RAM or better, PIII or Athlon (speech recognition is one of the few applications that can use that much power, and the latest versions have installation options with executables tuned to those processors), very good mike or headset, very good sound card or USB headset, perhaps Win98SE. More detail:

http://www.byte.com/feature/BYT19990720S0003
http://www.byte.com/feature/BYT19991020S0004
http://www.byte.com/feature/BYT19991103S0001

In particular, a PIII or Athlon is supposed to greatly reduce the training time. It’s not clear how much its power is needed once the software is fully trained.

I ordered my Dell system based on these recommendations. (I got a 700 MHz PIII.) Since I didn’t want to spend the time putting a computer together, and since Dell didn’t have much of a sound card choice, I got the USB version of NaturallySpeaking Preferred, which comes with a USB mike in a headset form factor.

I haven’t tried writing a book this way. I did write up technical review comments for a book. In my experience, I could get a rough draft out much faster than if I’d typed it; even after making a review pass, something I probably would have done anyway, and which found some truly odd typos, I think I saved time.

It’s not STAR TREK. One Byte.com reviewer “had to speak like Queen Amidala of the Naboo to make it work right”. I wouldn’t go that far, but I’d lean in that direction.

I had less luck using NaturallySpeaking for total control of my PC. Mouse-clicking was surprisingly good. Saying “Press” and the name of a key was surprisingly bad. (My office mate tried this for a few weeks and had even less luck.)

I’ll leave the final word to John Ousterhout, creator of Tcl/Tk, who dictates even code but still “mouses by hand”:

http://www.scriptics.com/people/john.ousterhout/wrist.html
Good luck! –PSRC

———-

Yes, I read those articles myself after David Pogue suggested I try NaturallySpeaking. So I’ve ordered an Andrea ANC-600 mic, which got good marks in the series, from www.speechcontrol.com (good price and quick delivery; the makers of the highest-rated mics say 6-8 weeks for delivery, while speechcontrol.com can get the ANC-600 to me in 4 days and the owner answers questions very quickly). Now that DNS 5.0 is out, I’m going to order it and an SB Live! Platinum, the successor to the Sound Blaster card that came in second-best (I’m leery of buying the best-rated card, since it’s ISA and there’ll come a time when my fastest PC won’t have ISA slots), and we’ll see how that works. As for a P3 or Athlon system, that is something I’d probably get anyway, but I’ll see if the C400 has enough punch first.

As for ViaVoice, I guess I can hang it on the wall along with all those AOL and MSN CDs.

MP3s won’t kill the music industry

Courtney Love is right… I’m the last to bring this up, but last month Love said what every other musician is thinking. Every other sane one at least. Wanna know why Aimee Mann started her own label? Well, let’s see. She releases a record, on a major, the world yawns. It happened four times straight, from 1986 to 1996. The labels aren’t willing to play the payola game for her. She releases a record on her own label, and look at that… She’s #33 on Amazon.com. And for the first time since she first picked up a bass guitar 20 years ago and started a band, she’s making money making music.
It’s only a matter of time before the public at large tires of payola radio and the mega-trust record industry. I’m not saying they’ll implode, but they’ll be selling Hanson and Backstreet Boys and Britney Spears records (or more likely, their successors) while the more enduring artists find other means to get their work into the hands of the public. It’s good to see Love isn’t afraid of the MP3 format.

I’ve always thought, if porn stars can make money by putting up web sites peddling all the dirty pictures you can download for 10 bucks a month, why can’t rock stars make money by offering an all-you-can-download buffet of music files for a similar price? Most artists can’t keep up a song-a-month rate, true, but you don’t have to. Peddle demos. Record all of your concerts and release those tracks. Broadcast your live shows over the ‘Net. Hawk t-shirts at a discount. Set up a Shoutcast stream of your catalog, circumventing radio entirely (I seem to recall The Cure set up a pirate radio station in Britain and called it CURE-FM for this purpose–but Shoutcast, unlike pirate radio, is legal). It gives people a chance to hear your stuff before whipping out the credit card, then if they like it, they can subscribe to the site or buy a CD or eight. (I find it humorous that it’s Nullsoft, a subsidiary of AOL, that could contribute to the undoing of the music industry, of which future AOL subsidiary Time Warner is a major, major player).

True fans eat up rarities and live cuts and gladly pay for them. Yes, I’ve forked over $30 for really cruddy-sounding Joy Division live albums. I’ve also bought all their commercially available cruddy-sounding live albums. Along with the albums that sound like they were recorded in the men’s room. And the remastered boxed set that includes the albums and singles and b-sides and demos, which sounds like it was recorded in a regular studio. Everything but the out-of-print John Peel session (I’m still kicking myself for not buying that when I saw it back in 1995–I haven’t seen it since). I’m what you’d call a fanatic. But I’m not the only Joy Division fanatic out there. And Joy Division isn’t the only band with large numbers of crazy fans like me.

Joy Division milked two albums and two singles and three years of existence for a remarkable return. You’ve probably never heard of them, but the three surviving members and the lead singer’s widow don’t care, because they’re making a lot more money than any other one-hit wonder from 1980 is. Their medium was vinyl, and later, CD. But they have a following because they made themselves available. With MP3, modern bands can make themselves available for a lot less than Joy Division paid to do it, and they can cut out most of the middlemen.