03/05/2001

Dual CPU blues. I’ve had my dual Celeron-500 apart for a while, for reasons that escape me, and over the weekend I finally got around to putting it back together. At one time this would have seemed an impressive system–Aureal Vortex 2 audio, TNT2 video, dual 500 MHz CPUs (which I’m actually running at around 510 MHz because I bumped the FSB speed up to 68 MHz, within the tolerance levels of most modern peripherals), and 320 MB RAM. But let me tell you–it’s a lot faster than it sounds. The 733-MHz Pentium IIIs at work used to make me jealous. No longer. I’ll put my dualie 500 up against them any day of the week.
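The overclocking math, for the curious, is just multiplication: core clock equals FSB times the CPU’s multiplier, and the Celeron-500’s multiplier is locked at 7.5. A quick sketch:

```python
# Core clock = FSB x multiplier. The Celeron-500's multiplier is locked
# at 7.5, so nudging the FSB is the only way to change the clock speed.
def core_clock(fsb_mhz, multiplier=7.5):
    return fsb_mhz * multiplier

print(core_clock(66))  # nominal 66 MHz bus: 495.0 MHz, sold as "500"
print(core_clock(68))  # bumped to 68 MHz: 510.0 MHz
```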

Just out of curiosity, I tried my CPU stress test from last week on it. No matter what I did, I couldn’t get CPU usage up to 100 percent. I’d top out at about 96 percent. I’m not sure if that’s because of the dual CPUs or because I’m running Windows 2000 on it instead of NT4. I’m sure a complex Photoshop filter could max both chips out, but that’s not what I do. I fired up Railroad Tycoon II, and it was unbelievable. CPU usage hovered around 60 percent and it was smooth as silk, even with the more system-intensive scenarios from the Second Century add-on pack.
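The test itself is nothing exotic. Here’s a sketch of the idea in Python (a hypothetical stand-in, not the actual program I used): spawn one pure busy-loop process per CPU, then watch Task Manager.

```python
import multiprocessing
import time

def spin(seconds):
    """Busy-loop for the given duration, keeping one CPU pegged."""
    end = time.time() + seconds
    while time.time() < end:
        pass  # pure CPU work: no sleeping, no I/O

if __name__ == "__main__":
    # One spinner per CPU; on a dual-processor box this should push
    # both chips toward 100 percent for the duration.
    workers = [multiprocessing.Process(target=spin, args=(1,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```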

Unfortunately, the golden age of inexpensive multiprocessing is over, at least for now. Current Celerons won’t do SMP. I understand why–Intel doesn’t want you to buy two cheap CPUs instead of one expensive one. Like I said, I’ll take my dual 500s over a P3-733 any day of the week. A P3-733 costs about $200. My 500s were 40 bucks a pop. So, unfortunately, to get dual processing these days, you have to get a pair of P3s, which will start at about $140 apiece for a P3-667. The least expensive SMP board I know of is the VIA-based Abit VP6, which sells for about $140. So you’re looking at about $450 to get into dual CPUs by the time you get the board, CPUs and fans. That’s not an outrageous deal, but seeing as an Abit BP6 and a pair of Celerons with fans used to set you back about $350, it’s a shame.

If AMD can ever work through the problems they’re having with the AMD 760MP chipset, it’ll help a little, but not as much as you may think. The AMD-based boards will be expensive–expect them to start at $200 or possibly even $250–because they use a different bus that requires a lot more pins, which adds expense. So while you’ll be able to multiprocess with $60 CPUs again, you’re looking at higher up-front cost. The least expensive dual-Duron rig will only cost about $50 less than the least expensive dual-P3 rig. But the dual-Duron rig stands a decent chance of outrunning the dual-P3, because the clock speed will be higher, and the CPUs each get their own path to all the relevant buses.

And I’ve reached a new low. Last night I had a craving for a burger. So I did what any self-respecting part-time vegetarian who didn’t know any better would do: I went on a quest to find soyburgers. My friend Jeanne, who says I stole the idea of giving up meat for Lent from her (and maybe subconsciously I did) warned me they won’t taste like meat. And I’m pretty sure my dad–whose idea of four servings of vegetables a day was the pickles and ketchup on two hamburgers, beef of course–was rolling his eyes at me from Upstairs (If God has a sense of humor, which wouldn’t surprise me, He opened the portal so Dad could get a good look at the look on my face after the first bite).

And? Well, I guess soyburgers aren’t too much of an atrocity. Better than McDonald’s? Well, yeah, but then again so’s the cherry-flavored fluoride treatment at the dentist’s office. They’re somewhere between beef and imitation bacon bits in both smell and taste. You definitely want to put other stuff on them to distract you–I got some good pickles, some good mustard, and ketchup, and wished I’d gone further. Hmm. Lettuce and tomato, no question. And I’m wondering if alfalfa sprouts would be good on a burger. I’m also wondering where you buy alfalfa sprouts. Oh, and get REALLY good rolls.

I can probably develop a taste for them, but it will definitely be an acquired taste. There was a time, back before I realized I wanted to live past age 27, when I could eat real hamburgers two meals a day for weeks at a time and be perfectly happy–and jokingly wondering why I didn’t eat them for breakfast too. That won’t happen with the soyburgers. I think what’s left of my package of four should get me through Lent.

Oh yeah. They aren’t as good as the real thing and they cost a lot more. What’s up with that? I thought stuff that was lower on the food chain was supposed to be cheaper. I guess that’s only when it’s not being marketed to SUV liberals. (Psst. Marketing tip: SUV liberals like unbleached paperboard. The paperboard that went into my packaging is definitely bleached. And lose the plastic wrap on the burgers. SUV liberals hate that. Good move on putting two burgers per plastic bag though–you’re at least thinking a little. But you gotta go all the way. That’s why they put two “Be Kind to Mother Earth” bumper stickers–printed on unbleached material, of course–on their Ford Excursions.)

I think I’ll be eating a lot of mushroom ravioli for the next few weeks, if I can ever find someplace that sells it again. You’d think in St. Louis, of all places–where there are almost as many good Italian restaurants as there are stop signs–you’d be able to find mushroom ravioli. I guess true blue St. Louisans like beef.

01/27/2001

More reviews of reviews. I liked how yesterday went, and I found some really good stuff, so let’s continue on and see what’s good and why.

2001 Upgrade Guide (Ace’s Hardware)

This is an outstanding upgrade guide. It works from the assumption that you have an older system (a K6-2 or Celeron with a TNT2 board, which is a pretty common setup), then tests a number of upgrades so you can see what makes a difference. Unfortunately, the upgrade candidates already have a modern hard disk and sound card, so they don’t closely simulate a real-world system. But they do isolate the components, so while these upgraded systems will outperform yours, you can see precisely what effect upgrading, say, the video card will have.

For example, you can see right away from their graphs that replacing a K6-2’s TNT2 video card with a GeForce 2 GTS will only improve Half-Life frame rates slightly (up to 25.5 from 22.1), while trading up to a Duron 850 while keeping all the same peripherals increases rates to 51.8 from 22.1. How valuable is that information? I found a GTS card for $229. The same place has a Duron 850/Gigabyte 7ZX-1 bundle for $222. The upgrades cost the same amount, yet one of them increases performance significantly while the other just barely helps. It’s the difference between throwing away $240 and spending $235 wisely (after shipping).
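To spell out that arithmetic, here’s the comparison as a quick script, using the prices and Half-Life frame rates above:

```python
baseline_fps = 22.1  # K6-2 with TNT2, from the Ace's Hardware numbers

# (upgrade, price in dollars, resulting fps)
upgrades = [
    ("GeForce 2 GTS video card", 229, 25.5),
    ("Duron 850 + Gigabyte 7ZX-1 bundle", 222, 51.8),
]

for name, price, fps in upgrades:
    gain = fps - baseline_fps
    print(f"{name}: +{gain:.1f} fps, {gain / price * 100:.2f} fps per $100")
```

The CPU bundle buys almost ten times the frame rate per dollar of the video card.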

The other great thing about this guide is that it tests more than just first-person shooters. For FPS games, DDR gives marginal improvements indeed, but for other types of games, its improvement can be immense. Mercedes-Benz Truck Racing and Formula One 2000, for instance, are faster with a DDR-equipped Duron 850 than they are with a PC133-equipped Athlon 1100.

This guide shows when a GHz+ CPU and new memory technology make sense and when they don’t, letting you decide whether buying the latest and greatest is worth it.

Overall: great methodology, nice balance of real-world tests (assuming gaming’s your thang, which it probably is if you read this stuff, since you won’t see much difference between a Celeron 667 and a 1.2 GHz Athlon for office apps). A lot of work goes into guides like this, but it’s worth it. Maybe someday articles like this will be the norm on the hardware sites, rather than the exception. Hey, I can dream, can’t I?

VIA Apollo Pro 266 (THG)

This is an analysis piece combined with a preview of VIA’s Apollo Pro 266 chipset. It gives a good explanation of PC architecture for anyone who doesn’t know what the north bridge and south bridge are, plus the benchmarks use boards you can actually buy, rather than reference designs.

Tom Pabst takes his usual swipes at Rambus, and points out that the Pentium III isn’t really able to take advantage of DDR, as evidenced by its similar performance to Rambus- and PC133-equipped systems. Pabst concludes with an assertion that a DDR Pentium 4 chipset would prove how terrible Rambus really is, since the bottleneck with DDR seems to be the CPU rather than the memory itself. Unfortunately, he doesn’t provide anything at all to back up this claim, so he comes off as an anti-Rambus bigot. Has he seen a P4 run with DDR? Maybe he’s under NDA, but if he is, he can at least say, “I can’t tell you why I know this, but DDR chipsets for the P4 will prove how worthless Rambus is,” and it would be better than what he wrote. As it stands, his speculation about DDR performance on the P4 is no more valuable than yours.

This article does give the useful information that DDR on the Pentium III probably isn’t worth the bother.

Value Biz PC Guide (Sharky Extreme)

Unusual for hardware sites, this one focuses on what’s necessary for business. No benchmarks; I’d have liked to see illustrations of why CPU speed isn’t as important as, say, disk speed for business apps. Hardware recommendations are solid, and I’m happy to see they don’t assume businesses overclock. They don’t. I disagree with the $100 CD-R recommendation; you’re better off with a Plextor drive with Burn-Proof, especially since such a drive will allow you to multitask. Since time is money, businesses can’t afford to waste time burning coasters. If a slower, cheaper CPU is necessary in order to afford a better CD-R, then so be it.

Some discussion of when SCSI would be appropriate on the desktop also would have been nice, as SCSI does have its place in the office.

But overall, this is a solid guide. By blindly following its advice, you’ll build a better PC than you’ll get from many of the direct PC vendors.

Internet Connection Sharing (Dan’s Data)

Nice, down-to-earth, and pretty thorough overview of what it takes to share an Internet connection, aimed primarily at people who are less ambitious than me–an old 386 or 486 running Linux isn’t among the options he presents. I guess he could have titled it “ICS for the Rest of Us.”

This is thorough without getting bogged down in particulars; it’s cross-referenced with an outstanding Networking 101 piece by the same author, and weird jargon is linked to an online dictionary. Some reviews of the various options would be nice, but he gives a good thumbnail sketch of each option’s advantages and drawbacks. The author, Dan Rutter, is a mainstream computer journalist in Australia who seems to have a very high standard for his work.

Definitely bookmark his networking piece, http://www.dansdata.com/network.htm , and if you keep a notebook, print out a copy to put there as well, as it’s an outstanding overview that answers most of the common networking questions, like the difference between a hub and a switch. You may find yourself referring back to the ICS piece as well, but it’s more specialized and, as such, not as generally useful.

His other stuff is useful, well-written, and downright entertaining. Few computer writers are fun to read. Dan Rutter usually is. Many people consider Ace’s Hardware the best of the hardware sites, but I really think Dan’s Data gives Ace’s a big-time run for the money.

01/26/2001

Hey hey! It works! The server was down all day yesterday, which was a shame. I wanted to try a new experiment. So I’ll try it today.

Storage Review took some criticism on Wednesday morning for its critiques of other hardware sites’ reviews. I disagree with this criticism; many of the reviews out there are atrocities, with poor methodology, hearsay, reviewer ignorance, and other shortcomings. Sometimes these reviews are more misleading than the information in the products’ advertising or packaging! I believe Storage Review is well within professional bounds to point out these shortcomings when they find them.

The mainstream media does this all the time. Columnists and editors will criticize the reporting done in other publications. Most newspapers also employ one person, known as the ombudsman, whose job it is to criticize and/or defend, as appropriate, the publication’s own work.

Seeing as the hardware sites out there often do very sloppy work, even compared to the mainstream media, some policing of it is a very good thing.

Then, over lunch, the idea hit me. Why not do some critiquing myself? I’m trained in editorial writing and editing. I have some experience as a reviewer. And I’ve published a fair bit of my own work in the arena of technology journalism–newspaper columns, a book, individual magazine articles, a series… So I’m qualified to do it, even though I’m not the biggest name out there. And that kind of content is certainly more useful than the “this is how my day went” stuff I’ve been posting way too often.

I’m not so arrogant as to assume that the webmasters of these large sites are in my readership and would take my advice. I don’t expect to change them directly. What I do expect to do is to raise people’s expectations a little. By pointing out what’s good and what’s not so good, hopefully I can raise the public consciousness a little, and indirectly influence some of these sites. If not, then at least my readers are better informed than they otherwise would be, and that’s definitely a good thing.

KT-133A roundup (Tom’s Hardware Guide)

This is a roundup of six VIA KT133A boards. Good review overall. Unlike some sites’ reviews, it doesn’t get bogged down in three pages of history that look like a cut-and-paste job from the last similar review, but it does give just enough history for proper perspective. It would have been nice to mention that it took EDO and SDRAM some time to show their advantages as well–DDR is no more a failure than the technologies that came before it. Unusual for Tom’s, this review isn’t obsessed with overclocking either. There’s lots of useful information, such as the memory modules tested successfully with each board. Inclusion of the DFI AK74-AC, which will never be released, is questionable. I can see including a reference design, but a cancelled commercial board doesn’t seem to make much sense. You can get an idea from its scores why it got the axe; it was consistently one of the bottom two boards in the roundup.

Emphasis was on performance, not stability, but Pabst and Schmid noted they had no compatibility or stability problems with these boards. Stability in benchmarks doesn’t guarantee stability in the real world, but it’s usually a good indication. As tight as the race is between these boards, stability is more important than speed anyway, and since the majority of people don’t overclock, the effort to at least mention compatibility and stability is refreshing.

Socket 7 Upgrade Advice (AnandTech)

This is a collection of upgrade advice for Socket 7 owners. This review, too, doesn’t get too bogged down in history, but the mention of fake cache is noteworthy. This was a PC Chips dirty trick, dating back to 1995 or so, before the K6 series. It wasn’t a very common practice and didn’t last very long–certainly not as long as the article suggests.

Lots of good upgrade advice, including a short compatibility list and pitfalls you can expect. Also included are some benchmarks, but it would have been nice if they’d included more vintage chips. The oldest chip included was the K6-2/450, and AMD sold plenty of slower chips. You can’t extrapolate the performance of a K6-2/300 under the same conditions based on the 450’s score.

Also, the rest of the hardware used is hardly vintage–you’re not likely to find an IBM 75GXP drive and a GeForce 2 video card in an old Socket 7 system. Using vintage hardware would have given more useful results, plus it would have given the opportunity to show what difference upgrading the video card and/or CPU makes, which no doubt some Socket 7 owners are wondering about. Testing these chips with a GeForce does demonstrate that a more modern architecture will give better performance–it exposes the weaknesses of the CPU–but an indication of how much a new CPU would improve a three-year-old PC would be more useful to most people. Few people have the delusion that a K6-3+ is going to challenge an Athlon or P3. They just want to know the best way to spend their money.

No deceiving graphics or lack of knowledge here; what’s in this article is good stuff and well written. It’s just too bad the testing didn’t more closely resemble the real world, which would have made it infinitely more useful.

Memory Tweaking Guide (Sharky Extreme)

This is a nice introduction to the art of memory tweaking, and it explains all those weird acronyms we hear about all the time but rarely see explained. Good advice on how to tweak, and good advice on how to spend your memory money wisely. They disclosed their testbed and included the disclaimer that your results will vary from theirs–their benchmarks are for examples only. The only real gripe I have is that the benchmark graphs, like all too many on the Web, don’t start at zero. From looking at the graph, it would seem that Quake 3 runs six times as fast at 640x480x16 as at 1600x1200x16, when in reality it runs about twice as fast. Graphing this way, as any statistics professor will tell you, is a no-no because it exaggerates the differences way too much.
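You can put numbers on the distortion. These frame rates are made up, but they’re in the right ballpark: a true 2x difference, drawn on a y-axis that starts at 48 instead of zero, looks like a 6x difference:

```python
def bar_height(value, axis_start):
    """Height of a bar as actually drawn, given where the y-axis starts."""
    return value - axis_start

fast, slow = 120.0, 60.0  # hypothetical Quake 3 scores at the two resolutions
true_ratio = fast / slow                                      # 2.0
apparent_ratio = bar_height(fast, 48) / bar_height(slow, 48)  # 72/12 = 6.0
print(true_ratio, apparent_ratio)
```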

Asus CUSL2C Review (Trainwrecker)

This is a review of the Asus CUSL2C, an i815-based board intended for the average user. This review has lots of good sources for further information, but unfortunately it also has a little too much hearsay and speculation. Some examples:

“Of course, Asus won’t support this [cable] mod and we’re pretty sure that doing it will void your warranty.” Of course modifying the cable on an Asus product, or any other manufacturer’s product, will void your warranty. So will overclocking, which they didn’t mention. Overclockers are either unaware of this or apathetic about it. In matters like this, assertiveness is your friend–it gives a review credibility. One who is assertive and wrong is more believable than one who is wishy-washy and right.

“Arguably, Asus provides the best BIOS support in the business. We believe Asus develops their BIOS’s at their facility in Germany.” Indeed, Asus claims to have re-written over half the code in their BIOSes, which is one reason why Asus boards perform well historically. Most motherboard manufacturers make at least minor modifications to the Award, AMI, or Phoenix BIOS when they license it, but Asus generally makes more changes than most. This claim is fairly well known.

I was also disappointed to see a section heading labeled “Windows 2000,” which simply consisted of a statement that they didn’t have time to test under Windows 2000, followed by lots of hearsay, but at least they included workarounds for the alleged problems. Including hearsay is fine, and some would say even beneficial, as long as you test the claims yourself. This review would have been much more useful if they had delayed the review another day and tested some of the claims they’ve heard.

There’s some good information here, particularly the links to additional resources for this board, but this review is definitely not up to par with the typical reviews on the better-known sites.

DDR Analysis (RealWorldTech)

Good perspective here, in that DDR is an incremental upgrade, just like PC133, PC100, PC66 SDRAM, and EDO DRAM were before it. But I don’t like the assertion that faster clock speeds would make DDR stand out. Why not actually test it with higher-speed processors to show how each of the technologies scales? Testing each chipset at 1 GHz at a minimum, in addition to 800 MHz, would have been nice; you can’t get a P3 faster than 1 GHz, but testing the Athlon chipsets at 1.2 GHz would add to the enlightenment. Why settle for assertions alone when you can have hard numbers?

Also, the assertion “And don’t forget, even though things like DDR, AGP, ATA/100 and other advancements don’t amount to a significant gain all on their own, using all of the latest technology may add up to a significant gain” is interesting, but it’s better if backed up with an example. It’s possible to build two otherwise similar systems, one utilizing AGP, ATA-100 and DDR, and another utilizing a PCI version of the same video card, a UDMA-33 controller, and PC133 SDRAM, and see the difference. Unfortunately you can’t totally isolate the chipsets, so minor differences in the two motherboards will keep this from being totally scientific, but it will suffice for demonstrating the trend. Ideally, you’d use two boards from the same manufacturer, using chipsets of like vintage from the same manufacturer. That pretty much limits us to the VIA Apollo Pro series and a Pentium III CPU.

And if you’re ambitious, you can test each possible combination of parts. It’s a nice theory that the whole may be greater than the sum of the parts, and chances are a lot of people will buy it at face value. Why not test it?
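Enumerating that test matrix shows how manageable it is. Assuming the three component choices above, each with a new and an old option, eight builds cover every combination:

```python
import itertools

# Hypothetical component options, per the discussion above.
options = {
    "video":   ["AGP GeForce 2", "PCI GeForce 2"],
    "storage": ["ATA-100", "UDMA-33"],
    "memory":  ["DDR", "PC133 SDRAM"],
}

# One test configuration per combination: 2 x 2 x 2 = 8 systems.
configs = [dict(zip(options, combo))
           for combo in itertools.product(*options.values())]

for cfg in configs:
    print(cfg)
```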

This reminds me of a quote from Don Tapscott, in a Communication World interview from Dec. 1999, where he spelled out a sort of communication pecking order. He said, “If you provide structure to data, you get information. And if you provide context to information, you get knowledge. And if you provide human judgment and trans-historical insights, perhaps we can get wisdom.”

This analysis has good human judgment and trans-historical insights. It has context. It has structure. The problem is it doesn’t have enough data, and that’s what keeps this from being a landmark piece. Built on a stronger foundation, this had the potential to be quoted for years to come.

01/13/2001

Have I been brainwashed by Redmond? In the wake of MacWorld, Al Hawkins wrote a piece that suggested maybe so. My post from Thursday doesn’t suggest otherwise.

So let’s talk about what’s wrong with the PC industry. There are problems there as well–problems across the entire computer industry, really. The biggest difference, I think, is that the big guns in the PC industry are better prepared to weather the storm.

IBM’s PC business has been so bad for so long, they’ve considered pulling out of the very market they created. They seem to be turning it around, but it may only be temporary, and their profits are coming at the expense of market share. They retreated out of retail and eliminated product lines. Sound familiar? Temporary turnarounds aren’t unheard of in this industry. IBM as a whole is healthy now, but the day when they were known as Big Black & Blue isn’t so distant as to be forgotten. But IBM’s making their money these days by selling big Unix servers, disk drives, PowerPC CPUs and other semiconductors, software, and most of all, second-to-none service. The PC line can be a loss leader, if need be, to introduce companies to the other things IBM has to offer.

Compaq is a mess. That’s why they got a new CEO last year. But Compaq is a pretty diverse company. They have DEC’s old mini/mainframe biz, they have DEC’s OpenVMS and Digital Unix (now Tru64 Unix) OSs, they have DEC’s Alpha CPU architecture, and DEC’s widely acclaimed service division, which was the main thing that kept DEC afloat and independent in its day. Compaq also has its thriving server business, a successful line of consumer PCs and a couple of lines of business PCs. The combined Compaq/DEC was supposed to challenge IBM as the 800-pound gorilla of the industry, and that hasn’t happened. Compaq’s a big disappointment and they’re having growing pains. They should survive.

HP’s not exactly in the best of shape either. They’ve made a lot of lunkhead decisions that have cost them a lot of customers, most notably by not releasing drivers for their widely popular printers and scanners for newer Microsoft operating systems. Developing those drivers costs money, but not releasing them will cost customers in the long run, so it was probably a very short-sighted decision. But HP’s inkjet printers are a license to print money, with the cartridges being almost pure profit, and HP and Compaq are the two remaining big dogs in retail. Plus they have profitable mainframe, Unix, and software divisions as well. They’ve got a number of ways to return to profitability.

The holidays weren’t kind to Gateway. They actually had to resort to selling some of their surplus inventory in retail stores, rather than using the stores as a front for their build-to-order business as intended.

Dell’s not happy with last year’s results either, so they’re looking to diversify and give themselves less dependence on desktop PCs. They’re growing up, in other words. They’re killing IBM and Compaq in PCs, and those companies are still surviving. Dell wants a piece of that action.

Intel botched a number of launches this year. They had to do everything wrong and AMD had to do everything right in order for AMD to continue to exist. That happened. AMD’s past problems may have been growing pains, and maybe they’re beyond it now. We shall see. Intel can afford to have a few bad quarters.

As for their chips, we pay a certain price for backward compatibility. But, despite the arguments of the Apple crowd, x86 chips don’t routinely melt or require refrigerants unless you overclock. All of my x86 chips have simple fans on them, along with smaller heatsinks than a G4 uses. I’ve seen many a Pentium III run on just a heatsink. The necessity of a CPU fan depends mostly on case design. Put a G4 in a cheap case with poor airflow and it’ll cook itself too.

Yes, you could fry an egg on the original Pentium-60 and -66. Later revisions fixed this. Yet I still saw these original Pentiums run on heat sinks smaller than the sinks used on a G4. The Athlon is a real cooker, so that argument holds, but as AMD migrates to ever-smaller trace widths, that should improve. Plus AMD CPUs are cheap as dirt and perform well. The Athlon gives G4-like performance and high clock speeds at a G3 price, so its customers are willing to live with some heat.

And Microsoft… There are few Microsoft zealots left today, and they’re getting rarer. Microsoft hasn’t given us anything, yet we continue to buy MS Office, just like Mac users. We curse Microsoft and yet send millions and billions their way, just like Mac users. We just happen to buy the OS from them too. And while we curse Microsoft bugs, and many of us make a living deploying Windows-based PCs (though the dozen or so Macs I’m responsible for keep me busier than the couple of hundred PCs), for the most part Windows works. Mac owners talk about daily blue screens of death, but I don’t know when I last got one. I probably get one or two a year. I currently have eight applications running on my Windows 98 box. OS/2 was a far better system than Windows, but alas, it lost the war.

I can’t stand Microsoft’s imperialism and I don’t like them fighting their wars on my hardware. They can pay for their own battlefield. So I run Linux on some of my boxes. But sometimes I appreciate Windows’ backward compatibility.

I always look for the best combination of price, performance, and reliability. That means I change platforms a lot. I flirted with the Mac in 1991, but it was a loveless relationship. The PCs of that era were wannabes. I chose Amiga without having used one, because I knew it couldn’t possibly be as bad as Windows 3.0 or System 7.0. I was right. By 1994, Commodore had self-destructed and the Amiga was perpetually on the auction block, so I jumped ship and bought a Compaq. Windows 3.1 was the sorriest excuse I’d seen for a multitasking environment since System 7.0 and Windows 3.0. I could crash it routinely. So I switched to OS/2 and was happy again. I reluctantly switched to Windows 95 in 1996. I took a job that involved a lot of Macs in 1998, but Mac OS 8.5 failed to impress me. It was prettier than System 7 and if you were lucky you could use it all day without a horrible crash, but with poor memory management and multitasking, switching to it on an everyday basis would have been like setting myself back 12 years, so the second date wasn’t any better than the first.

Linux is very interesting, and I’ve got some full-time Linux PCs. If I weren’t committed to writing so much about Windows 9x (that’s where the money is), Linux would probably be my everyday OS. Microsoft is right to consider Linux a threat, because it’s cheaper and more reliable. Kind of like Windows is cheaper and more reliable than Mac OS. Might history repeat itself? I think it could.

The computer industry as a whole isn’t as healthy this year as it was last year. The companies with the most resources will survive, and some of the companies with fewer will fold or be acquired. The reason the industry press is harder on Apple than on the others is that Apple is less diversified than the others, and thus far more vulnerable.

01/11/2001

Mailbag:

My docs; Apple; Lost cd rom drive

It’s that time of year again. MacWorld time. I work with Macs way too much, so of course I have opinions. If you expect me to withhold them, you don’t know me very well.

Let’s face it: Apple’s in serious trouble. Serious trouble. They can’t move inventory. The Cube is a bust–unexpandable, defect-ridden, and overpriced. Buying a Cube is like marrying a gorgeous airhead: after the looks fade in a few years, you’re permanently attached to an airhead. So people buy the low-end G4 tower instead, which costs less than the Cube and offers better expandability, or they get an iMac, which costs less still.

Unfortunately, that gorgeous airhead metaphor goes a long way with Apple. The Mac’s current product line is more about aesthetics than anything else. So they’ve got glitzy, glamorous cases (not everyone’s cup of tea, but hey, I hear some people lust after Britney Spears too), but they’re saddled with underpowered processors dragged down by an operating system less sophisticated under the hood than the OS Commodore shipped with the first Amiga in 1985. I don’t care if your PowerPC is more efficient than an equivalently clocked Pentium 4 (so’s a VIA Cyrix III, but no one’s talking about it), because if your OS can’t keep that CPU fed with a steady stream of tasks, it just lost its real-world advantage.

But let’s set technical merit aside. Let’s just look at pure practicalities. You can buy an iMac for $799. Or, if you’re content with a low-end computer, for the same amount of money you can buy a low-end eMachine and pair it up with a 19-inch NEC monitor and still have a hundred bucks left over to put towards your printer. Yeah, so the eMachine doesn’t have the iMac’s glitzy looks. I’ll trade glitz for a 19-inch monitor. Try working with a 19-inch and then switch to a 15-inch like the iMac has. You’ll notice a difference.

So the eMachine will be obsolete in a year? So will the iMac. You can spend $399 for an accelerator board for your iMac. Or you can spend $399 for a replacement eMachine (the 19-inch monitor will still be nice for several years) and get a hard drive and memory upgrade while you’re at it.

On the high end, you’ve got the PowerMac G4 tower. For $3499, you get a 733 MHz CPU, 256 MB RAM, 60 GB HD, a DVD-R/CD-R combo drive, internal 56K modem, gigabit Ethernet you won’t use, and an nVidia GeForce 2 MX card. And no monitor. Software? Just the OS and iMovie, which is a fun toy. You can order one of these glitzy new Macs today, but Apple won’t ship it for a couple of months.

Still, nice specs. For thirty-five hundred bucks they’d better be nice! Gimme thirty-five hundred smackers and I can build you something fantabulous.

But I’m not in the PC biz, so let’s see what Micron might give me for $3500. For $3514, I configured a Micron ClientPro DX5000. It has dual 800 MHz Pentium III CPUs (and an operating system that actually uses both CPUs!), 256 MB of RDRAM, a 7200 RPM 60 GB hard drive, a DVD-ROM and CD-RW (Micron doesn’t offer DVD-R, but you can get it third-party if you must have one), a fabulous Sound Blaster Live! card, a 64 MB nVidia GeForce 2 MX, and in keeping with Apple tradition, no monitor. I skipped the modem because Micron lets me do that. If you must have a modem and stay under budget, you can throttle back to dual 766 MHz CPUs and add a 56K modem for $79. The computer also includes Intel 10/100 Ethernet, Windows 2000, and Office 2000.

And you can have it next week, if not sooner.

I went back to try to configure a 1.2 GHz AMD Athlon-based system, and I couldn’t get it over $2500. So just figure you can get a machine with about the same specs, plus a 19-inch monitor and a bunch more memory.

Cut-throat competition in PC land means you get a whole lot more bang for your buck with a PC. And PC upgrades are cheap. A Mac upgrade typically costs $400. With PCs you can often just replace a CPU for one or two hundred bucks down the road. And switching out a motherboard is no ordeal–they’re pretty much standardized at this point, and PC motherboards are cheap. No matter what you want, you’re looking at $100-$150. Apple makes it really hard to get motherboard upgrades before the machines are obsolete.

It’s no surprise at all to me that the Mac OS is now the third most-common OS on the desktop (fourth if you count Windows 9x and Windows NT/2000 as separate platforms), behind Microsoft’s offerings and Linux. The hardware is more powerful (don’t talk to me about the Pentium 4–we all know it’s a dog, that’s why only one percent of us are buying it), if only by brute force, and it’s cheaper to buy and far cheaper to maintain.

Apple’s just gonna have to abandon the glitz and get their prices down. Or go back to multiple product lines–one glitzy line for people who like that kind of thing, and one back-to-basics line that uses standard ATX cases and costs $100 less off the top just because of it. Apple will never get its motherboard price down to Intel’s range, unless they can get Motorola to license the Alpha processor bus so they can use the same chipsets AMD uses. I seriously doubt they’ll do any of those things.

OS X will finally start to address the technical deficiencies, but an awful lot of Mac veterans aren’t happy with X.

Frankly, it’s going to take a lot to turn Apple around and make it the force it once was. I don’t think Steve Jobs has it in him, and I’m not sure the rest of the company does either, even if they were to get new leadership overnight. (There’s pressure to bring back the legendary Steve Wozniak, the mastermind behind the Apple II who made Apple great in the 1970s and 1980s.)

I don’t think they’ll turn around because I don’t think they care. They’ll probably always exist as a niche player, selling high-priced overdesigned machines to people who like that sort of thing, just as Jaguar exists as a niche player, selling high-priced swanky cars to people who like that sort of thing. And I think the company as a whole realizes that and is content with it. But Jaguar’s not an independent company anymore, nor is it a dominant force in the auto industry. I think the same fate is waiting for Apple.

Mailbag:

My docs; Apple; Lost cd rom drive

Pentium 4 performance is precedented

Thoughts on the Pentium 4 launch. No big surprises: a massively complex new processor design, limited availability, and systems from all the usual suspects, at high prices of course. And, as widely reported previously, disappointing performance.

This isn’t the first time this has happened. The Pentium Pro was a pretty lackluster performer too–it ran 32-bit software great, but Win9x was still the dominant OS at the time and it still has a lot of 16-bit code in it. So a 200 MHz Pentium Pro cost considerably more than a 200 MHz Pentium and for most of the people buying it, was significantly slower. History repeats itself…

Intel revised the Pentium Pro to create the Pentium II, with tweaks to improve 16-bit performance, but of course massive clock speed ramps made that largely irrelevant. Goose the architecture to 600 MHz and you’re going to blow away a 200 MHz previous-generation chip.

That’s what you’re going to see here. Intel fully intends to scale this chip beyond 2 GHz next year, and that’s when you’ll see this chip come into its own. Not before. And by then Intel will probably have changed its socket (it intends to change it sometime next year), so buying a P4 today gives you no future-proofing anyway.

It never makes sense to be the first on the block with Intel’s newest chip. Never. Ever. Well, if you’re the only one on the block with a computer, then it’s OK. The P4 has issues. The P3 had issues (remember the serial number?) and was really just a warmed-over P2 anyway. The P2 was a warmed-over Pentium Pro. The Pentium Pro had serious performance issues. The Pentium had serious heat problems and it couldn’t do simple arithmetic (“Don’t divide, Intel inside!”). The last new Intel CPU whose only issue was high price was the 486, and that was in April 1989.

Unless you’re doing one of the few things the P4 really excels at (like encoding MPEG-4 video or high-end CAD), you’re much better off sticking with a P3 or an Athlon and sinking the extra money into more RAM or a faster hard drive. But chances are you already knew that.

Time to let the cat out of the bag. The top-secret project was to try to dual-boot WinME and Win98 (or some other earlier version) without special tools. But Win98’s DOS won’t run WinME, and WinME’s DOS seems to break Win98 (it loads, but Explorer GPFs on boot).

The best method I can come up with is to use the GPL boot manager XOSL. It just seems like more of an achievement to do it without third-party tools, but at least it’s a free third-party tool. You could also do it with LILO or with OS/2’s Boot Manager, but few people will have Boot Manager and LILO will require some serious hocus-pocus. Plus I imagine a lot of people will like XOSL’s eye candy and other gee-whiz features, though I really couldn’t care less, seeing as it’s a screen you look at for only a few seconds at boot time.

Apple, you call this tech support?

This is why I don’t like Apple. Yesterday I worked on a new dual-processor G4. It was intermittent. Didn’t want to drive the monitor half the time. After re-seating the video card and monitor cable a number of times and installing the hardware the computer needed, it started giving an error message at boot:

The built-in memory test has detected a problem with cache memory. Please contact a service technician for assistance.

So I called Apple. You get 90 days’ free support, period. (You also only get a one-year warranty unless you buy the AppleCare extended warranty, which I’m loath to do. But we’d probably better do it for this machine, since it all but screams “lemon” every time we boot it.) So, hey, we can’t get anywhere with this, so let’s start burning up the support period.

The hold time was about 15 seconds. I mention this because that’s the only part of the call that impressed me, and my mother taught me to say whatever nice things I could. I read the message to the tech, who then put me on hold, then came back in about a minute.

“That message is caused by a defective memory module. Replace the third-party memory module to solve the problem,” she said.

“But the computer is saying the problem is with cache, not with the memory,” I told her. (The cache for the G4 resides on a small board along with the CPU core, sort of like the first Pentium IIs, only it plugs into a socket.) She repeated the message to me. I was very impressed that she didn’t ask whether we’d added any memory to the system (of course we had–Apple factory memory would never go bad, I’m sure).

I seem to remember at least one of my English teachers telling me to write exactly what I mean. Obviously the Mac OS 9 programmers didn’t have any of my English teachers.

I took the memory out and cleaned it with a dollar bill, then put it back in. The system was fine for the rest of the afternoon after this, but I have my doubts about this system. If the problem returns, I’ll replace the memory. When that turns out not to be the problem, I don’t know what I’ll do.

We’ve been having some problems lately with Micron tech support as well, but there’s a big difference there. With Apple, if you don’t prove they caused the problem, well, it’s your problem, and they won’t lift a finger to help you resolve it. Compare this to Micron. My boss complained to Micron about the length of time it was taking to resolve a problem with one particular system. You know what the Micron tech said? “If this replacement CPU doesn’t work, I’ll replace the system.” We’re talking a two-year-old system here.

Now I know why Micron has more business customers than Apple does. When you pay a higher price for a computer (whether that’s buying a Micron Client Pro instead of a less-expensive, consumer-oriented Micron Millenia, or an Apple G4 instead of virtually any PC), you expect quick resolution to your computer problems because, well, your business doesn’t slow down just because your computer doesn’t work right. Micron seems to get this. Apple doesn’t.

And that probably has something to do with why our business now has 25 Micron PCs for every Mac. There was a time when that situation was reversed.

The joke was obvious, but… I still laughed really hard when I read today’s User Friendly. I guess I’m showing my age here by virtue of getting this.

Then again, three or four years back, a friend walked up to me on campus. “Hey, I finally got a 64!” I gave him a funny look. “Commodore 64s aren’t hard to find,” I told him. Then he laughed. “No, a Nintendo 64.”

It’s funny how nicknames recycle themselves.

For old times’ sake. I see that Amiga, Inc. must be trying to blow out the remaining inventory of Amiga 1200s, because they’re selling the machine at unprecedentedly low prices. I checked out www.softhut.com just out of curiosity, and I can get a bare A1200 for $170. A model with a 260 MB hard drive is $200. On an Amiga, a drive of that size is cavernous, though I’d probably eventually rip out the 260-megger and put in a more modern drive.

The A1200 was seriously underpowered when it came out, but at that price it’s awfully tempting. It’s less than used A1200s typically fetch on eBay, when they show up. I can add an accelerator card later after the PowerPC migration plan firms up a bit more. And Amigas tend to hold their value really well. And I always wanted one.

I’m so out of the loop on the Amiga it’s not even funny, but as I started reading, so much of it came back. The main commands are stored in a directory called c, and it gets referred to as c: (many crucial Amiga directories are referenced this way, e.g. prefs: and devs:). Hard drives used to be DH0:, DH1:, etc., though I understand they changed that later to HD0:, HD1:, etc.
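
None of that maps cleanly onto DOS or Unix conventions, so here’s a quick illustrative sketch (in Python, purely as a toy model–AmigaDOS itself had nothing to do with Python) of how those device-prefixed paths break apart:

```python
def parse_amiga_path(path):
    """Split an AmigaDOS-style path such as 'DH0:Work/Letters/note.txt'
    into its device (or assign) name and its directory components."""
    if ":" not in path:
        raise ValueError("absolute AmigaDOS paths start with a device name and a colon")
    device, rest = path.split(":", 1)
    # Subdirectories use /, as on Unix; empty segments are ignored.
    return device, [part for part in rest.split("/") if part]

# Device names and assigns like c:, prefs: and devs: parse the same way.
print(parse_amiga_path("DH0:Work/Letters/note.txt"))
# → ('DH0', ['Work', 'Letters', 'note.txt'])
```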

So what was the Amiga like? I get that question a lot. Commodore released one model that did run System V Unix (the Amiga 3000UX), but for the most part it ran its own OS, known originally as AmigaDOS and later shortened to AmigaOS. Since the OS being developed internally at Amiga, Inc., and later at Commodore after they bought Amiga, wasn’t going to be ready in time for a late 1984/early 1985 release, Commodore contracted with British software developer Metacomco to develop an operating system. Metacomco delivered a Tripos-derived OS, written in MC68000 assembly language and BCPL, that offered fully pre-emptive multitasking, multithreading, and dynamic memory allocation. The first and last of those are things even Mac OS 9 doesn’t do yet–OS 9 has multithreading, but its multitasking is cooperative and its memory allocation is static.

Commodore spent the better part of the next decade refining and improving the OS, gradually replacing most of the old BCPL code with C code, stomping bugs, adding features and improving its looks. The GUI never quite reached the level of sophistication that Mac OS had, though it certainly was usable and had a much lower memory footprint. The command line resembled Unix in some ways (using / for subdirectories rather than \) and DOS in others (you used devicename:filename to address files). Some command names resembled DOS, others resembled Unix, and others neither (presumably they were Tripos-inspired, but I know next to nothing about Tripos).

Two modern features that AmigaOS never got were virtual memory and a disk cache. As rare as hard drives were for much of the Amiga’s existence, this wasn’t missed too terribly, though Commodore announced in 1989 that AmigaDOS 1.4 (never released) would contain these features. AmigaDOS 1.4 gained improved looks, became AmigaOS 2.0, and was released without the cache or virtual memory (though both were available as third-party add-ons).

As for the hardware, the Amiga used the same MC68000 series of CPUs that the pre-PowerPC Macintoshes used. The Amiga also had a custom chipset that provided graphics and sound coprocessing, years before this became a standard feature on PCs. This was an advantage for years, but it became a liability in the early 1990s. While Apple and the cloners were buying off-the-shelf chipsets, Commodore had to keep developing its own for the sake of backward compatibility. It revved the chipset once in 1991, but it was too little, too late. While the first iteration stayed state of the art for about five years, it only took a year or two for the second iteration to fall behind the times, and Motorola was having trouble keeping up with Intel in the MHz wars (funny how history repeats itself), so the Amigas of 1992 and 1993 looked underpowered. Bled dry by clueless marketing and clueless management (it’s arguable which was worse), Commodore lost engineers for years and fell further and further behind before finally running out of cash in 1993.

Though the Amiga is a noncontender today, its influence remains. It was the first commercially successful personal computer to feature color displays of more than 16 colors (it could display up to 4,096 at a time), stereo sound, and pre-emptive multitasking–all features most of us take for granted today. And even though it was widely dismissed as a gaming machine in its heyday, the best-selling titles for the computer that ultimately won the battle are, you guessed it, games.

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like MacOS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner–it was a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line.
That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though the 68060 is the most powerful m68k-series CPU ever made, its processing power pales next to any Celeron’s (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or was willing to pay for the hardware necessary.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft’s done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows this. And Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PC parts–but most people would be surprised to hear you can still get parts for an Amiga at all.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge, while others make it a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can have precisely the effect you describe at times. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend them for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
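
(For the curious: the usual way to sanity-check a path MTU by hand is to ping with the Don’t Fragment flag set–on Windows that’s something like `ping -f -l 1472 somehost`–and find the largest payload that gets through. The path MTU is that payload plus 28 bytes of headers. A trivial sketch of the arithmetic, with the 1472 figure as the textbook Ethernet example:)

```python
# A ping payload rides on top of a 20-byte IP header and an 8-byte ICMP header.
IP_HEADER = 20
ICMP_HEADER = 8

def mtu_from_payload(max_unfragmented_payload):
    """Given the largest ping payload that passes with Don't Fragment set,
    return the implied path MTU."""
    return max_unfragmented_payload + IP_HEADER + ICMP_HEADER

# On a healthy Ethernet link, 1472-byte payloads pass and 1473-byte ones don't:
print(mtu_from_payload(1472))  # → 1500
```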
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel, then adding back TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and it eliminates the possibility of a corrupted TCP/IP stack. Both are remote possibilities, though your reinstall should have eliminated the second one anyway…
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple weeks ago so I don’t rule them out). Also try swapping in a different cable or NIC in your wife’s machine. Cables of course go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll removed the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a longshot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is your Web browser itself. Whichever browser you’re using, have you by any chance tried installing and testing the other one to see if it has the same problems?
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, or when it does perform well, it seems to be by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

Fun with electricity

Fun with electricity. I’m trying to figure out if I’m overreacting or not. What really scares me is that this journalist seems to know a whole lot more about electrical safety than some other people working in an IS/IT department.

The scenario: I had a PC that wouldn’t boot or power off. It sat there in a catatonic state, HD LED solid, power LED solid, fans running, but no other signs of life. The only way to power it off was to pull the plug. Plug it back in, and it reverted instantly to the catatonic state. I popped the hood and didn’t see anything obvious. I did notice a weird smell, which isn’t unusual for an electrical problem, but it was somehow different. Organic… I unplugged the ATX power connector and went and plundered an ATX power supply from an old P166. I came back, plugged the plunder into the board’s power connector, connected the cord, and hit the switch. It fired up and the system POSTed. OK, it’s a short in the power supply. I’ll just e-mail Micron with the details and the serial number, and they’ll overnight me another one. In the meantime, this one’s not doing anything anyway.

So I unbolt the bad one, pull it out, flip it over, and get a nice splash of black liquid. What the? 10W40!? In a computer!? Wait… Suddenly the smell made sense. Old coffee. With cream and sugar, judging from how sticky my hands were getting. So I went to the facilities to wash my hands and get some paper towels to clean up the coffee spill that had now migrated to the IDE cables and elsewhere inside the case.

I cleaned up, assembled the system, and e-mailed my boss and my boss’ boss to ask what, if anything, needed to be said or done. My boss is incredibly busy, but my boss’ boss asked if we could loan them another Pentium II until theirs was fixed. I told him he was missing the point: I already got their computer working. My problem with the situation was we had an electrical device with liquid in it and no one told me before I started trying to fix it. The $35 power supply is meaningless. It’s a lot more expensive to repair or replace techs if they electrocute themselves.

He asked me what part of policy isn’t working if it’s not safe to work on equipment.

Am I the only one who remembers from grade school not to put a hair dryer in the bathtub? It’s the same principle, just with more current and less liquid. And I also remember from science class that pure water isn’t a conductor. It’s the stuff dissolved in the water that conducts. St. Louis has hard water. Add coffee. Add cream and sugar. Now you’ve got enough conductivity to short out the power supply. Having some idea what kind of juice accumulates in the power supply (I shook hands with a power supply a few years ago, which is why I don’t open power supplies anymore), this situation strikes me as dangerous.

I was at least owed the courtesy of being told they spilled coffee in there so I knew not to reach in with both hands and complete the circuit. The embarrassment is better than finding a dead Dave lying in their cube next to a dead Micron, isn’t it?

I guess I didn’t explain it well enough.

Tongue-tied

Anything to say? My sister (yes, she has a name–it’s Di) mailed me and asked me if I had anything to say today. Not really. I finally won a major victory at work that will result in the departure of two Macintoshes that have become the bane of my existence. The battle came at a high personal price–I’m exhausted and have little to say. Other than an observation that AppleShare IP 6.3 appears to be about as rude as its predecessors. It seems to like MacOS 9, but it also seems very willing to crash MacOS 8.6 and earlier clients. Seeing as these are 100, 120, and 132 MHz machines, upgrading to 9 isn’t exactly practical or worthwhile or cost-effective. So they’re getting brand spanking new Micron PCs with Pentium III 600 chips or whatever it is we’re buying these days. I will be very joyfully installing them in the morning.

———- From: al wynn
Does McAfee still sell Nuts&Bolts?

Exactly how do you use Nuts&Bolts to “sort directory entries by the file’s physical placement on the hard drive” (ie. under which menu item can I find it ?)

Also, what are some good web links (or other resources) that will show me how to optimize Norton Utilities configuration ?

———-
It’s in Disk Tune. Click Advanced–>Directory Sort–>Sort Criteria. There you can select Cluster number as your directory sort criteria. Under Win95, this makes N&B’s Disk Tune the best defragmenter/optimizer, but under Win98, NU’s Speed Disk and Fix-It’s Defrag Plus have features that will make them outperform Disk Tune in spite of this feature (they actually do some strategic fragmentation to increase speed). I suppose you could optimize the disk with one of the others, then try to get Disk Tune to skip the defragmentation part and just optimize the directories, but I think I tried to figure out how to do that and gave up. Alternatively you could optimize with Disk Tune first, then defragment with one of the others and not do anything with the directory entries–assuming you want to save absolutely every microsecond possible. (Be aware that Disk Tune is a very slow program, so we’re talking diminishing returns here to run it, then run one of the others.)
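
If it helps to picture what that option actually does: each directory entry records where its file starts on disk (its first cluster), and sorting the entries by that number makes directory order match physical order, so a sequential directory scan turns into (mostly) sequential seeks. A toy sketch, with invented names and cluster numbers:

```python
# Toy model of a directory: (filename, starting cluster) pairs.
# The cluster numbers are made up purely for illustration.
entries = [
    ("readme.txt", 812),
    ("autoexec.bat", 14),
    ("games", 305),
]

# Sorting by starting cluster puts the directory in physical disk order.
by_cluster = sorted(entries, key=lambda entry: entry[1])
print([name for name, cluster in by_cluster])
# → ['autoexec.bat', 'games', 'readme.txt']
```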

I haven’t seen a better resource for the utilities suites than chapters 3 and 5 of Optimizing Windows; those chapters were the result of about seven years’ experience messing around with disk utilities (starting under DOS, of course). I’ve never seen a Web site on the subject (good or bad); nor much other information outside of the manuals that came with some of the older versions. That was part of the reason why I wrote my own. I tried to explain what to do with whatever suite you happened to have, as well as the reasoning behind it.