Troubleshooting intermittent PC problems

How to troubleshoot an intermittent PC problem. We’ve got an aging P2-233 at work that likes to bluescreen a lot under NT4–usually once every day or two. No one who looked at it was able to track it down. The first thing I noticed was that it still had the factory installation of NT, from about three years ago. Factory installations are bad news. The first thing you should do with any PC is install a fresh copy of Windows. If all you have are CAB files and no CD, don’t format the drive–just boot to DOS, go into that directory, run Setup, and install to a new directory other than C:\Windows. With NT, it’s also possible to install from DOS, though the syntax escapes me momentarily.
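From memory, the usual routine looks something like this; treat the paths as examples, since vendors put the CAB files in different places and the NT CD’s i386 directory may live on a different drive letter:

rem Windows 9x reinstall from factory CAB files (path varies by vendor)
cd \windows\options\cabs
setup

rem NT 4 install from DOS: run winnt.exe from the CD's i386 directory
rem (assumes the CD is D: with DOS CD-ROM drivers loaded; loading smartdrv first speeds up the file copy enormously)
d:
cd \i386
winnt /b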

My first suggestion was to run RAM Stress Test, from www.ultra-x.com, over the course of a weekend to eliminate the possibility of bad memory. I followed that by formatting the drive FAT and running SpinRite. After six hours, SpinRite gave the disk a completely clean bill of health.

Knowing the memory and disk were good, I built up the system: NT, then SP5 (128-bit), then IE 5.01 SP1, then Diskeeper Lite, then Office 97, Outlook 98, and WRQ Reflection, and finally Windows Update to pick up SP6a and all the critical updates. I ran Diskeeper after each installation to keep the drive in pristine condition–I find I get better results that way than by installing everything and then running Diskeeper once at the end.

The system seemed pretty stable through all that. Then I went to configure networking and got a bluescreen. Cute. I rebooted and all was well and remained well for an hour or two.

How to see if the bluescreen was a fluke?

I devised the following batch file:

rem stress.bat: loop forever, hammering the disk with full recursive directory listings
:loop
dir /w /s c:
goto loop

Who says command lines are useless and archaic? Definitely not me! I saved the file as stress.bat and ran 10 instances of it. Then I hit Ctrl-Alt-Del to bring up Task Manager. CPU usage was at 100%. Good.
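If launching them one at a time gets old, a one-liner at an NT command prompt will spawn all ten at once. This is just a sketch; use %%i instead of %i if you put the line inside another batch file:

for %i in (1 2 3 4 5 6 7 8 9 10) do start stress.bat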

The system bluescreened after a couple of hours.

How to track down the problem? Well, I knew the CD-ROM drive was bad. Can a bad CD-ROM cause massive system crashes? I’ve never heard of that, but I won’t write off anything. So I disconnected the CD-ROM drive. I’d already removed all unnecessary software from the equation, and I hadn’t installed any extraneous peripherals either. So with the CD-ROM drive eliminated, I ran 10 instances of the batch file again.

The system didn’t make it through the night.

OK. Memory’s good. Hard drive’s good. Bad CD-ROM drive out of equation. Fresh installation of OS with nothing extra. What next?

I called my boss. I figured maybe he’d have an idea, and if not, he and I would contact Micron to see what they had to suggest–three-year warranties and a helpful technical support staff from a manufacturer who understands the needs of a business client are most definitely a good thing.

My boss caught the obvious possibility I missed: heat.

All the fans worked fine, and the CPU had a big heatsink put on at the factory that isn’t going anywhere. Hopefully there was thermal compound in there, but if there wasn’t, I wouldn’t be getting in there to put any in, nor would I be replacing the heatsink with a heatsink/fan combo. So I pulled the P2-333 out of the PC I use–it was the only 66 MHz-bus P2 I had–and put it in the system. I’d forgotten those old P2s weren’t multiplier-locked, so the 333 ended up running at 233. That’s fine. I’ve never had overheating problems with that chip at its rated speed, so at 100 MHz less, I almost certainly wouldn’t run into problems.

With that CPU, the system happily ran 10 instances of my batch file for 30 hours straight without a hiccup. So I had my culprit: That P2-233 was overheating.

Now, ideally a stress test would tax more system memory than this one did and would force some floating-point operations as well. Prime95 does both, which makes it ideal for the job.

If you have time and parts available, you can troubleshoot a recalcitrant PC by running such a real-world stress test, then replacing possible suspect parts (CPU, memory, hard drive, motherboard) one at a time until you isolate the problem.

02/20/2001

Windows Me Too? I’ve read the allegations that Microsoft aped Mac OS X with the upcoming Windows XP. Maybe I’m dense, but I don’t see much resemblance beyond the resemblance between two cars made by different manufacturers. The Start menu has a new neon look, which is probably Apple-inspired to some degree. The Windows taskbar has had Dock-like functionality for several years now–it was added with IE4. The biggest change seems to be the Start menu–they’ve taken the Windows 2000 initiative, where only commonly used stuff is shown, to an extreme, and now the Start menu, at least in some screenshots, looks bigger. I don’t know if it really is or not–I saw another 1024×768 screenshot in which the Start menu actually takes a little less real estate than my current box at the same resolution. And they’ve re-drawn some icons.

As a whole there’s a more textured look now, but some of the Unixish window managers have been doing that stuff since 1997. The login screen bears a definite resemblance to some of the Unixish login screens I’ve seen of late.

Microsoft is claiming this is the most significant user interface change since Windows 95. That’s true, but it’s not the big step that Windows 95 was from Windows 3.x. It’s an evolutionary step, and one that should have been expected, given that the Windows 9x Explorer interface is now older than the Program Manager interface was when it was replaced. Had 24-bit displays been common in 1995, Microsoft probably would have gone with a textured look then–they’ve always liked such superficialities.

Stress tests. New hardware, or suspect hardware, should always be stress-tested to make sure it’s up to snuff. Methods are difficult to find, however, especially under Windows. Running a benchmark repeatedly can be a good way to test a system–overclockers frequently complain that their newly overclocked systems can’t finish benchmark suites–but is it enough? And when the system can’t finish, the problem can be an OS or driver issue as well.

Stress testing with Linux would seem to be a good solution. Linux is pretty demanding anyway; run it hard and it’ll generally expose a system’s weaknesses. So I did some looking around. I found a stress test employed by VA Linux at http://sourceforge.net/projects/va-ctcs/ that looked OK. And I found another approach at http://www.eskimo.com/~pygmy/stress.txt that describes, from experience, stress testing by repeatedly compiling the Linux kernel, which gives the entire system (except for the video card) a really good workout.
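If you want to try the kernel-compile approach, a crude loop does the trick. This is only a sketch, assuming an already-configured kernel tree sitting in /usr/src/linux; it keeps the CPU, memory, and disk busy until a build fails:

cd /usr/src/linux
while true; do
        make clean
        make dep bzImage || break    # stop as soon as a build breaks
done

A build that starts dying with random signal 11 errors is the classic sign of flaky memory or an overheating CPU.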

And the unbelievable… Someone at work mentioned an online President’s Day poll, asking who was the best president. Several obvious candidates are up on Mt. Rushmore: Washington, Lincoln, Jefferson, Teddy Roosevelt. Most people would add FDR and possibly Harry Truman and Woodrow Wilson to that list. I was talking with a good friend the other day about just this issue, and I argued in favor of Lincoln. Washington had a tough job of setting a standard, and he was great, but Lincoln had an even tougher job of holding a bitterly divided country together. So if I had to rank them, I’d probably say Lincoln, Washington, Jefferson, Teddy Roosevelt, and then we have a mess. I don’t agree with their politics, but FDR and Woodrow Wilson probably belong in there. James Madison and James Monroe belong in there; the question is where. Then it starts to get really tough. Was Harry Truman in those guys’ league? Not really, but he’s worlds better than Warren G. Harding and Bill Clinton. Fine, pencil him in at 9. Now who gets #10? Some would give it to Ronald Reagan. It seems to me that Reagan is at once overappreciated and underappreciated. A lot of people put him at the very bottom, which I think is unfair. But then there was this poll that put him at the very top, by a very wide margin. When I looked, Reagan had 44% of the vote, followed by George Washington at 29% and Abraham Lincoln a distant third at 14%.

When I speak of the hard right in the media, that’s what I’m referring to: blind allegiance to an icon, however flawed. Don’t get me wrong, Reagan was no Warren G. Harding–he did win the Cold War after all. Conservatives say his economic policies saved the country, while liberals say they very nearly wrecked it. All I can tell you is my college economics professor taught that Reagan at the very least had the right idea–the big problem with the theory behind Reagan’s policies is the impossibility of knowing whether you’d gone too far or not far enough. Fine. FDR played a similar game. Both are revered by their parties and hated by the other party. But as president, neither Ronald Reagan nor FDR is in the Washington and Lincoln league. As a man, FDR probably was in that league, and if he was not the last, he was very close to it. But with the truly great presidents, there is very little doubt about them–and in the cases of Lincoln and Jefferson, their greatest critics were the voices inside their own heads.

Great people just don’t run for president anymore, and they rarely run for political office, period. It’s easy to see why. Anyone truly qualified to be President of the United States is also qualified to be an executive at a large multinational corporation, and that’s a far more profitable and less frustrating job. And the truly great generally aren’t willing to compromise as much as a politician must in order to get the job.

Early on, we had no shortage whatsoever of great minds in politics: Washington, Jefferson, Madison and Monroe certainly. Plus men who never were president, like Benjamin Franklin and Alexander Hamilton. We had, in effect, from Washington to Monroe, a string of men who met Socrates’ qualifications to be Philosopher-King. (Yes, John Adams was single-term, but he was a cut above most of those who were to follow.)

But as our country developed, so many better things for a great mind to do sprang up. Today you can be an executive at a large company, or you can be a researcher, or a pundit, or the president of a large and prestigious university. In 1789, there weren’t as many things to aspire to.

If we’ve got any Benjamin Franklins and Thomas Jeffersons and George Washingtons and Abraham Lincolns out there today (and I believe we do), they’ve got better things to do than waste time in Washington, D.C.

No, our greatest president wasn’t Ronald Reagan, just as it wasn’t Dwight Eisenhower or John Kennedy. That’s nostalgia talking.

Open source and innovation

Innovation. And of course I can’t let this slip by. Microsoft is trying to say that open source stifles innovation. Steve DeLassus and I have been talking about this (he was the one who originally pointed it out to me), and I think he and I are in agreement that open source by nature isn’t inherently innovative. It may improve on another idea or add features, but most open source projects (and certainly the most successful ones) are clones of proprietary software. Then again, so was a lot of Microsoft software, starting out. Pot, meet Kettle. Kettle, meet Pot.

But although the programs themselves aren’t always innovative, I think the open source atmosphere can stimulate innovation. Huh? Bear with me. Open source gets you in closer contact with computer internals than a Microsoft or Apple OS generally will. That gets you thinking more about what’s possible and what’s not–the idea of what’s possible starts to have more to do with the hardware than it does with what people have tried before. That stimulates creativity, which in turn stimulates innovation.

Need an example? A calculator company called Busicom accidentally invented the personal computer. I’ve heard several versions of the story, but the gist of it was, Busicom wanted to create a programmable calculator. In the process of creating this device, they commissioned the Intel 4004 CPU, the first chip of its kind. There are conflicting accounts as to whether the resulting product even used the Intel 4004, but that’s immaterial–this calculator’s other innovation was its inclusion of a tape drive.

Intel bought back the rights and marketed the 4004 on its own and became a success story, of course. Meanwhile, people started using their Busicom calculators as inexpensive computers–the built-in tape drive worked as well for data storage as it did for program storage. This was in 1970-1971, several years before the Altair and other kit computers.

Four years later, Busicom was out of business but the revolution was under way, all because some people–both engineers at Intel and end-users who bought the calculators–looked beyond the device’s intended use and saw something more.

Open source software frequently forces you to do the same thing, or it at least encourages it. This fuels innovation, and thus should be encouraged, if anything.

Last week’s flood. No, I haven’t answered all the mail about it. I’m going to give it another day before I deal with it, because dealing with a ton of mail is frankly harder than just writing content from scratch. I don’t mind occasionally, but I’d rather wait until a discussion reaches critical mass, you know?

One reader wrote in asking why foreigners care about U.S. gun laws. I don’t really have an answer to that question. I find it very interesting that no American has yet voiced any strong objections to anything I said–I even had a lifelong liberal Democrat write in, and while she stayed to my left, she advocated enforcement of the laws we already have on the books, rather than an outright ban. She’d force more safety classes, but I don’t have any real objections to that notion.

An interesting upgrade approach. The Register reported about a new upgrade board, about to be released by Hypertec, that plugs into any PC with an available ISA slot and upgrades the CPU, video, and sound subsystems. I’m assuming it also replaces the memory subsystem, since pulling system memory through the ISA bus would be pitifully slow.

The solution will be more expensive than a motherboard swap, but for a corporation that has a wide variety of obsolescent PCs, it might be a good solution. First, it’s cheaper than outright replacement. Second, it creates common ground where there was none: two upgraded systems would presumably be able to use the same Ghost/DriveImage/Linux DD image, lowering administrative costs and, consequently, TCO. Third, corporations are frequently more willing to upgrade, rather than replace, existing systems even when it doesn’t make economic sense to do so (that’s corporate management for you).

Depending on the chipset it uses and the expected timeframe, I may be inclined to recommend these for the company I work for. We’ve got anywhere from 30 to 100 systems that aren’t capable of running Office 2000 for whatever reason. Some of them are just old Micron Client Pros, others are Micron Millennias that were configured by idiots (a local clone shop that we used to contract with way back when–I’ve never seen anyone configure NT in a more nonsensical manner), others are clones built by idiots, and others are well-built clones that just happen to be far too old to upgrade economically.

Many of these machines can be upgraded–the Microns are all ATX, so an Intel motherboard and a low-end CPU would be acceptable. Most of the others are ATs and Socket 7-based. An upgrade CPU would likely work, but will be pricey and compatibility is always a dicey issue, and most businesses are still stuck in the Intel-only mindset. (Better not tell them Macintoshes don’t use Intel CPUs–wait… Someone PLEASE tell them Macs don’t use Intel CPUs! Yeah, I’ll be an Intel lackey in exchange for never having to troubleshoot an extension conflict on a Mac again. But that’s another story.) They all need memory upgrades, and buying SIMMs in this day and age is a sucker bet. Average price of the upgrades would be $550, but we’d have a hodgepodge of systems. If we can get common ground and two years of useful life for $700 from Hypertec, upper management would probably approve it.

02/14/2001

More from across the Big Pond. I got this from Chris Miller, one of my editors at Computer Shopper UK, yesterday. Always good to hear from him because he makes me think, even though we rarely agree about anything but magazine design.

Hi Dave

I’ve been looking at the web page and I’m glad you like the ‘Window cleaner’ illustration from the new issue – much better than the blue blobs. Also glad you are holding up Shopper UK as a paragon of design. Thanks.

I shall avoid the subject of John Ashcroft, whom you appear to revere for all the wrong reasons. What I really want to say is that I think you need to prioritise your outrage. A ‘sick, sick society’ is not one where a high school can produce a play about rape, but one where children are shot and killed in schoolyards every day. The purpose of art is sometimes to shock – insecurity and violence are perfectly valid themes to explore. And why tell a story about secure, confident people who know exactly what they are doing? Where’s the drama in that? If that were all that was allowed, there would be no “Romeo and Juliet”, no “Jane Eyre”, no “Jude the Obscure”, no “Psycho” – cultural landmarks all.

Guns, however, are a serious social problem in your country which no-one seems to want to do anything about because of some semi-mythological “constitutional right” – which is, if I may speak frankly, bulls–t. I’m tired of the excuses everybody uses – guns mean massive profits and no-one, except maybe a few Ivy League intellectuals and northern-California hippies, is really serious about banning them. This despite Columbine, the disgruntled postal workers, the dot com rage and countless other pointless and avoidable deaths.

High school plays are not the scourge of American society.

Cheers now
Chris

I think you take me for having oversimplified far more than I have. Inappropriate high school plays are mostly a symptom of the problem–I won’t say they don’t cause problems, but no, we won’t solve all our social ills by toning down our school plays or our television. But it wouldn’t hurt anything either.

Likewise, getting rid of all our guns won’t eliminate all our violence. Guns are outlawed in Britain, but does anyone really believe the IRA doesn’t have guns? But there are other, more creative and more effective ways to kill people and blow things up than to use guns, and you can do it with regular, perfectly legal household items, as the IRA has so effectively demonstrated over the years.

It’s not like massacres happen every day in the United States. Once or twice a year, someone’s caught planning one, like earlier this week, and on the occasional God-forsaken day, an event like Columbine happens.

But banning handguns is a very superficial solution to a bigger problem–no less superficial than banning school plays or a particular television show. Banning guns won’t keep them out of the hands of criminals. Even if it would, desperate or very angry people would commit their crimes with knives or other weapons, just as they did before guns were reliable. The irrefutable fact is that in the handful of states that have gone the opposite extreme and enacted concealed weapons laws, crime has gone down. Social engineers HATE to talk about that because it goes beyond all the hip, chic theories of the day. So a guy walks into McDonald’s and starts shooting. He’s in control. But then some gun-totin’ cowboy (to use the popular image of Americans) whips out his gun and from behind the cover of a table, starts shooting back. The odds are suddenly changed. Can the citizen with the gun prevent anyone from getting hurt? No. But he greatly increases the probability of the one person in the building who deserves to die in such situations (the armed gunman) of sustaining bodily harm of some sort, and greatly decreases the number of potential casualties. And what if there are two or three snipers? The out-of-control situation gets back under control real quick, with minimal harm.

You don’t hear of these situations often because 1) they don’t happen very often and 2) the hard left-leaning press hates these stories.

But remember, this works in the United States but sounds like insanity in Europe because of the differences in our culture. In Europe, private ownership of weapons was a threat to the government, so it generally didn’t happen. In the Americas, weapons were absolutely vital to protect yourself on the frontier–there were hostile animals out there, and yes, hostile people. As the frontier pushed west, weapons were less essential, but they didn’t become unnecessary. Then we gained independence, and the government favored private ownership of guns early on, partly because a citizens’ militia meant there was little need for a standing army, which saved tax dollars, which kept the citizens happy because they hated taxes. That didn’t last, but guns remained a necessity in the west for about a century. To a degree, they still are a necessity in some segments of our society–there are still predators out there that threaten your livestock. Guns are part of our culture, and you won’t transplant overnight the disarmed European culture that formed over a timeframe of centuries to the United States. But the Wild West approach still works here.

But this, too, is a symptom. The greater problem is that we’ve lost our moral compass. OK, so you don’t like my religion. Demonstrate to me that a society that says it’s OK to kill, OK to cheat on your spouse, OK to steal, OK to disrespect your parents, and OK to lie can thrive. Find me one. You won’t.

Whether you like the religion or not, you can’t deny that its set of morals just plain works. But so few teach right and wrong anymore–now you just do what feels good. It feels good to cheat on your wife, so you should do it. You’re liberated. OK. So how is that different from me deciding it feels good to kill my former neighbor who caused me so much grief? Or what about my current neighbor’s nice black BMW? Wouldn’t that be a much nicer ride than my Dodge Neon? Why not steal that? If it feels good, I should do it, right?

Personally, I fail to see the difference.

So what’s the matter here? We’ve got a very self-centered society, interested in very little other than individual pleasure. So go screw around, it’s fun. The eventual result of that is kids. That’s OK, they’re fun too when they’re winning trophies and doing good. Just don’t get in my way. Here’s the remote. Here’s a video game. Have fun. Don’t bother me. And the kids grow up with parents (or a parent) respecting no one but themselves, and they learn that behavior.

So the kids grow up. Their most basic needs of food and clothing and shelter are being met. Usually. But their emotional needs aren’t. Their parents aren’t really there for them. So they don’t mature properly. They don’t exactly learn right and wrong. Their parents don’t model it for them, and they sure aren’t being taught it in school. Growing up is tough. I remember. I was a smart kid, too smart for my own good maybe, and yeah, it made me unpopular. A lot of people didn’t like it. Plus I wasn’t a big guy. I’m 5’9″, 140 pounds now. (Below average height and below average weight, for the benefit of those on the metric system.) At 14, I was 5’4″, not even 100 pounds. I was an easy target. I got in my share of fights, and I usually didn’t win. For one, the bully was almost always bigger than me. For another, I was always outnumbered anyway. Growing up too smart can be as bad as growing up the wrong race. F. Scott Fitzgerald got it right in The Great Gatsby, when his character Daisy said, after her daughter was born, “All right, I’m glad it’s a girl. And I hope she’ll be a fool–that’s the best thing a girl can be in this world, a beautiful little fool.”

Actually, he got it half right. The best thing a guy can be in this world is a beautiful little fool, or better yet, a big hulking fool. People like dumb, beautiful people, because they’re good to look at and they’re non-threatening.

I’ll be brutally blunt: I grew up with a lot of jackasses, and frankly, there were times that I thought the world would have been a much better place if someone brought a gun to school and pumped some lead into their ugly faces. There. I said it.

When I read about the Columbine killers, it resonated with me. I understood those guys completely. One of them was the brains of the outfit. The other was a follower, pure and simple. But I understood how they felt, I understood (and even dug) the music they listened to, and for a time I even dressed like those two did. One of my former classmates even told me after the event, “Those two guys remind me of you.” After all, I used to run around in a black trenchcoat, black t-shirt and black jeans and combat boots, looking gloomy and listening to Joy Division and The Sisters of Mercy.

And don’t get me wrong. My dad had guns. My dad had a lot of guns. He kept the really big stuff locked up, but he had handguns stashed. There was a Derringer he kept in his sock drawer. He had another gun he kept stashed inside the couch in the basement. For all I know he had others. He taught me how to shoot the Derringer. He also taught me how to shoot a .22-calibre rifle. I wasn’t very good, but at close range you don’t have to be.

So why didn’t I turn into one of those guys? My dad taught me to respect human life. Dad was a doctor. Dad even treated a couple of guys on death row. There was a guy who used to hire drifters to steal cattle, then sell them quickly. Then he’d kill them to eliminate the evidence (and cheat them out of their share of the money). I don’t remember how many times he did this. My dad had a brief encounter with him while he was getting an x-ray. They exchanged words, and it wasn’t exactly nice. “Meanest sonofabitch I ever met,” he recalled. I asked him why he treated him, especially seeing as they were going to kill him anyway. Know what he said? He said it wasn’t his job to kill him. It was his job to make sure he had the same quality of life (or as close to it) as anyone else. Killing the man was the state’s job, if it ever got around to it.

So if my dad could respect the life of this man, who by the account of everyone who ever met him wasn’t worth the oxygen he breathed over the course of a day, then shouldn’t I respect the lives of the people at school?

Dad (and Mom too) taught me right and wrong. And they didn’t ignore me, they disciplined me when I stepped out of line. The worst happened when I was 2 or 3. I was being the epitome of brat, and making matters worse, we were guests at a family friend’s house. My mom took me out to the garage, partly to figure out what to do with me. Well, it was March or so, so it wasn’t too cold in there, and it wasn’t too hot, and there was absolutely nothing to do in there either, so she found a lawn chair and told me I had to sit there until I decided to act civilized. Then she went back in the house. Our host asked, “Where’s David?” and my mom told her. After about fifteen minutes, she came back out and asked if I could act civil. I said yes.

That was the most trouble I was ever in. Yes, I got spanked a few times (but it was a very few), and I got yelled at a few times. But with my parents, discipline was consistent, and it was swift. And because it was those things, it was rare–I didn’t step out of line much.

I don’t think the idea that if I were to commit a crime, I might be able to beat the system ever occurred to me until I was 18 or 19. If I didn’t beat the system at home or at school, why should I expect to be able to beat the government?

So no, I never thought of killing my antagonizers. And that’s fine. They got theirs. My biggest antagonizer never finished school. At 17, his parents kicked him out of the house. He drifted around a couple of years, living out of a van and the occasional cheap motel, then finally settled down. At age 21, he was working in a restaurant, doing the same job as a lot of 17-year-olds. He’d be 27 now, and if there’s anything more pathetic than a 14-year-old loser, it’s a 27-year-old loser, and anyone who knew us both would see it now.

Meanwhile, I kept working, doing my best at what I was good at, doing my best to ignore the taunts, and a funny thing happened. At age 17, the taunts stopped. People didn’t mess with the seniors–we were the oldest people in the school besides the teachers. We’d paid our dues. We earned our respect. And the seniors didn’t mess with each other. Being smart became almost… admirable. In college, that was even more so. And get out into the professional world, and it’s even more so. The things that people made fun of you for in school raise eyebrows now. I’m not at the pinnacle of success, but I have everything I want or I can get it.

So, coming back around again… It starts at home. It starts with the family paying attention to its members, and doing its duty. Morals may not be any fun, but an immoral society is even less fun. Certain things like life, dignity, and personal property have to be honored absolutely. Do these things, and you won’t come out all bad. The occasional bad apple will still slip through, but it’ll be an oddity, and a whole lot easier to deal with.

Do these things, one family at a time, and I don’t care what culture you’re in, you won’t go wrong. The whole culture will benefit, with or without guns, with or without questionable forms of entertainment.

02/11/2001

Mailbag:

Innovation

Steve DeLassus asked me for some ideas of where I see innovation, since I said Microsoft isn’t it. That’s a tough question. On the end-user side, it’s definitely not Microsoft. They’ve refined some old ideas, but most of their idea of Innovation is taking utilities that were once separate products from companies Microsoft wants to drive out of business, then grafting them onto the OS in such a way as to make them appear integrated. What purpose does making the Explorer interface look like a Web browser serve? Doesn’t everyone who’s used a real file manager (e.g. Norton Commander or Directory Opus) agree that the consumer would have been better served by replicating something along those lines? Not that that’s particularly innovative either, but at least it’s improving. The only innovation Microsoft does outside of the software development arena (and that makes sense; Microsoft is first and foremost a languages company and always has been) seems to be to try to find ways to drive other companies out of business or to extract more money out of their customers.

Richard Stallman’s GNU movement has very rarely been innovative; it’s been all about cloning software they like and making their versions free all along. It’s probably fair to call Emacs innovative; it was a text editor with a built-in programming environment long before MS Word had that capability. But I don’t see a whole lot of innovation coming out of the Open Source arena–they’re just trying to do the same thing cheaper, and in some cases, shorter and faster, than everyone else.

So, where is there innovation? I was thinking there was more innovation on the hardware side of things, but then I realized that a lot of those “innovations” are just refinements that most people think should have been there in the first place–drives capable of writing to both DVD-R and CD-R media, for instance. Hardware acceleration of sound and network cards is another. Amiga had hardware acceleration of its sound in 1985, so it’s hard to call that innovation. It’s an obvious idea.

A lot of people think Apple and Microsoft are being really innovative with their optical mice, but optical mice were around for years and years before either of those companies “invented” them. The optical mice of 2000 are much better than the optical mice of 1991–no longer requiring a gridded mouse pad and providing smoother movement–but remember, in 1991, the mainstream CPUs were the Intel 80286 and the 80386sx. That’s a far, far cry from the Thunderbird-core AMD Athlon. You would expect a certain degree of improvement.

I’d say the PalmPilot is innovative, but all they really did was take a failed product, Apple’s Newton, and figure out what went wrong and make it better. So I guess you could say Apple innovated there, but that was a long time ago.

So I guess the only big innovation I’ve seen recently from the end-user side of things has been in the software arena after all. I’m still not sold on Ray Ozzie’s Groove, but have to admit it’s much more forward-thinking than most of the things I’ve seen. Sure, it looks like he’s aping Napster, but he started working on Groove in 1997, long before Napster. Napster’s just file sharing, which has been going on since the 1960s at least, but in a new way. There again, I’m not sure that it’s quite right to call it true innovation, but I think it’s more innovative than most of the things I’ve seen come out of Microsoft and Apple, who are mostly content to just copy each other and SGI and Amiga and Xerox. If they’re going to steal, they should at least steal the best ideas SGI and Amiga had. Amiga hid its menu bars to save screen space. Maybe that shouldn’t be the default behavior, but it would be nice to make that an option. SGI went one further, making the pull-down menus accessible anywhere onscreen by right-clicking. This isn’t the same as the context menu–the program’s main menu came up this way. This saved real estate and mouse movements.

I’m sure I could think of some others but I’m out of time this morning. I’d like to hear what some other people think is innovative. And yes, I’m going to try to catch up on e-mail, either this afternoon or this evening. I’ve got a pretty big backlog now.

Mailbag:

Innovation

01/16/2001

AMD and DDR. Good news for hardware enthusiasts wanting AMD-based DDR systems. Via shipped its 266 MHz DDR chipset Monday. This is good news because Via can in all likelihood supply their chipsets in larger quantities than AMD can or will. It’ll take a little while for the KT266 to appear in earnest, but this should soon silence the DIY crowd, who’ve been protesting very loudly that they can’t get boards or chips. Virtually all of Gigabyte’s 760 boards are going to Compaq and Micron, which does make sense. Compaq and Micron will order boards and 266 MHz FSB chips in quantities of hundreds of thousands. The shops catering to the DIY crowd won’t. Given a limited supply, the big fish will get first dibs–it’s easier and less expensive to deal with two big customers than with a hundred tiny ones.

Infoworld. I think my Infoworld subscription has finally lapsed. I’ve been trying to let it lapse for months. I’d get a “This is your last issue if you don’t renew NOW!” warning attached to the cover, which would then be followed by six issues or so, before I’d get another warning. I think I’ve been getting these since last June.

Well, today I went to Infoworld’s site, and I remember why I’ve been trying to let my subscription lapse. They’re bleeding pundits. Q&A maestro Mark Pace quit. Then his partner, Brooks Talley, quit. Bob Metcalfe retired. Sean Dugan quit. Now, Stuart McClure and Joel Scambray are quitting, to be replaced by P.J. Connolly. They tried Connolly as a columnist once before. That experiment lasted about a month, probably because he wrote more about the Grateful Dead than he did about the subject at hand. (Which made me self-conscious about mentioning Aimee Mann and the Kansas City Royals too frequently, but I generally don’t mention them on a weekly basis, so I’m probably OK.)

Their best remaining columnists are Brian Livingston, Nicholas Petreley, and Ed Foster. Livingston has a lot of useful tips, while Foster is genuinely entertaining and provides a useful service to readers. Infoworld’s Robert X. Cringely isn’t quite as entertaining or as insightful as PBS’ Robert X. Cringely, but he’s usually worth a quick read. But there are half as many reasons to read the magazine now as there once were.

Amazon. Amazon’s under fire again from a number of directions, including Ed Foster, and I can’t say I’m in love with all of their practices, but I can’t help but notice something. From my limited vantage point, consumers don’t really seem to care all that much about Amazon’s business practices. I provided links to buy my book elsewhere, but the sales rankings at the other places are pathetic even after doing so. Sales at Borders and B&N are nearly non-existent. Sales at Fatbrain are sporadic at best. But there are a handful of venues where it sells well. The used places sell what copies they can get very quickly. And when Amazon can manage to allow people to order it, it sells very well. If they can’t get a used copy cheap, people would rather buy from Amazon, period. And they’ll even pay a higher price at Amazon than they will elsewhere. A number of people paid full cover price from Amazon off links from this site, even when it was available for less elsewhere. (Amazon seems to be currently selling it for $19.95 or so.)

Some people swear by Apple. I swear at Apple. Apparently Steve Jobs does too. (Not for the easily offended.)

01/13/2001

Have I been brainwashed by Redmond? In the wake of MacWorld, Al Hawkins wrote a piece that suggested maybe so. My post from Thursday doesn’t suggest otherwise.

So let’s talk about what’s wrong with the PC industry. There are problems there as well–problems across the entire computer industry, really. The biggest difference, I think, is that the big guns in the PC industry are better prepared to weather the storm.

IBM’s PC business has been so bad for so long, they’ve considered pulling out of the very market they created. They seem to be turning it around, but it may only be temporary, and their profits are coming at the expense of market share. They retreated out of retail and eliminated product lines. Sound familiar? Temporary turnarounds aren’t unheard of in this industry. IBM as a whole is healthy now, but the day when they were known as Big Black & Blue isn’t so distant as to be forgotten. But IBM’s making their money these days by selling big Unix servers, disk drives, PowerPC CPUs and other semiconductors, software, and most of all, second-to-none service. The PC line can be a loss leader, if need be, to introduce companies to the other things IBM has to offer.

Compaq is a mess. That’s why they got a new CEO last year. But Compaq is a pretty diverse company. They have DEC’s old mini/mainframe biz, they have DEC’s OpenVMS and Digital Unix (now Tru64 Unix) OSs, they have DEC’s Alpha CPU architecture, and DEC’s widely acclaimed service division, which was the main thing that kept DEC afloat and independent in its day. Compaq also has its thriving server business, a successful line of consumer PCs and a couple of lines of business PCs. The combined Compaq/DEC was supposed to challenge IBM as the 800-pound gorilla of the industry, and that hasn’t happened. Compaq’s a big disappointment and they’re having growing pains. They should survive.

HP’s not exactly in the best of shape either. They’ve made a lot of lunkhead decisions that have cost them a lot of customers, most notably by not releasing drivers for their widely popular printers and scanners for newer Microsoft operating systems. Developing those drivers costs money, but withholding them will cost HP customers in the long run, so it was probably a very short-sighted decision. But HP’s inkjet printers are a license to print money, with the cartridges being almost pure profit, and HP and Compaq are the two remaining big dogs in retail. Plus they have profitable mainframe, Unix, and software divisions as well. They’ve got a number of ways to return to profitability.

The holidays weren’t kind to Gateway. They actually had to resort to selling some of their surplus inventory in retail stores, rather than using the stores as a front for their build-to-order business as intended.

Dell’s not happy with last year’s results either, so they’re looking to diversify and give themselves less dependence on desktop PCs. They’re growing up, in other words. They’re killing IBM and Compaq in PCs, and those companies are still surviving. Dell wants a piece of that action.

Intel botched a number of launches this year. They had to do everything wrong and AMD had to do everything right in order for AMD to continue to exist. That happened. AMD’s past problems may have been growing pains, and maybe they’re beyond it now. We shall see. Intel can afford to have a few bad quarters.

As for their chips, we pay a certain price for backward compatibility. But, despite the arguments of the Apple crowd, x86 chips as a rule don’t melt routinely or require refrigerants unless you overclock. All of my x86 chips have simple fans on them, along with smaller heatsinks than a G4 uses. I’ve seen many a Pentium III run on just a heatsink. The necessity of a CPU fan depends mostly on case design. Put a G4 in a cheap case with poor airflow and it’ll cook itself too.

Yes, you could fry an egg on the original Pentium-60 and -66. Later revisions fixed this. Yet I still saw these original Pentiums run on heat sinks smaller than the sinks used on a G4. The Athlon is a real cooker, so that argument holds, but as AMD migrates to ever-smaller trace widths, that should improve. Plus AMD CPUs are cheap as dirt and perform well. The Athlon gives G4-like performance and high clock speeds at a G3 price, so its customers are willing to live with some heat.

And Microsoft… There are few Microsoft zealots left today. They’re rarer and rarer. Microsoft hasn’t given us anything, yet we continue to buy MS Office, just like Mac users. We curse Microsoft and yet send millions and billions their way, just like Mac users. We just happen to buy the OS from them too. And while we curse Microsoft bugs and many of us make a living deploying Windows-based PCs (but the dozen or so Macs I’m responsible for keep me busier than the couple of hundred PCs I’m responsible for), for the most part Windows works. Mac owners talk about daily blue screens of death, but I don’t know when I last got one. I probably get one or two a year. I currently have eight applications running on my Windows 98 box. OS/2 was a far better system than Windows, but alas, it lost the war.

I can’t stand Microsoft’s imperialism and I don’t like them fighting their wars on my hardware. They can pay for their own battlefield. So I run Linux on some of my boxes. But sometimes I appreciate Windows’ backward compatibility.

I always look for the best combination of price, performance, and reliability. That means I change platforms a lot. I flirted with the Mac in 1991, but it was a loveless relationship. The PCs of that era were wannabes. I chose Amiga without having used one, because I knew it couldn’t possibly be as bad as Windows 3.0 or System 7.0. I was right. By 1994, Commodore had self-destructed and the Amiga was perpetually on the auction block, so I jumped ship and bought a Compaq. Windows 3.1 was the sorriest excuse I’d seen for a multitasking environment since System 7.0 and Windows 3.0. I could crash it routinely. So I switched to OS/2 and was happy again. I reluctantly switched to Windows 95 in 1996. I took a job that involved a lot of Macs in 1998, but Mac OS 8.5 failed to impress me. It was prettier than System 7 and if you were lucky you could use it all day without a horrible crash, but with poor memory management and multitasking, switching to it on an everyday basis would have been like setting myself back 12 years, so the second date wasn’t any better than the first.

Linux is very interesting, and I’ve got some full-time Linux PCs. If I weren’t committed to writing so much about Windows 9x (that’s where the money is), Linux would probably be my everyday OS. Microsoft is right to consider Linux a threat, because it’s cheaper and more reliable. Kind of like Windows is cheaper and more reliable than Mac OS. Might history repeat itself? I think it could.

The computer industry as a whole isn’t as healthy this year as it was last year. The companies with the most resources will survive, and some of the companies with fewer will fold or be acquired. The reason the industry press is harder on Apple than on the others is that Apple is less diversified than the others, and thus far more vulnerable.

01/11/2001

Mailbag:

My docs; Apple; Lost cd rom drive

It’s that time of year again. MacWorld time. I work with Macs way too much, so of course I have opinions. If you expect me to withhold them, you don’t know me very well.

Let’s face it: Apple’s in serious trouble. Serious trouble. They can’t move inventory. The Cube is a bust–unexpandable, defect-ridden, and overpriced. The low-end G4 tower costs less than the Cube but offers better expandability. Buying a Cube is like marrying a gorgeous airhead. After the looks fade in a few years, you’re permanently attached to an airhead. So people buy a G4 tower, which has better expandability, or they get an iMac, which costs less.

Unfortunately, that gorgeous airhead metaphor goes a long way with Apple. The Mac’s current product line is more about aesthetics than anything else. So they’ve got glitzy, glamorous cases (not everyone’s cup of tea, but hey, I hear some people lust after Britney Spears too), but they’re saddled with underpowered processors dragged down by an operating system less sophisticated under the hood than the OS Commodore shipped with the first Amiga in 1985. I don’t care if your PowerPC is more efficient than an equivalently-clocked Pentium 4 (so’s a VIA Cyrix III, but no one’s talking about it), because if your OS can’t keep that CPU fed with a steady stream of tasks, it just lost its real-world advantage.

But let’s set technical merit aside. Let’s just look at pure practicalities. You can buy an iMac for $799. Or, if you’re content with a low-end computer, for the same amount of money you can buy a low-end eMachine and pair it up with a 19-inch NEC monitor and still have a hundred bucks left over to put towards your printer. Yeah, so the eMachine doesn’t have the iMac’s glitzy looks. I’ll trade glitz for a 19-inch monitor. Try working with a 19-inch and then switch to a 15-inch like the iMac has. You’ll notice a difference.

So the eMachine will be obsolete in a year? So will the iMac. You can spend $399 for an accelerator board for your iMac. Or you can spend $399 for a replacement eMachine (the 19-inch monitor will still be nice for several years) and get a hard drive and memory upgrade while you’re at it.

On the high end, you’ve got the PowerMac G4 tower. For $3499, you get a 733 MHz CPU, 256 MB RAM, 60 GB HD, a DVD-R/CD-R combo drive, internal 56K modem, gigabit Ethernet you won’t use, and an nVidia GeForce 2 MX card. And no monitor. Software? Just the OS and iMovie, which is a fun toy. You can order one of these glitzy new Macs today, but Apple won’t ship it for a couple of months.

Still, nice specs. For thirty-five hundred bucks they’d better be nice! Gimme thirty-five hundred smackers and I can build you something fantabulous.

But I’m not in the PC biz, so let’s see what Micron might give me for $3500. For $3514, I configured a Micron ClientPro DX5000. It has dual 800 MHz Pentium III CPUs (and an operating system that actually uses both CPUs!), 256 MB of RDRAM, a 7200 RPM 60 GB hard drive, a DVD-ROM and CD-RW (Micron doesn’t offer DVD-R, but you can get it third-party if you must have one), a fabulous Sound Blaster Live! card, a 64 MB nVidia GeForce 2 MX, and in keeping with Apple tradition, no monitor. I skipped the modem because Micron lets me do that. If you must have a modem and stay under budget, you can throttle back to dual 766 MHz CPUs and add a 56K modem for $79. The computer also includes Intel 10/100 Ethernet, Windows 2000, and Office 2000.

And you can have it next week, if not sooner.

I went back to try to configure a 1.2 GHz AMD Athlon-based system, and I couldn’t get it over $2500. So just figure you can get a machine with about the same specs, plus a 19-inch monitor and a bunch more memory.

Cut-throat competition in PC land means you get a whole lot more bang for your buck with a PC. And PC upgrades are cheap. A Mac upgrade typically costs $400. With PCs you can often just replace a CPU for one or two hundred bucks down the road. And switching out a motherboard is no ordeal–they’re pretty much standardized at this point, and PC motherboards are cheap. No matter what you want, you’re looking at $100-$150. Apple makes it really hard to get motherboard upgrades before the machines are obsolete.

It’s no surprise at all to me that the Mac OS is now the third most-common OS on the desktop (fourth if you count Windows 9x and Windows NT/2000 as separate platforms), behind Microsoft’s offerings and Linux. The hardware is more powerful (don’t talk to me about the Pentium 4–we all know it’s a dog, that’s why only one percent of us are buying it), if only by brute force, and it’s cheaper to buy and far cheaper to maintain.

Apple’s just gonna have to abandon the glitz and get their prices down. Or go back to multiple product lines–one glitzy line for people who like that kind of thing, and one back-to-basics line that uses standard ATX cases and costs $100 less off the top just because of it. Apple will never get its motherboard price down to Intel’s range, unless they can get Motorola to license the Alpha processor bus so they can use the same chipsets AMD uses. I seriously doubt they’ll do any of those things.

OS X will finally start to address the technical deficiencies, but an awful lot of Mac veterans aren’t happy with X.

Frankly, it’s going to take a lot to turn Apple around and make it the force it once was. I don’t think Steve Jobs has it in him, and I’m not sure the rest of the company does either, even if they were to get new leadership overnight. (There’s pressure to bring back the legendary Steve Wozniak, the mastermind behind the Apple II who made Apple great in the 1970s and 1980s.)

I don’t think they’ll turn around because I don’t think they care. They’ll probably always exist as a niche player, selling high-priced overdesigned machines to people who like that sort of thing, just as Jaguar exists as a niche player, selling high-priced swanky cars to people who like that sort of thing. And I think the company as a whole realizes that and is content with it. But Jaguar’s not an independent company anymore, nor is it a dominant force in the auto industry. I think the same fate is waiting for Apple.

Mailbag:

My docs; Apple; Lost cd rom drive

Plextor bargains, and Year 2000 in review

01/01/2001

Mailbag:

Partition; IDE/SCSI; Lost CD ROM; Optimizing ME; Win 98/ME

A bargain Plextor CD-RW. I just spotted this great tip in a link to a link to a link in the StorageReview forums. The Iomega ZipCD 12x10x32 appears to be a relabeled Plextor drive, and it sometimes sells for around $100. So if you’re looking for the best CD-R on the market at a great price, go get it.

Details are at www.roundsparrow.com/comp/iomega1 if you want to have a look-see.

The $99 price seems to be a CompUSA special sale. Check local availability at www.compusa.com/products/product_info.asp?product_code=280095 if you’re interested.

Incidentally, the IDE 12x10x32 drives from TDK and Creative are also reported to be re-branded Plextors. Regular retail price on these four “twin” drives is similar, around $300. The TDK and Creative drives come with Nero Burning ROM, however, making them more desirable than the Plextor model. Iomega bundles Adaptec’s CD suite.

Happy New Year. An ancient Chinese curse says, “May you live in interesting times.” Well, 2000 certainly was interesting. So, my toast to you this year is this: May 2001 be less interesting than 2000. Boring isn’t always bad. Just usually.

Linux 2.4 almost made it. Yesterday, Linus Torvalds released linux2.4-prerelease and vowed there won’t be a prerelease1, prerelease2, etc.–this is it. Bugs get fixed in this one, then the final 2.4 comes out (to be immediately followed by linux2.4ac1, no doubt–Alan Cox always releases a patched kernel swatting a couple of bugs within hours of Linus releasing the new kernel. It happened with 2.0 and with 2.2, and history repeats itself).

Anyway, the 2.2 prerelease turned into a series in spite of Linus’ vows, so Linus isn’t always right, but I expect 2.4 will be out this month, if not this week.

Linux 2.4 will increase performance, especially on high-memory and SMP machines, but I ran a 2.3 series kernel (basically the Linux equivalent of an alpha release of 2.4) on my P120 for a long time and found it to be faster than 2.2, even on a machine that humble. I also found it to be more stable than Microsoft’s final releases, but hey.

I ought to download the 2.4 prerelease and put it on my dual Celeron box to see how far it’s come, but I doubt I’ll get around to it today.
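For the record, it wouldn’t be much work. Roughly, and this is only a sketch that assumes the source is unpacked over /usr/src/linux with the old kernel’s .config carried forward:

cd /usr/src/linux
make oldconfig                # reuse the old config, answer only the new 2.4 questions
make dep
make -j2 bzImage modules      # -j2 keeps both Celerons busy
make modules_install          # then install the kernel and rerun LILO as usual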

Other lowlights of 2000. Windows 2000 flopped. It’s not a total disaster, but sales aren’t meeting Microsoft’s expectations. PC sales flopped, and that was a disaster. The Pentium 4 was released to awful reviews. Nvidia bought the mortal remains of 3dfx for a song. Similarly, Aureal departed from this mortal coil, purchased by longtime archrival Creative Labs after bankruptcy. (In a former incarnation, before bankruptcy and being run into the ground, Aureal was known as MediaVision. PC veterans probably remember them.) A federal judge ordered the breakup of Microsoft, but the appeals process promises to at least delay it, if not prevent it. We’ll hear a lot about that in 2001, but 2001 probably won’t bring any closure.

Hmm, other highlights. Apple failed to release OS X this year, and saw its new product line flop. Dotcom after dotcom shuttered its doors, much to Wall Street’s dismay. Linux companies didn’t topple MS, much to Wall Street’s dismay. And speaking of Wall Street, Larry Ellison (Oracle) and Bill Gates (Microsoft) flip-flopped in the rankings of richest man in the world several times.

And two of my favorite pundits, Bob Metcalfe and G. Burgess Alison, called it quits last year. They are sorely missed.

And once again, 2000 wasn’t the year of the NC.

I know I missed a few. But those were the highlights, as I see them.

Mailbag:

Partition; IDE/SCSI; Lost CD ROM; Optimizing ME; Win 98/ME