Why I generally buy AMD

I was talking to a new coworker today, and of course the topic of our first PCs came up. His was Cyrix-based. I didn’t mention my first PC (it seems I’m about four years older; mine was an Am486SX2/66).

With only a couple of exceptions, I’ve always bought non-Intel PCs. Most of the Intel PCs I have bought have been used. One boss once went so far as to call me anti-corporate.

I’m not so much anti-corporate as I am pro-competition.

Read more

Run the right version of Windows for your PC

I said I was done writing about system optimization. I changed my mind. I have one more thing, and it seems appropriate, now that Vista upgrades are available.

Be very wary about upgrading your version of Windows. There are a few Vista-only titles out there, and there will be more, but the majority of titles aren’t. Walk into a software aisle and you’ll still find plenty of software that will run on Windows 95 (or possibly 98), assuming the computer meets the hardware requirements.

I’m typing this on an 800 MHz HP Pavilion 6835. Sure, it’s outmoded–for around $125, I could swap in an Athlon 64 motherboard that would give me 4-5x the CPU power and that would be considered a low-end PC by today’s standards–but this one’s peppy. I run Windows ME on it. Windows 2000 would be more stable but I’m lazy. I wouldn’t try XP on it. When XP came out, this system was already old.

Technically, XP will install on a 133 MHz Pentium if it has enough RAM. I’ve seen it done, and I’ve seen it try to run on one. It’s not pretty. I really wouldn’t try running XP on anything less than a 1 GHz PC with 256 megs of RAM, because that was the standard PC at the time of XP’s release. But believe it or not, if you install Windows 95 and Office 95 on that Pentium-133, it’s a reasonably nice machine–because that was a high-end box in 1995 when Windows 95 and Office 95 came out.

So when you’re refurbishing an old machine, try to install whatever the current version of Windows was when it was new. The PC will run a lot better. Here’s a guide.

Windows 95: Released August 1995
Typical PC of the time: 486, 66 MHz
Hot PC of the time: Pentium, 133 MHz

Windows NT 4.0: Released July 1996
Typical PC of the time: Pentium, 75 MHz
Hot PC of the time: Pentium Pro, 200 MHz

Windows 98: Released June 1998
Typical PC of the time: Pentium, 233 MHz
Hot PC of the time: Pentium II, 333 MHz

Windows 2000: Released February 2000
Typical PC of the time: Pentium III or Athlon, 600 MHz
Hot PC of the time: Pentium III or Athlon, 1 GHz

Windows XP: Released October 2001
Typical PC of the time: Pentium 4, 1.5 GHz
Hot PC of the time: Pentium 4 or Athlon, 2+ GHz

Windows Vista: Released January 2007
From what I understand, even a hot PC of 2007 has difficulty running it. I haven’t seen Vista yet; my employer is still running XP for everything.
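The guide above boils down to a simple lookup: match the PC to whatever version of Windows was current when the hardware was new. Here's a minimal sketch of that rule in Python; the release years and CPU notes come from the list above, while the function name and structure are just for illustration.

```python
# Release years and typical hardware, straight from the guide above.
RELEASES = [
    (1995, "Windows 95"),      # typical PC: 66 MHz 486
    (1996, "Windows NT 4.0"),  # typical PC: 75 MHz Pentium
    (1998, "Windows 98"),      # typical PC: 233 MHz Pentium
    (2000, "Windows 2000"),    # typical PC: 600 MHz P3/Athlon
    (2001, "Windows XP"),      # typical PC: 1.5 GHz Pentium 4
    (2007, "Windows Vista"),
]

def suggested_windows(pc_year):
    """Return the newest Windows already released when the PC was built."""
    best = RELEASES[0][1]  # anything older than 1995 still gets Windows 95
    for year, name in RELEASES:
        if year <= pc_year:
            best = name
    return best

print(suggested_windows(1999))  # a 1999-era PC gets Windows 98
```

So a Pentium-233 bought in 1998 gets Windows 98, and that 800 MHz Pavilion from 2000 gets Windows ME or 2000 rather than XP.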

Of course, if you install as much memory as the system will take, you can push your limits, since Windows is often more memory-bound than CPU-bound. I also try to replace the hard drive with the fastest model I can budget for. Don’t worry if the drive has a faster DMA rate than the controller on the board; you’ll still benefit from the faster seek times and better throughput of a newer drive. If the new drive saturates the bus, it could be worse–I guarantee the old one didn’t.

The best defragmenter for Windows NT, 2000, XP and Vista

Want Diskeeper’s features without ponying up 50 bucks?

Sorry, I can’t help you with that. But the combination of My Defrag, Scandefrag, and Pagedefrag is better, and it’s free.

Scandefrag kicks off defragmentation during the boot process, as early as it can; all it really does is launch the other tools. It works better on NT-based systems like Windows 2000 and XP than it does on 98 or ME.

Pagedefrag is, of course, a classic. It’s just convenient to bundle it up with these other tools. This tool defragments your registry and swap file(s) at boot time, which is the only time the system allows it.

My Defrag (actually Jeroen Kessels’ defrag) is, to put it simply, the best general purpose defragmenter for Windows NT, 2000 and XP that I’ve ever seen. Period.

If My Defrag can’t do an ideal job, it does the best it can do. Some defragmenters leave a file alone if they can’t defragment it, but this one will defragment as much as possible and move it as close to the front of the disk as possible, where performance is much better. On full disks, this is important. Since ideal conditions almost never exist (except when a system is first built), a defragmenter’s performance under less than ideal conditions is very important.

The most exciting thing about My Defrag is its ability to sort files. I like Sort alphabetically.

Sorting alphabetically (the -a7 switch) helps because it uses the full pathname. This means all of your files that are part of, say, Mozilla Firefox will be put as close together on the disk as possible, so when you launch Firefox, all of those files are close together and the disk head doesn’t have to move around a lot. The result is an application that launches faster.
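The effect of sorting by full pathname is easy to see with a toy example. Paths that share a directory prefix sort next to each other, so an application's files end up in one contiguous run. The file names below are made up for illustration:

```python
# Why sorting by full pathname clusters an application's files:
# paths sharing a directory prefix end up adjacent in the sorted order,
# which is the order a sorting defragmenter lays them out on disk.
files = [
    r"C:\Program Files\Mozilla Firefox\firefox.exe",
    r"C:\Windows\notepad.exe",
    r"C:\Program Files\Mozilla Firefox\xul.dll",
    r"C:\Documents\letter.doc",
    r"C:\Program Files\Mozilla Firefox\chrome\toolkit.jar",
]

# Case-insensitive sort, roughly how a defragmenter would compare paths.
layout = sorted(files, key=str.lower)

for path in layout:
    print(path)
```

The three Firefox files come out consecutively, so launching Firefox reads one region of the disk instead of three scattered ones.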

So how often should you defragment? Once a year, I would do a boot-time defragmentation with Scandefrag to whip the Registry and swap files into shape. When that finishes, I would run My Defrag in full optimization mode, with file sorting. If you make a major change to your system (say, upgrading your office suite), do a quick defragmentation after the install and a full defragmentation a month or so after.

As part of your routine system maintenance, a faster, automatic defrag with no options specified is a good idea on occasion. The author says to do it no more than once a day and I agree. In my experience, once a week or even once a month is almost always fine. The way My Defrag works, the system shouldn’t get terribly fragmented on a daily basis, even if you use your system heavily. Defragmenting too frequently can shorten a hard disk’s life expectancy, although the occasional defragmentation seems to help it. I defragment a few times a year (and always have), and I generally get five or six years out of a hard disk, which is a year or two longer than most experts say to expect.

Don’t waste your money on any other tools. Download this trio, install it, use it, and watch your system performance climb.

A better registry cleaner

Note: I wrote this back in the Windows XP days. It worked really well under XP, but if you’re going to run the registry cleaner portion in Windows 7 or Windows 10, be sure to create a restore point first.

I’ve been messing around with a registry cleaner called CCleaner. I like it a lot better than the commercial tools that used to come with Norton Utilities and the like, and I like it better than the freebies that we used to use like Microsoft’s Regclean.

And you’ll never beat the price. CCleaner runs on Windows 95, 98, 98SE, ME, NT4, 2000, XP, and Vista.

One thing I liked about it is that the program is intelligent and relatively dummy-proof. If you click around and accept all of the defaults, it’s not likely to harm your computer. I inadvertently wiped out my Firefox browser history (I wanted to keep that), but that’s not a showstopper; it will populate itself again in a few weeks. Unlike the commercial utility suites, where I’ve written 20-page explanations of how to use them safely, this program doesn’t really need any explanation.

CCleaner actually does more than just clean up the Registry, although it does a fine job of that. It also does a great job of weeding out useless temporary files. I ran it on my old laptop and it found 386 megabytes of junk on my crowded C drive. I’ve been manually cleaning it up by searching it by hand, and I think I do a pretty good job of finding a lot of stuff, but what can I say? The program found 386 megs of stuff that I didn’t.

There are three benefits to getting rid of that cruft. First, Windows needs quite a bit of free space just to function properly. When free space gets too low, the system just acts goofy. Second, large numbers of temp files in the system directory also seem to make the system act funny. This was a bigger problem in Windows 9x than in the newer NT-based Windows versions, but there’s still no reason to have hundreds of them lying around. In my desktop support days, just getting rid of temp files used to clear up all sorts of mysterious problems. And finally, not having all those large and useless files on the disk makes your defragmentation programs work better. Those programs need free space to work with, and they don’t have to work as hard when they don’t have hundreds of extra worthless files to move around.

Cleaning the Registry is another important job, since a lot of uninstallers don’t do a very thorough job of cleaning up after themselves. The extra bloat chews up memory and slows down lookups of the legitimate data the programs you actually use need. Since I tend not to install many programs, and I use most of the ones I do install, CCleaner didn’t find a whole lot in my Registry, but it found some stuff to clean up.

So what happened after I ran it? The most noticeable effects were that my Start menu was a lot peppier, and my Web browsers loaded and ran a little bit faster. I understand the Web browser speedup, but the Start menu puzzled me a bit. Not that I’m complaining–it’s irritating when you press Start and have to wait for your list of programs to come up.

CCleaner isn’t a miracle worker and it won’t turn my P3-700 into a Core Duo, but the two systems I’ve run it on do run noticeably faster afterward. It was certainly more than worth the 10 minutes it took for me to download it and run it on each.

So what about the commercial utility suites? Skip them. In this day and age, there are better, free alternatives for everything those suites could do. CCleaner is one of the superstars. In coming days, I’ll talk about free substitutes for the other most important components of the utility suites.

It’s that time of month again, time to Slashdot the Wikipedia

Slashdot published an interview today with Wikipedia founder Jimmy Wales. I found it entertaining reading. Even though I’m a semi-regular contributor over at Wikipedia, I’ve never encountered its founder, possibly because I do my best these days to stay under the radar over there.

The discussion on Slashdot was interesting. As always, someone questioned Wikipedia’s accuracy, wondering how anything but chaos can come from something that anyone can edit at any time. A few people read two articles and came back with the usual “99.9% of Wikipedia articles cite no sources and have inaccuracies in them.” Someone else came back and said he’d made a change to the M1A1 Abrams article and was corrected by an Army mechanic. I always like comments like that. It shows who actually has experience and who’s talking out his butt.

Wales was incredibly idealistic, with a vision of free textbooks educating the world and ridding the world of places where people have no sanitation. Free access to the sum of all human knowledge will solve all the world’s problems.

I wish I could be so idealistic.

Oh well, shoot for the stars and maybe you have a chance of hitting the moon, right?

I found the discussion on credibility more interesting. Someone asked how an encyclopedia produced by anarchy could have more credibility than the mighty Encyclopedia Britannica or even World Book. Linux Kernel hacker Alan Cox weighed in, pointing out that there’s plenty of bias in academia too, that academia is a tyranny of the day’s popular ideas and that generally ideas change by one generation dying out and a new generation with different ideas taking over. At least with Wikipedia, the divergent ideas get a chance to be heard. He had a point.

I disagree with Wales that his project will drive Britannica out of business, but I agree with Cox about credibility. I had an argument with a college professor over using the Internet as a primary source of information. This was in 1995 or 1996. I wrote a short paper on the Irish Republican Army, and I wanted to find out what people sympathetic to the IRA were saying. So I went to Alta Vista, did some searching, and cited what I found. I wanted to know what the people who made the bombs were thinking, and figured the people who made the bombs were more likely to have Web pages than they were to write books that would be in the University of Missouri library. But my professor wanted me to look for books. I decided he was a pompous, arrogant ass and maybe I didn’t want to minor in political science after all, especially if that meant I’d have to deal with him again.

I forgot what my point was. Oh yes. In journalism we have a sort of unwritten rule. You can cite as many sources as you want. In fact, the more sources the better. If a story doesn’t have three sources, it really ought not to be printed. That rule gets selectively enforced at times, but it’s there. Your sources can spout off all they want. That’s opinion. When three sources’ stories match independently, then it’s fact.

So what if Wikipedia is never the Britannica or even the World Book? It’s a source. It’s much more in touch with popular culture than either of those institutions ever will be. Most people will think you’re a bit odd if you sit down with a volume or two of the Britannica or World Book and read it like you would a novel. I know people who claim to have done it, but that doesn’t make the behavior any less unusual. Hitting random pages of Wikipedia can be entertaining reading, however. As long as you don’t get stuck in a rut of geography articles. But that’s become less and less likely.

So I don’t think it matters if the Wikipedia ever attains the status of the paper encyclopedias. You’ve got what the academics are saying. Wikipedia gives you the word on the street or in the coffee shop. Neither is necessarily a substitute for the other.

I’ve appealed to this before, but I’ll do it again. Visit Wikipedia. See what it has to say about your areas of interest. If it doesn’t say enough, take a few minutes to add to it. Resist the temptation to go to the articles on controversial people like Josef Stalin or Adolf Hitler. It’s a good way to get into an edit war and get frustrated. Find something obscure. I mostly write about old computers, old baseball players and old trains. Not too many Wikipedians are interested in those things. Especially the trains, so that’s what I write about most. (Other people seem to be; when I troll the ‘net for more information on those old companies, I frequently find copies of what I’ve already written and put in Wikipedia. It’s flattering.)

I look at it as a way of giving back. It’s relaxing to me, too. There’s a community that’s written a ton of software, including an operating system, a web server, and a blogging system, and they’ve given it to me without ever asking for a dime in return. I can’t program, so I can’t give anything back that way. But I have interests, I have knowledge in my head that doesn’t seem to be out there on the ’net, and I have the ability to communicate it. So I give back that way.

It won’t change the world. Maybe all it’ll accomplish is me seeing fewer “Mar” trains on eBay and more Marx trains. But isn’t that something?