Run the right version of Windows for your PC

I said I was done writing about system optimization. I changed my mind. I have one more thing, and it seems appropriate, now that Vista upgrades are available.

Be very wary about upgrading your version of Windows. There are a few Vista-only titles out there, and there will be more, but the majority of titles aren’t Vista-only. Walk into a software aisle and you’ll still find a lot of software that will run on Windows 95 (or possibly 98), assuming the computer meets the hardware requirements.

I’m typing this on an 800 MHz HP Pavilion 6835. Sure, it’s outmoded–for around $125, I could swap in an Athlon 64 motherboard that would give me 4-5x the CPU power and still be considered low-end by today’s standards–but this one’s peppy. I run Windows ME on it. Windows 2000 would be more stable, but I’m lazy. I wouldn’t try XP on it; when XP came out, this system was already old.

Technically, XP will install on a 133 MHz Pentium if it has enough RAM. I’ve seen it done, and I’ve seen it try to run on one. It’s not pretty. I really wouldn’t try running XP on anything less than a 1 GHz PC with 256 megs of RAM, because that was the standard PC at the time of XP’s release. But believe it or not, if you install Windows 95 and Office 95 on that Pentium-133, it’s a reasonably nice machine–because that was a high-end box in 1995 when Windows 95 and Office 95 came out.

So when you’re refurbishing an old machine, try to install whatever the current version of Windows was when it was new. The PC will run a lot better. Here’s a guide.

Windows 95: Released August 1995
Typical PC of the time: 486, 66 MHz
Hot PC of the time: Pentium, 133 MHz

Windows NT 4.0: Released July 1996
Typical PC of the time: Pentium, 75 MHz
Hot PC of the time: Pentium Pro, 200 MHz

Windows 98: Released June 1998
Typical PC of the time: Pentium, 233 MHz
Hot PC of the time: Pentium II, 333 MHz

Windows 2000: Released February 2000
Typical PC of the time: Pentium III or Athlon, 600 MHz
Hot PC of the time: Pentium III or Athlon, 1 GHz

Windows XP: Released October 2001
Typical PC of the time: Pentium 4, 1.5 GHz
Hot PC of the time: Pentium 4 or Athlon, 2+ GHz

Windows Vista: Released January 2007
From what I understand, even a hot PC of 2007 has difficulty running it. I haven’t seen Vista yet; my employer is still running XP for everything.

Of course, if you install as much memory as the system will take, you can push your limits, since Windows is often more memory-bound than CPU-bound. I also try to replace the hard drive with the fastest model I can budget for. Don’t worry if the drive has a faster DMA rate than the controller on the board; you’ll still benefit from the newer drive’s faster seek times and better throughput. And if the new drive saturates the bus, there are worse problems to have–I guarantee the old one didn’t saturate it.

The best defragmenter for Windows NT, 2000, XP and Vista

Want Diskeeper’s features without ponying up 50 bucks?

Sorry, I can’t help you. The combination of My Defrag, Scandefrag, and Pagedefrag is better and it’s free.

Scandefrag gets your system defragmented during the boot process, as early as possible. It works better on NT-based systems like Windows 2000 and XP than it does on 98 or ME. It doesn’t do any defragmenting itself; all it does is launch the other tools at boot time.

Pagedefrag is, of course, a classic. It’s just convenient to bundle it up with these other tools. This tool defragments your registry and swap file(s) at boot time, which is the only time the system allows it.
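Pagedefrag can also be driven from the command line, which is handy if you want to queue a pass from a batch file. A minimal sketch from memory–double-check the switches against the Sysinternals documentation for your copy:

    :: queue a one-time boot defragmentation of the registry and paging
    :: file(s); -e would make it happen at every boot instead
    pagedfrg -o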

My Defrag (actually Jeroen Kessels’ defrag) is, to put it simply, the best general-purpose defragmenter for Windows NT, 2000 and XP that I’ve ever seen. Period.

If My Defrag can’t do an ideal job, it does the best it can. Some defragmenters leave a file alone if they can’t defragment it completely, but this one will defragment as much as possible and move the file as close to the front of the disk as possible, where performance is much better. On full disks, this is important. Since ideal conditions almost never exist (except when a system is first built), a defragmenter’s performance under less-than-ideal conditions matters a lot.

The most exciting thing about My Defrag is its ability to sort files. I like Sort alphabetically.

Sorting alphabetically (the -a7 switch) helps because it uses the full pathname. This means all of your files that are part of, say, Mozilla Firefox will be put as close together on the disk as possible, so when you launch Firefox, all of those files are close together and the disk head doesn’t have to move around a lot. The result is an application that launches faster.
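Here’s what that looks like in practice. This is a sketch assuming the command-line build (JkDefrag.exe–My Defrag grew out of Jeroen Kessels’ JkDefrag) sitting in the current directory; check the readme for the exact switch syntax of your version:

    :: full optimization of drive C:, sorting files by full pathname
    JkDefrag.exe -a 7 C: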

So how often should you defragment? Once a year, I would do a boot-time defragmentation with Scandefrag to whip the Registry and swap files into shape. When that finishes, I would run My Defrag in full optimization mode, with file sorting. If you make a major change to your system (say, upgrading your office suite), do a quick defragmentation after the install and a full defragmentation a month or so after.

As part of your routine system maintenance, a faster, automatic defrag with no options specified is a good idea on occasion. The author says to do it no more than once a day and I agree. In my experience, once a week or even once a month is almost always fine. The way My Defrag works, the system shouldn’t get terribly fragmented on a daily basis, even if you use your system heavily. Defragmenting too frequently can shorten a hard disk’s life expectancy, although the occasional defragmentation seems to help it. I defragment a few times a year (and always have), and I generally get five or six years out of a hard disk, which is a year or two longer than most experts say to expect.
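Because the command-line version runs unattended, the built-in AT scheduler on NT, 2000, and XP can handle the routine runs for you. A sketch, with C:\Tools as an assumed install location:

    :: run a default (quick) defragmentation every Sunday at 3 a.m.
    at 03:00 /every:Su "C:\Tools\JkDefrag.exe C:"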

Don’t waste your money on any other tools. Download this trio, install it, use it, and watch your system performance climb.

I rebuilt a Dell Dimension 4100 last night

So, I rebuilt a Dell Dimension 4100 last night. I didn’t make any hardware changes other than replacing the Western Digital hard drive inside, which was on its last legs.

Along the way, I learned a few things. I won’t say much about the WD drive except to say it’s the most recent in a long line of bad experiences I’ve had with the brand. I don’t know anything about current WD drives. But this one was loud and shrill, Windows bluescreened when I tried to install to it, and when I tried to run SpinRite on it, it said it would take 140 hours to test. A drive that size (20GB) should take 8-10.

In its defense, that drive was five years old. But I replaced it with a Maxtor drive that’s almost eight years old. SpinRite processed that Maxtor in 3 hours and found nothing worth commenting about. (Just because SpinRite didn’t say anything doesn’t necessarily mean it didn’t do anything.)

The Dell Dimension 4100 does have a proprietary power supply (although it looks like an ATX unit). If you work on Dells, I suggest bookmarking PC Power and Cooling’s Dell cheatsheet. PCP&C power supplies are expensive, but they’re reliable, their prices are comparable to what Dell charges for a replacement, and the quality is higher than what you’d get from Dell–assuming Dell will even sell you the part (they’re in the business of selling computers, not parts). I believe newer Dells use standard power supplies.

If you buy a Micron, you can punch in a serial number and get drivers for the machine. With a Dell, you just get guesses based on the options that were available for the machine.

Download the chipset drivers and other low-level stuff from Dell’s support site. Windows 2000 didn’t completely recognize the system’s Intel i815 chipset, and I got better performance after installing them.

nLite offers a lot of promise–automating the Windows install, removing components, and so on–but I had trouble getting it to work with the OS recovery CDs I had, and I didn’t have enough time (or blank CDs) to figure out how to get it working for me. I’m sure it works better with a plain old Windows 2000 Professional CD, but of course I can’t find mine. If you have a CD that works with it, it’s nice even if you don’t remove the stuff Microsoft doesn’t let you remove, since it provides a nice interface for slipstreaming service packs and hotfixes and removing all of the prompts during installation.

The tricks in Windows 2000 with 32MB of RAM work pretty nicely, even when you have more than 32 megs. Of course, if you’re ruthless with nLite and can get it to work for you, you probably don’t need that bag of tricks.

I didn’t try to install it without Internet Explorer. I’d love to try that sometime but I didn’t have time for that. At least disabling Active Desktop (see the link in the paragraph above) gives most of the benefit you would get from smiting IE.

The quality of the Dell hardware is reasonable. It didn’t floor me, but I didn’t see anything that made me cry either.

Windows 2000 without Internet Explorer

Those of you who bought my book (both of you) know that the biggest secret to speeding up Windows 95 and 98 was removing the extra junk nobody used–especially the stuff Microsoft deliberately made impossible to remove via normal means.

In Windows 95, all it took was modifying INF files with a text editor to remove MSN, IE, and the other software it shipped with that was obsolete from the get-go. Win98 got a bit more complicated. But with 2000, Microsoft started getting nasty–putting encrypted data in multiple places, so even if you hacked the INFs, it didn’t do any good.

But several people still figured out how to do it. I really like Fred Vorck’s site, because he’s careful to document everything. He also found out the same thing I did by writing my book–there are lots of people who will whine that your instructions are too complicated, whine that it doesn’t work when they follow the directions and make a mistake, or just repeat the Microsoft party line that the software can’t be removed and that your mouse will stop working, your computer will generate horrific RF interference, and gas prices will soar if you remove IE from Windows. (The last part is probably true, of course, but none of the rest is.)

What really happens when you remove IE from Windows 2000 is similar to what happened when you removed it from Win95 and 98: Your memory usage drops (by about 20 megs, in this case) and your boot time is cut in half.

Since some software does break, because some software does use the IE engine, you might not want to do this on every PC you own. But if, say, you want to run Windows 2000 on an old laptop with limited memory so you can run a handful of useful Windows applications, this is perfect. If you want a stable, lightweight (by modern standards) OS for any Pentium II-class machine that might be sitting in the closet, this makes that a viable option too. A lot of computers are sitting in closets today not because they’re no longer useful, but because there’s no practical or affordable way to upgrade them to the half-gig of memory Windows XP needs to be practical.

Read Vorck’s site some more, and dig around, and you’ll find that minimal Windows installs have created something of a subculture. I don’t know if anyone’s squeezed XP down to the level I got Win95 down to (the original Windows 95, released on Aug. 24, 1995, can be hacked down to an installation footprint of 17 megabytes without much hassle), but some people have done some pretty amazing things.

Yes, when I get time someday, I’ll be messing around with this. I wish I’d discovered it sooner.

And in case anyone cares, I found this because some know-it-all at work said you can’t uninstall Outlook Express from Windows 2000. I vaguely remembered having seen a piece of software that goes so far as to remove IE, so I said, “You can remove IE from Windows 2000 if you’re willing to work hard enough at it, let alone Outlook Express.” So I did some more searching, just to satisfy my curiosity.

If and when I end up building a minimal Win2000 box, I may just have to bring it in one day to show the know-it-all. But as longtime readers of this site know, I’ve dealt with that type before. So it’s probably not worth the effort to carry it out to the car.

Windows 2000 in 32 megs of RAM

I can’t remember if I linked this before or not, so here’s Windows 2000 on 32 MB of RAM.

Of course I find this interesting. And his advice is pretty good. My first choice for an OS in 32 megs of RAM would be Windows 95, and probably Windows 95a at that (and gee, some idiot wrote a book about that), but if you need better reliability and stability, Windows 2000 is a good second choice.

One piece of advice he didn’t mention: if there’s a modem in the system, lose it, especially if it’s a Winmodem. That’ll save lots of precious RAM and CPU cycles.

Operating System Not Found, Missing Operating System, and friends

So the PC that stored my resume got kicked (as in the foot of a passer-by hitting it) and died, and the backup that I thought I had… Well, it wasn’t where I thought it was.

Time for some amateur home data recovery. Here’s how I brought it back.

This machine ran Windows 2000. The first trick to try on any machine running any flavor of Windows is to boot from a DOS boot disk containing FDISK.EXE and issue the command FDISK /MBR. This replaces the master boot record. A corrupt MBR is the most common malady behind these dreaded error messages, and this is the easiest fix for it.
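For reference, the whole fix is one command from the boot floppy. It rewrites the MBR’s boot code without touching the partition table:

    A:\> fdisk /mbr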

That didn’t work for me.

The second trick is to use MBRWork. Have it back up the first sector, then have it delete the boot record. Then it gives you an option to recover partitions. Run that, then run the option that installs the standard MBR code. I can’t tell you how many times this tool has made me look like I can walk on water.

No dice this time either.

Next I tried grabbing the Windows 2000 CD and doing a recovery install. This has brought systems back to life for me too. Not this time. As happens all too often, it couldn’t find the Windows 2000 installation, so it couldn’t repair it.

The drive seemed to work, yet the system wouldn’t boot from it. I could have, and probably should have, put it in another PC to make sure it was readable, but I didn’t have a suitable donor handy. Had there been one, I would have checked whether the drive was readable and probably run CHKDSK against it.

Lacking a suitable donor, instead I located an unused hard drive and put it in the system. I booted off the drive just to make sure it wasn’t a hardware problem. It wasn’t–an old copy of Windows 98 booted and dutifully spent 20 minutes installing device drivers for the new motherboard hardware. So I powered down, installed both drives, and broke out a copy of Ghost.

Ghost, as I have said before, doesn’t exactly copy data–what it does is better described as reinterpreting the data. This allows you to use Ghost to lay down an image on dissimilar hard drives. It also makes Ghost a fabulous data recovery tool. Ghost complained that the NTFS log needed to be flushed. That requires booting into Windows (and I think that’s all it requires), which I couldn’t do. It offered to try the copy anyway, so I chose that option, and it cranked for about 15 minutes. I exited Ghost, powered down, and disconnected the bad drive. I powered back up, and it booted. Fabulous.
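For anyone wanting to try the same trick, the disk-to-disk copy can be driven from Ghost’s menus or its command line. A sketch, assuming the failing drive shows up as disk 1 and the spare as disk 2–Ghost’s menus show you which is which, so verify before committing:

    :: clone disk 1 (the failing source) onto disk 2 (the spare)
    ghost.exe -clone,mode=copy,src=1,dst=2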

Now I can use Ghost to copy the now-good drive back over to the drive that was bad in the first place. I’ll do that, but sending out the resume takes much higher priority.

Fixing a computer that shows the wrong partition size after resizing

So, I’ve got these Windows 2000 boxes that didn’t have enough space, so I resized some partitions. No error messages, no problems. Then I rebooted, and the drives still showed their old size, even though Disk Management showed the right size.

What gives? Microsoft acknowledges this issue in Windows XP but hasn’t released a fix yet. And these boxes aren’t even XP; they’re 2000.

I’ve got a crazy solution.

If you have a copy of Ghost by Symantec, take a Ghost image of the partition that’s sized wrong. Then, immediately after creating the image, write the image back to the partition you just Ghosted.

Makes no sense, right? But Ghost doesn’t do a bit-by-bit copy. It makes sure it gets good copies of your files, but it saves an interpretation of the partition rather than the partition itself. So when it writes the image back, minor errors that were there before get wiped out.
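In Ghost terms, the trick is a partition dump followed immediately by a partition load. A sketch, with the disk and partition numbers and the image path as placeholders for whatever applies on your machine:

    :: image the mis-sized partition (disk 1, partition 2 in this example)...
    ghost.exe -clone,mode=pdump,src=1:2,dst=e:\fix.gho
    :: ...then write the same image straight back over that partition
    ghost.exe -clone,mode=pload,src=e:\fix.gho:1,dst=1:2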

Now, why there can’t be a disk utility that does this same thing to a partition without the imaging runaround, I don’t know.

I just know I’ve brought a lot of computers with weird disk problems back to life over the years by making Ghost images of them and then writing the image back. This one today is just the latest in that long line.

Disk defragmentation in Windows 2000, XP, and, uh, NT4

The disk defragmenter that Microsoft includes with Windows 2000 and XP really stinks up the place.

I’ve been playing with an alternative. It’s free. It’s called DIRMS, an acronym for Do It Right Microsoft.

It’s text mode. That means NT and XP owners can schedule defragmentation runs without paying for Diskeeper, which is good, because Diskeeper is barely better than Microsoft’s built-in defragmenter–no surprise, since both were written by the same company, Executive Software.

DIRMS uses the same built-in defragmentation API, so it ought to be safe, but it uses a different algorithm. Whereas Executive’s programs won’t even try to defragment a file they can’t defragment completely, DIRMS just does the best it can. And unlike Diskeeper and the built-in Defrag, it moves files to the front of the disk, just like Win98’s Defrag, which increases performance further.

I’m not ready to put it on any system I care about yet, but I think it has a lot of potential. Rather than running Defrag four times to clean up a disk, I can run DIRMS followed by Defrag to mop up the operation and get a disk that’s almost 100% defragmented.
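Here’s a sketch of that tandem run on XP (Windows 2000 has no command-line defragmenter, so there you’d finish with the GUI instead). I’m going from memory on DIRMS’s arguments, so treat them as assumptions and check the usage text it prints when run with no arguments:

    :: DIRMS first, to pack files toward the front of the disk
    dirms.exe c compact
    :: then the built-in defragmenter to mop up whatever is left
    defrag.exe c: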

In fact, the two programs seem to do better in tandem than either could ever do on its own. At any rate, it’s free, and worth checking out.

Things to look for in a flatbed scanner

David Huff asked today about scanners, and I started to reply as a comment but decided it was too long-winded and ought to be a separate discussion.

So, how does one cut through the hype and get a really good scanner for not a lot of money?

The short answer to David’s question is that I like the Canon CanoScan LiDE series. Both my mom and my girlfriend have the LiDE 80 and have been happy with it.

For the long answer to the question, let’s step through several things that I look for when choosing a scanner.

Manufacturer. There are lots of makers of cheap and cheerful scanners out there. Chances are there are some cheap and nasty ones too. Today’s cheap and nasty scanners will be a lot better than 1995’s crop of cheap and nasties, if only because the PC parallel port most of those plugged into was a huge source of incompatibilities, but I want a scanner from a company with some experience making scanners and with good chances of still being around in five years.

Driver support. Much is made of this issue, but past track record isn’t much of an indicator of future results. HP and Umax infamously began charging for updated drivers, for example. But at least I could get a driver from HP or Umax, even if it cost money. My Acer scanner is forever tethered to a Windows 98 box because I can’t get a working Windows 2000 or XP driver for it.

Umax used to have a stellar track record for providing scanner drivers, which was why I started buying and recommending them several years ago. I don’t know what their current policy is, but I know some people have sworn them off because they have charged for drivers, at least for some scanners, in the recent past. In many cases, though, you can get newer drivers from Umax UK.

But that’s why I like to stick with someone like Canon, HP, Umax, or Epson, who’ve been making scanners for several years and are likely to continue doing so. Even if I have to pay for a driver, I’d rather pay for one than not be able to get one. Keep in mind that you’ll be running Windows XP until at least 2006 anyway.

Optical resolution. Resolution is overrated, like megahertz. It’s what everyone plays up. It’s also a source of confusion. Sometimes manufacturers play up interpolated resolution or somesuch nonsense. This is where the scanner fakes it. It’s nice to have, but there are better ways to artificially increase resolution if that’s what you’re seeking.

Look for hardware or optical resolution. Ignore interpolated resolution.

Back to that overrated comment... Few of us need more than 1200dpi optical resolution. For one thing, not so long ago, nobody had enough memory to hold a decent-sized 4800dpi image in order to edit it–a 4x6-inch print scanned at 4800dpi in 24-bit color works out to 19,200 by 28,800 pixels, or about 1.5 gigabytes uncompressed. If you’re scanning images to put them on the Web, remember that computer screens generally range from 75 to 96dpi; anything more than that just slows the download. For printing, higher resolution is useful, but there’s little to no point in your scanner having a higher resolution than your printer.

I just did a search, and while I was able to find inkjet printers with a horizontal resolution of up to 5760dpi, I found exactly one printer with a vertical resolution of 2400dpi. The overwhelming majority were 1200dpi max, going up and down.

Your inkjet printer and your glossy magazines use different measurements for printing, but a true 1200dpi is going to be comparable to National Geographic quality. If your photography isn’t up to National Geographic standards, megaresolution isn’t going to help it.

Bit depth. If resolution is the most overrated factor, bit depth is the most underrated. Generally speaking, the better the bit depth, the more accurate the color recognition. While even 24 bits gives more colors than the human eye can distinguish, there is a noticeable difference in accuracy between scans done on a 24-bit scanner and scans from a 36-bit scanner.

If you have to choose between resolution and bit depth, go for bit depth every time. Even if you intend to print magazines out of your spare bedroom or basement. After all, if the color on the photograph is off, nobody is going to pay any attention to how clear it is.

Size and weight. Some flatbed scanners are smaller and lighter than a laptop. If they can draw their power from the USB port, so much the better. You might not plan to take one with you, but it’s funny how unplanned things seem to happen.

So there is a benefit to running Windows Server 2003 and XP

One of the reasons Windows Server 2003 and XP haven’t caught on in corporate network environments is that Microsoft has yet to demonstrate any real benefit to either one of them over Windows 2000.

Believe it or not, there actually is one benefit. It may or may not be worth the cost of upgrading, but if you’re buying licenses now and installing 2000, this information might convince you it’s worth it to install the current versions instead.

The benefit: NTFS compression.

Hang on there, Dave, I hear you saying. NTFS compression has been around since 1994, and hard drives are bigger and cheaper now than ever before. So why do I want to mess around with risky data compression?

Well, data compression isn’t fundamentally risky–this site uses data compression, and I’ve got the server logs that prove it works just fine–it just got a bad rap in the early 90s when Microsoft released the disastrous DoubleSpace with DOS 6.0. And when your I/O bus is slow and your CPU is really fast, data compression actually speeds things up, as people who installed DR DOS on their 386DX-40s with a pokey 8 MHz ISA bus found out in 1991.

So, here’s the payoff with NTFS compression when it’s used on Windows Server 2003 with XP clients: the data travels from the server to the clients in compressed form.

If budget cuts still have you saddled with a 100 Mb or, worse yet, a 10 Mb network, that compression will speed things up mightily. It won’t help you move JPEGs around your network any faster, but Word and Excel documents sure will zoom around a lot quicker, because those types of documents pack down a long way.

The faster the computers on both ends, the better this works. And if the server has one or more multi-GHz CPUs, compression won’t slow down disk writes much either. You can also use this strategically: don’t compress the shares belonging to your graphic artists and web developers, for instance. Their stuff tends not to compress well, and if any of them are using Macintoshes, the server has to decompress it to send it to the Macs anyway.

But for shares that are primarily made up of files created by MS Office, compress away and enjoy your newfound network speed.
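Turning it on is a one-liner with the compact utility that ships with NT-family Windows. Here I’m assuming a share living at D:\Shares\Docs; adjust to taste:

    :: /c compresses, /s recurses into subdirectories, /i ignores errors
    compact /c /s:D:\Shares\Docs /i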