The best defragmenter for Windows NT, 2000, XP and Vista

Want Diskeeper’s features without ponying up 50 bucks?

Sorry, I can’t help you. The combination of My Defrag, Scandefrag, and Pagedefrag is better and it’s free.

Scandefrag defragments your system during the boot process, as early as it can–all it really does is launch the other tools at boot time. It works better on NT-based systems like Windows 2000 and XP than it does on 98 or ME.

Pagedefrag is, of course, a classic. It’s just convenient to bundle it up with these other tools. This tool defragments your registry and swap file(s) at boot time, which is the only time the system allows it.

My Defrag (actually Jeroen Kessels’ defrag) is, to put it simply, the best general purpose defragmenter for Windows NT, 2000 and XP that I’ve ever seen. Period.

If My Defrag can’t do an ideal job, it does the best it can do. Some defragmenters leave a file alone if they can’t defragment it, but this one will defragment as much as possible and move it as close to the front of the disk as possible, where performance is much better. On full disks, this is important. Since ideal conditions almost never exist (except when a system is first built), a defragmenter’s performance under less than ideal conditions is very important.

The most exciting thing about My Defrag is its ability to sort files. I like Sort alphabetically.

Sorting alphabetically (the -a7 switch) helps because it uses the full pathname. This means all of your files that are part of, say, Mozilla Firefox will be put as close together on the disk as possible, so when you launch Firefox, all of those files are close together and the disk head doesn’t have to move around a lot. The result is an application that launches faster.

So how often should you defragment? Once a year, I would do a boot-time defragmentation with Scandefrag to whip the Registry and swap files into shape. When that finishes, I would run My Defrag in full optimization mode, with file sorting. If you make a major change to your system (say, upgrading your office suite), do a quick defragmentation after the install and a full defragmentation a month or so after.

As part of your routine system maintenance, a faster, automatic defrag with no options specified is a good idea on occasion. The author says to do it no more than once a day and I agree. In my experience, once a week or even once a month is almost always fine. The way My Defrag works, the system shouldn’t get terribly fragmented on a daily basis, even if you use your system heavily. Defragmenting too frequently can shorten a hard disk’s life expectancy, although the occasional defragmentation seems to help it. I defragment a few times a year (and always have), and I generally get five or six years out of a hard disk, which is a year or two longer than most experts say to expect.

Don’t waste your money on any other tools. Download this trio, install it, use it, and watch your system performance climb.

A better registry cleaner

Note: I wrote this back in the Windows XP days. It worked really well under XP, but if you’re going to run the registry cleaner portion in Windows 7 or Windows 10, be sure to create a restore point first.

I’ve been messing around with a registry cleaner called CCleaner. I like it a lot better than the commercial tools that used to come with Norton Utilities and the like, and I like it better than the freebies that we used to use like Microsoft’s Regclean.

And you’ll never beat the price. CCleaner runs on Windows 95, 98, 98SE, ME, NT4, 2000, XP, and Vista.

One thing that I liked about it is that the program is intelligent and relatively dummy-proof. If you click around and accept all of the defaults, it’s not likely to harm your computer. I inadvertently wiped out my Firefox browser history (I wanted to keep that) but that’s not a showstopper. It will populate itself again in a few weeks. Unlike commercial utility suites, where I’ve written 20-page explanations of how to use them safely, this program doesn’t really need any explanation.

CCleaner actually does more than just clean up the Registry, although it does a fine job of that. It also does a great job of weeding out useless temporary files. I ran it on my old laptop and it found 386 megabytes of junk on my crowded C drive. I’ve been cleaning that drive up by hand, and I think I do a pretty good job of finding a lot of stuff, but what can I say? The program found 386 megs of stuff that I didn’t.

There are three benefits to getting rid of that cruft. First, Windows needs quite a bit of free space just to function properly. When you start getting too little free space, the system just acts goofy. Second, large numbers of temp files in the system directory just seem to make the system act funny. This was a bigger problem in Windows 9x than in the newer NT-based Windows versions, but there’s still no reason to have hundreds of those lying around. In my desktop support days, just getting rid of temp files used to clear up all sorts of mysterious problems. And finally, not having all those large and useless files on the disk makes your defragmentation programs work better. Those programs need free space to work with, and they don’t have to work as hard when they don’t have hundreds of extra worthless files to move around.

Cleaning the Registry is another important job, since a lot of uninstallation programs don’t do a very thorough job of cleaning up after themselves. The extra bloat chews up memory and slows down searches for the legitimate data the programs you actually use need. Since I tend not to install many programs and I use most of the ones I do install, CCleaner didn’t find a whole lot in my Registry, but it found some stuff to clean up.

So what happened after I ran it? The most noticeable effects were that my Start menu was a lot peppier, and my Web browsers loaded and ran a little bit faster. I understand the Web browser speedup, but the Start menu puzzled me a bit. Not that I’m complaining–it’s irritating when you press Start and have to wait for your list of programs to come up.

CCleaner isn’t a miracle worker and it won’t turn my P3-700 into a Core Duo, but the two systems I’ve run it on do run noticeably faster afterward. It was certainly more than worth the 10 minutes it took for me to download it and run it on each.

So what about the commercial utility suites? Skip them. In this day and age, there are better, free alternatives for everything those utility suites could do. CCleaner is one of the superstars. In coming days, I’ll talk about free substitutes for the other most important components of the utility suites.

Vindicated?

This article on Windows installation at Firing Squad preaches all the same things I was preaching nearly six years ago in my Windows 9x book.

Where to find the stuff has almost all changed, and most of the old utilities don’t work anymore, but these are exactly the same concepts I yammered on and on about. Funny, I’ve been told system optimization is a waste of time… Incidentally, this is the second article on optimization that I’ve seen in less than a month. The other one read an awful lot like a Windows XP translation of an article I published in Computer Shopper UK back in 2000, which in turn was a shortened version of one of the chapters in the same book.

So I guess people don’t just throw their 2-gigahertz computers away and buy new ones when they start to seem slow?

It really makes me wonder what would have happened if those 3,000 copies of the book that languished in a warehouse in Tennessee had made their way into stores up north, back when the book received a gushing review in Canada and was perpetually sold out there.

That’s OK. That was five years ago, nothing can change it, and I really don’t have any desire to be a computer author anymore. I find the only way to really know a lot about computers is to work with them for 40-60 hours a week in a production environment. Labs don’t cut it–never underestimate the effect of 1,000+ users hammering on what you built. Never. And if you spend those hours working, that doesn’t leave enough time to write books and release them in a timely fashion.

So rather than write mediocre computer books or send myself to an early grave by working full time in addition to writing for 30-45 hours a week, I’d rather have a life, make a decent living, and not write computer books.

Setting the MTU automatically in Debian

I run this website off a server running on an ADSL line in a spare bedroom of my home. That causes some weird issues from time to time, like the time I had to figure out setting the MTU automatically in Debian.

Why Debian?

My server of choice is Debian: it’s Linux, which makes it fast, reliable, and cheap, and Debian makes it pretty easy to install only what you want and need. That lets me run a server OS in only 125 megabytes of disk space, leaving most of my drive available for content. I like having space for content.

This will work in Debian-derived systems like Ubuntu and Linux Mint too.

PPPoE issues

Now, the downside of modern DSL: Southwestern Bell, like most ISPs these days, uses PPPoE. So not only can your IP address change with no notice whatsoever, you also have the hassles of PPPoE. With the default settings, some unknown but noticeably large percentage of web users won’t be able to access a web server running on a DSL connection using PPPoE.

The reason is MTU and fragmentation. Yes, you remember MTU if you used Windows 9x back in the bad old days of dialup. Tweakers would play around with the MTU settings in hopes of squeezing just a little more performance out of their 56K modems, and they would swear that one utility did a better job than any others, or that this MTU setting was optimal and the conventional wisdom of 576 stank… Remember those flamewars?

Well, with broadband, theoretically the right setting to use is 1500. Trouble is, PPPoE steals some of that packet space, and the result is something worse than slow speeds. In some scenarios, you completely vanish.
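
For what it’s worth, the arithmetic behind those numbers is simple enough to sketch in a few lines of shell. The 8-byte PPPoE overhead is standard; 1472 happens to be the classic largest ping payload that fits an unfragmented 1500-byte frame, which may be why that number keeps turning up. Whether your connection needs the MTU itself dropped that far, as mine apparently does, is something only experimentation will tell you.

```shell
#!/bin/sh
# Back-of-envelope MTU arithmetic for PPPoE. A sketch only; the value
# that actually works on your line can vary from ISP to ISP.
ETHERNET_MTU=1500
PPPOE_OVERHEAD=8      # 6-byte PPPoE header + 2-byte PPP protocol field
IP_ICMP_HEADERS=28    # 20-byte IP header + 8-byte ICMP header

echo "Typical PPPoE MTU: $((ETHERNET_MTU - PPPOE_OVERHEAD))"
echo "Largest unfragmented ping payload: $((ETHERNET_MTU - IP_ICMP_HEADERS))"
```

That prints 1492 and 1472 respectively.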

Setting the MTU

The way to make my website reappear is to issue the command ifconfig eth0 mtu 1472. The exact number doesn’t seem to matter much; for me, 1472 appears to be the maximum I can use. (It can vary from ISP to ISP, in case you’re wondering.)

Excellent. Problem solved.

Not so fast. Problem solved until my server reboots. Linux doesn’t need to reboot, right? Right you are. But here in St. Louis, the power likes to hiccup a lot, especially in the summertime. My server is on a UPS, but every once in a while, in the middle of the night, there must be a long enough power failure that my UPS dies, because every once in a while I fall off the ‘net again.

To set the default MTU permanently–that is, to change it automatically on bootup–one normally would change the ifup script or the rc.startup script. Except Debian doesn’t have either of those.

My kludgy solution: cron. Log in as root, issue the command crontab -e, and add the following line:

*/2 * * * * ifconfig eth0 mtu 1472

With this in place, no more than a couple of minutes will elapse between the time my power comes back on for good and the time I reappear on the ‘net. I can live with that.
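
A somewhat cleaner alternative, if your interface is statically configured: ifupdown, the package behind /etc/network/interfaces on Debian, understands an mtu option, so a stanza along these lines sets the value at every boot without involving cron. The addresses below are placeholders–substitute your own.

```
# /etc/network/interfaces (fragment)
auto eth0
iface eth0 inet static
    address 192.168.1.10
    netmask 255.255.255.0
    gateway 192.168.1.1
    mtu 1472
```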

More Debian tricks

If you change your network card or motherboard, you don’t have to reinstall Debian but you may lose eth0. Here’s how to get eth0 back. And here’s how to get a list of the installed packages in Debian.

Shrinking Windows 9x

There seems to be a competition to see how small one can make Windows 9x and have it still boot into a GUI. The latest salvo in this war reduces Win98SE to under 5 megs.

People brag about how fast Windows runs when you do this. Well, yeah! Look at the file listing and the most crowded directory is 28 entries. I’ve seen 1,000+ files in C:\Windows at times. Since FAT is very efficient when dealing with small numbers of files (MS themselves said in the DOS 5 manual to never put more than 100 files in a directory) but inefficient when not, it’s no wonder to me that Windows, cut down this much, can boot in seconds. A computer’s disk is its biggest bottleneck, and the FAT filesystem doesn’t help.

The only problem is, as far as I can tell, Windows cut this small has no networking capabilities or anything else interesting besides a GUI. Which raises the question: Whatcha gonna do with it now? These days, an OS without Internet connectivity and some means to print isn’t very useful to anyone. I know that eliminates two of the three reasons I wanted a computer in the first place.

Why my ramdisk techniques don’t work with XP

I got a question today in a roundabout way asking about ramdisks in Windows, specifically, where to find my instructions for loading Win98 into a ramdisk, and how to do the same in XP.

I haven’t thought about any of this kind of stuff for more than two years. It seems like two lifetimes.

The original instructions appeared in my book, Optimizing Windows (now in the half-price bin at Amazon.com), and instructions to use DriveSpace to compress the disk appear here. You can get the freeware xmsdisk utility this trick requires from simtel.

These techniques absolutely do not work with Windows NT4, 2000, or XP. Despite the similar name, Windows NT/2000/XP are very different operating systems from Windows 9x. Believe it or not, they’re much more closely related to IBM’s OS/2 than they are to Windows 98. Since there is no DOS lying underneath it all, there’s no easy way to do the trickery that the bootable ramdisk tricks use. What these two tricks do is intercept the boot process, copy Windows into the ramdisk, then continue booting.

There’s a $99 piece of software called SuperSpeed that gives the NT-based operating systems this capability. I haven’t used it. I imagine it works using the same principle, hooking into the boot process and moving stuff around before booting continues.

The downside, no matter what OS you use, is the boot time. XP boots in seconds, and my book talks about the trickery necessary to get 95 and 98 to boot in 30 seconds or less. But any time you’re moving a few hundred megs or–yikes–a gig or two of data off a disk into a ramdisk, the boot process is going to end up taking minutes instead.

Is it worth it? For some people, yes. It’s nice to have applications load instantly. A lot of things aren’t CPU intensive. You spend more time waiting for your productivity apps to load than you do waiting for them to do anything. Web browsing and e-mail are generally more bandwidth- and disk-intensive than they are CPU-intensive (although CSS seems determined to change that).

But a lot of games aren’t especially disk-intensive, with the possible exception of when they’re loading a new level. So loading the flavor-of-the-week FPS game into a ramdisk isn’t going to speed it up very much.

Of course, XP is far, far more stable than 98. Windows 9x’s lack of stability absolutely drives me up the wall, and for that matter, I don’t think 2000 or XP are as stable as they should be. Given the choice between XP or 98 in a ramdisk, I’d go for XP, with or without speedup utilities.

I’ve made my choice. As I write, I’m sitting in front of a laptop running 2000 (it’s VPNed into work so I can keep an eye on tape backup jobs) and a desktop PC running Linux. I have a 400 MHz Celeron with Windows 98 on it, but it’s the last Win9x box I have (I think I had 4 at one point when I was writing the aforementioned book). Sometimes I use it to play Baseball Mogul and Railroad Tycoon. Right now it doesn’t even have a keyboard or monitor connected to it.

I guess in a way it feels like hypocrisy, but I wrote the first couple of chapters of that book with a word processor running in Red Hat Linux 5.2 (much to my editor’s chagrin), so I started down that path a long, long time ago.

Windows 98 CD-ROM drive not working? Try this

Occasionally, a PC’s CD or DVD-ROM drive will stop responding for no apparent reason. Sometimes the problem is hardware–a CD-ROM drive, being a mechanical component, can fail–but as often as not, it seems, the problem is software rather than hardware. Here’s what to do when a Windows 95 or Windows 98 CD-ROM drive stops working even though the same drive works just fine under another OS.

If Windows has both 16- and 32-bit CD-ROM drivers, it can get confused and disable the drive to protect itself. The solution is to remove the 16-bit driver, then delete the obscure NoIDE registry key to re-enable the 32-bit driver.
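
If memory serves, NOIDE is a value under HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\VxD\IOS, so on Windows 9x a .reg file along these lines (the trailing minus tells Regedit to delete the value) should do the deed. Verify the key exists on your own machine before merging anything, and reboot afterward.

```
REGEDIT4

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\VxD\IOS]
"NOIDE"=-
```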

Microsoft’s Slammer pain is good for everybody

SQL Slammer hit where it counts, including HP–historically, one of the biggest Microsoft supporters around–and Microsoft itself.

This is good. Really good.

Microsoft is one of its own biggest customers. Part of this is due to one of the worst cases of not-invented-here syndrome in the industry, and part of it is marketing. If Microsoft can run its enterprise on mostly its software, its argument that you ought to be able to run all of yours on it is much stronger.

When Microsoft feels our pain, that’s good. Problems generally get fixed. Not necessarily the way we want them fixed, but fixed. When Microsoft for whatever reason doesn’t feel our pain, things languish. Witness the development of Windows 9x late in its lifecycle, after Microsoft was able to run everything internally, including laptops, on Windows 2000. While Windows 98SE was fairly good, all things considered, Windows Me was so horrid that one of my magazine editors wrote me and asked me the least painful way to escape it. Windows Me was fast, but it was less stable than 98SE.

So why did Slammer hurt so much? The patches were difficult to install, poorly tested, poorly documented, and it was extremely difficult to know when you needed them. Microsoft’s inability to keep its own servers sufficiently patched illustrates this.

Several things are likely to happen now. People will take non-Microsoft solutions more seriously and, in some cases, deploy them. A less homogeneous Internet is good for everybody. Meanwhile, Microsoft will be cleaning up its act, making it easier to ensure that their patches actually work and can be deployed with reasonable ease.

I still think we’ll have disasters like SQL Slammer again. But this is a good step in the right direction.

Windows potpourri

I’ll give some random Windows tips tonight, since it’s getting late and I don’t really want to think–just some stuff I’ve been putting off. Let’s talk utilities and troubleshooting.

Utilities first. Utilities are more fun. So let’s talk about a pair of reader submissions, from Bryan Welch.

Proxomitron. Bryan wondered if I’d ever heard of it because I’d never mentioned it. I’m sure I mentioned it on my page at editthispage.com because I ran Proxomitron for a couple of years. Proxomitron is a freeware proxy server that blocks ads, Javascript, cookies, and just about anything else undesirable. I’ve found that these days I get everything I need from Mozilla–it blocks popups just fine, and I can right-click and pick “Block images from this server” when I run across an objectionable ad, and of course I have GIF animation turned off and Flash not installed. That works for me, and it saves me memory and CPU time.

But if you want more than Mozilla gives you off the shelf, Proxomitron will give it to you. I used to recommend it wholeheartedly. I haven’t looked at a recent version of it but I’d be shocked if it’s changed much. If any of that interests you, I’m sure you’ve already run off to download it. It runs on any version of Windows from Win95 on.

98lite. Most of my readers run Windows 2000 or XP at this point, but about 20% of you are still running Win98 or WinMe. If you want to get a little extra speed, download and run 98lite to remove Internet Explorer and other not-quite-optional-but-mostly-useless cruft. It’s been pretty well established that Windows 9x runs 20-25% faster with IE gone. That’s more improvement than you’ll get from overclocking your CPU. Or from any single hardware upgrade, in most cases.

If you need IE, 98lite can still help you–it can break the desktop integration and speed things up for you, just not as much.

If you’re still running 98, I highly recommend it. How much so? When I was writing Optimizing Windows, Shane Brooks probably would have given me a copy of it, on the theory that its mention in a book would generate at least some sales he wouldn’t get otherwise. I mentioned it (I think I dedicated half a chapter to it), but I didn’t ask him for one. I registered the thing. If I liked it enough to pay for it when I probably didn’t have to, that ought to say something.

Troubleshooting. Let’s talk about troubleshooting Windows 2000 and XP.

Weird BSODs in Premiere under Windows 2000. I haven’t completely figured out the pattern yet, but my video editing computer gets really unstable when the disk gets jammed. A power play at church forced me to “fork” my new video–my church gets its edited, censored, changed-for-the-sake-of-change version (pick one) while everyone else gets the slightly longer how-the-guy-with-the-journalism-degree-intended-it version. Re-saving a second project filled up nearly all available disk space and the machine started bluescreening left and right. Since I did some cleanup last week, freeing up over a gig on each of my drives, and then defragmented, it’s been rock solid.

So if you run Premiere and it seems less than stable, try freeing up some disk space and defragmenting. It seems to be a whole lot more picky than any other app I’ve ever seen. I suspect it’s Premiere that’s picky about disk space and one or more of the video codecs that’s picky about fragmentation. But if you’re like me, you don’t really care which of them is causing the BSODs, you just want it to stop.

Spontaneous, continuous Explorer crashes in Windows 2000. Yeah, the same machine was doing that too. I finally traced the problem to a corrupt file on my desktop. I don’t know which file. I found a mysterious file called settings.ini or something similar. I don’t know if deleting that was what got me going again or if it was some other file. But if Explorer keeps killing itself off on you and restarting and you can’t figure out why, try opening a command prompt, CD’ing to your desktop, and deleting everything you find. (I found I had the same problem if I opened the desktop directory window in Explorer while logged on as a different user, which was how I stumbled across the command line trick.)

I can’t say I’ve ever seen this kind of behavior before. First I thought I had a virus. Then I thought I had a corrupt system file somewhere. I’m glad the problem turned out to have a simple cure, but I wish I’d found that out before I did that reinstall and that lengthy virus scan…

Defragging jammed drives in Windows 2000 and XP. If you don’t have 15% free space available to Defrag (and how it defines “available” seems to be one of the great mysteries of the 21st century), it’ll complain and not do as good a job as it should. In a pinch, run it anyway. Then run it again. Often, the available free space will climb slightly. You’ll probably never get the drive completely defragmented but you should be able to improve it at least slightly.

Increase the speed of your Web pages

There are commercial utilities that will optimize your HTML and your images, cutting the size down so your stuff loads faster and you save bandwidth. But I like free.

I found free.

Back in the day, I told you about two programs, one for Windows and one for Unix, that will crunch down your JPEGs by eliminating metadata that’s useless to Web browsers. The Unix program will also optimize the Huffman tables and optionally resample the JPEG into a lossier image, which can net you tremendous savings but might also lower image quality unacceptably.

Yesterday I stumbled across a program on Freshmeat called htmlcrunch that strips extraneous whitespace from HTML and XML files. Optionally, it will also remove comments. The program works in DOS–including under a command prompt in Windows 9x/NT/2000/XP, and it knows how to handle long filenames–or Unix.

It’s not advertised as such, but I suspect it ought to also work on PHP and ASP files.

How much it will save you depends on your coding style, of course. If you tend to put each tag on one line with lots of pretty indentation like they teach in computer science classes, it will probably save you a ton. If you code HTML like me, it’ll save you somewhat less. If you use a WYSIWYG editor, it’ll probably save you a fair bit.

It works well in conjunction with other tools. If you use a WYSIWYG editor, I suggest you run the code through HTML Tidy first. HTML Tidy, unlike htmlcrunch, actually interprets the HTML and removes some troublesome information. In some cases HTML Tidy will add characters, but this is usually a good thing–its changes improve browser compatibility. If you feed HTML Tidy a bunch of broken HTML, it’ll fix it for you.

You can further optimize your HTML with the help of a pair of Unix commands. But you run Windows? No sweat. You can grab native Windows command-line versions of a whole slew of Unix tools in one big Zip file here.

I’ve found that these HTML tools sometimes leave stray spaces and line breaks between HTML elements under some circumstances. Whether this is intentional or a bug in the code, who knows. But it’s easy enough to clean up with the Unix tr command, which here strips out the line breaks:

tr -d "\r\n" < index.html > indexopt.html

Some people believe that Web browsers parse 255-character lines faster than any other line length. I’ve never seen this demonstrated. And in my experience, any Web browser parses straight-up HTML plenty fast no matter what, unless you’re running a seriously, seriously underpowered machine, in which case optimizing the HTML isn’t going to make a whole lot of difference. Also in my experience, every browser I’ve looked at parses CSS entirely too slowly. It takes most browsers longer to render this page than it takes for my server to send it over my pokey DSL line. I’ve tried mashing my stylesheets down, and multiple 255-character lines versus no line breaks whatsoever made little, if any, difference.

But if you want to try it yourself, pass your now-optimized HTML file(s) through the standard Unix fmt command, like so:

fmt -w 255 index.html > index255.html
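
Put together, the whole pass can be sketched as one small pipeline: strip the line breaks, then let fmt rewrap the result at up to 255 columns. The sample file and filenames here are just for illustration.

```shell
#!/bin/sh
# Build a small sample file so the pipeline has something to chew on.
printf '<p>hello world</p>\n<p>more text</p>\n' > index.html

# Delete the line breaks, then rewrap the result at up to 255 columns.
tr -d '\r\n' < index.html | fmt -w 255 > index255.html
cat index255.html
```

On the sample input, the two lines end up joined together on a single line, since the whole thing fits well under 255 characters.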

Optimizing your HTML files to the extreme will take a little time, but it’s probably something you only have to do once, and your page visitors will thank you for it.