Why I generally buy AMD

I was talking to a new coworker today and of course the topic of our first PCs came up. His was Cyrix-based. I didn’t mention my first PC (it seems I’m about four years older–it was an Am486SX2/66).

With only a couple of exceptions, I’ve always bought non-Intel PCs. Most of the Intel PCs I have bought have been used. One boss once went so far as to call me anti-corporate.

I’m not so much anti-corporate as I am pro-competition.

Run the right version of Windows for your PC

I said I was done writing about system optimization. I changed my mind. I have one more thing, and it seems appropriate, now that Vista upgrades are available.

Be very wary about upgrading your version of Windows. There are a few Vista-only titles out there, and there will be more, but the majority of titles aren’t Vista-only. Walk into a software aisle and you’ll still find a lot of software that will run on Windows 95 (or possibly 98), assuming the computer meets the hardware requirements.

I’m typing this on an 800 MHz HP Pavilion 6835. Sure, it’s outmoded–for around $125, I could swap in an Athlon 64 motherboard that would give me 4-5x the CPU power and still be considered a low-end PC by today’s standards–but this one’s peppy. I run Windows ME on it. Windows 2000 would be more stable, but I’m lazy. I wouldn’t try XP on it; when XP came out, this system was already old.

Technically, XP will install on a 133 MHz Pentium if it has enough RAM. I’ve seen it done, and I’ve seen it try to run on one. It’s not pretty. I really wouldn’t try running XP on anything less than a 1 GHz PC with 256 megs of RAM, because that was the standard PC at the time of XP’s release. But believe it or not, if you install Windows 95 and Office 95 on that Pentium-133, it’s a reasonably nice machine–because that was a high-end box in 1995 when Windows 95 and Office 95 came out.

So when you’re refurbishing an old machine, try to install whatever the current version of Windows was when it was new. The PC will run a lot better. Here’s a guide.

Windows 95: Released August 1995
Typical PC of the time: 486, 66 MHz
Hot PC of the time: Pentium, 133 MHz

Windows NT 4.0: Released July 1996
Typical PC of the time: Pentium, 75 MHz
Hot PC of the time: Pentium Pro, 200 MHz

Windows 98: Released June 1998
Typical PC of the time: Pentium, 233 MHz
Hot PC of the time: Pentium II, 333 MHz

Windows 2000: Released February 2000
Typical PC of the time: Pentium III or Athlon, 600 MHz
Hot PC of the time: Pentium III or Athlon, 1 GHz

Windows XP: Released October 2001
Typical PC of the time: Pentium 4, 1.5 GHz
Hot PC of the time: Pentium 4 or Athlon, 2+ GHz

Windows Vista: Released January 2007
From what I understand, even a hot PC of 2007 has difficulty running it. I haven’t seen Vista yet; my employer is still running XP for everything.

Of course, if you install as much memory as the system will take, you can push your limits, since Windows is often more memory-bound than CPU-bound. I also try to replace the hard drive with the fastest model I can budget for. Don’t worry if the drive has a faster DMA rate than the controller on the board; you’ll still benefit from the faster seek times and better throughput of a newer drive. And if the new drive saturates the bus, so much the better–I guarantee the old one didn’t.

The best defragmenter for Windows NT, 2000, XP and Vista

Want Diskeeper’s features without ponying up 50 bucks?

Sorry, I can’t help you. The combination of My Defrag, Scandefrag, and Pagedefrag is better and it’s free.

Scandefrag kicks off defragmentation during the boot process, as early as it can. It works better on NT-based systems like Windows 2000 and XP than it does on 98 or ME. It doesn’t do any defragmenting itself; all it does is launch the other tools.

Pagedefrag is, of course, a classic. It’s just convenient to bundle it up with these other tools. This tool defragments your registry and swap file(s) at boot time, which is the only time the system allows it.
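
If I remember its switches right, Pagedefrag can also be driven from the command line, which is handy for scripting. This schedules a one-time defrag at the next boot:

pagedefrag -o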

My Defrag (actually Jeroen Kessels’ defragmenter) is, to put it simply, the best general-purpose defragmenter for Windows NT, 2000 and XP that I’ve ever seen. Period.

If My Defrag can’t do an ideal job, it does the best it can do. Some defragmenters leave a file alone if they can’t defragment it, but this one will defragment as much as possible and move it as close to the front of the disk as possible, where performance is much better. On full disks, this is important. Since ideal conditions almost never exist (except when a system is first built), a defragmenter’s performance under less than ideal conditions is very important.

The most exciting thing about My Defrag is its ability to sort files. I like Sort alphabetically.

Sorting alphabetically (the -a7 switch) helps because it uses the full pathname. This means all of your files that are part of, say, Mozilla Firefox will be put as close together on the disk as possible, so when you launch Firefox, all of those files are close together and the disk head doesn’t have to move around a lot. The result is an application that launches faster.
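
If you’re running the command-line version, the invocation looks something like this, if I have the syntax right (the executable name varies by version; it was distributed as JkDefrag for a long time, so check the documentation that comes with your copy):

jkdefrag -a 7 c: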

So how often should you defragment? Once a year, I would do a boot-time defragmentation with Scandefrag to whip the Registry and swap files into shape. When that finishes, I would run My Defrag in full optimization mode, with file sorting. If you make a major change to your system (say, upgrading your office suite), do a quick defragmentation after the install and a full defragmentation a month or so after.

As part of your routine system maintenance, a faster, automatic defrag with no options specified is a good idea on occasion. The author says to do it no more than once a day and I agree. In my experience, once a week or even once a month is almost always fine. The way My Defrag works, the system shouldn’t get terribly fragmented on a daily basis, even if you use your system heavily. Defragmenting too frequently can shorten a hard disk’s life expectancy, although the occasional defragmentation seems to help it. I defragment a few times a year (and always have), and I generally get five or six years out of a hard disk, which is a year or two longer than most experts say to expect.
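
If you want to automate that routine defrag, the at scheduler built into NT, 2000, and XP can run it in the middle of the night. The path here is hypothetical; point it at a one-line batch file containing whatever defrag command you settle on:

at 03:00 /every:Su c:\tools\defrag.cmd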

Don’t waste your money on any other tools. Download this trio, install it, use it, and watch your system performance climb.

Don’t overlook thrift stores when looking for software

Need a cheap copy of Windows or Office? Don’t need the newest, buggiest, clunkiest version?

Visit your local Salvation Army Thrift Store. I was flipping through CDs at a Salvation Army store over the weekend. The software was mixed in with the music. I found several copies of Windows 95 and Windows NT 4.0, and numerous copies of Office 97, all marked at $3.

Windows 98 is more useful, which is probably why I didn’t find any copies of it. But NT4 is reasonably fast and stable (by Microsoft standards) as long as your hardware is supported.

Office 97, on the other hand, has all the major functionality of later versions but is a lot less CPU- and memory-intensive. Remember, when it came out, 133 MHz PCs were above average, and 32 MB of RAM was usually considered excessive.

Just make sure the disc is original, the right disc is in the case, and it includes the CD key. I found a number of odd things in Windows 95 CD cases–some more useful than Win95 and some a whole lot less. Not that it would have mattered: they would have required a different CD key from the one on the jewel case.

And if you’re going to run this stuff on a computer connected to the Internet, make sure you’re sitting behind a reasonably good firewall. A Linksys router or wireless access point is perfectly adequate. Microsoft no longer provides security fixes for this old software, so you could be more susceptible to attacks than someone running the latest and worst.

I was definitely glad to stumble across a source of legal and useful commercial software. I know it’s just a matter of time before I’ll need it, and I’d much rather pay $3 for Office 97 than $300 for a newer version that didn’t really add anything useful besides ribbon toolbars, new Clippy animations, and a soundtrack by Robert Fripp.

Are computer repair people all amateurs like this BBC reporter says?

I saw this link on Slashdot to a BBC story that calls all computer technician types “unqualified amateurs.”

I think I resent that. I happen to think I’m pretty good. Understand, I got that way by being very bad for a very long time. But I will admit I’ve met a lot of IT people, and very few have impressed me. Most are better at sounding like they know what they’re talking about than they are at actually accomplishing anything. I once worked with someone who had the longest resume I’ve ever seen. He claimed to be a budding Windows NT Server administrator with experience in every application you can think of. I got suspicious when he didn’t know how to use a mouse properly. I got severely torqued off when I wrote a whiteboard full of detailed instructions on how to Ghost a PC, left for an hour, and came back to find he had completed three machines, and two of those incorrectly.

But that’s not everyone.

I’m seldom impressed with in-store technicians either. But I can tell you why. The big-box stores have difficulty keeping their good technicians. Headhunters are constantly scouring those stores in search of talent, and it’s only a matter of time before anyone who’s good leaves for greener pastures–namely, a job with fixed or semi-fixed hours and benefits.

So, no, I don’t let my friends take their computers to those places.

I’ve thought about doing what the BBC author did: Posting a notice somewhere offering computer help to home users. I’ve done a bit of it on the side in years past. But there’s a problem. Generally, too many people call, and too often.

Sometimes people seem to think they’re entitled to free computer help for life because they paid you $40 once. Other times they just keep calling you. My biggest problem with it as a part-time gig is that it’s too easy to get buried in it. I work too many hours as it is to come home to three more hours of part-time work every night.

As a full-time gig it would be more tempting, but the problem there is self-employment. The government is likely to take half of what you earn, so in order to make what you’d make in someone else’s employ, you really need to double the number.

That’s my deterrent. There are too many broken computers out there to do this part time. But are there enough broken computers nearby that I could fix eight of them a day, about 260 days a year, and make enough money to make it worth my while?
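
To put numbers to it: at the $40 I mentioned earlier, eight repairs a day for 260 days is 2,080 jobs, or $83,200 a year. If the government takes half, that leaves roughly $41,600, before expenses.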

So that’s why I don’t operate a computer repair business out of my home. If someone bribes me enough, I’ll fix theirs, but I can think of better outlets for my entrepreneurial ability.

Just don’t call me an unqualified amateur.

If you’re concerned you might be talking to a hack in a store, here are some questions you can ask to gauge knowledge.

I’ve been messing around with Backup Exec 10

Veritas is trying mightily to unseat Microsoft as my least-favorite software company. I do believe Backup Exec to be the worst piece of software of any kind on the market. In fact, babysitting Backup Exec is the reason I haven’t been around much.

I’m looking to version 10 for some relief (and the much-needed 1.0 quality that Microsoft usually delivers around version 3–when Veritas will deliver it is probably an interesting calculus problem). The downside to version 10: I’m told there’s no more Windows NT 4.0 support. Can’t back ’em up. I haven’t actually tried installing the remote agent on an NT4 box to see if it’s unsupported as in we-won’t-help-when-it-breaks or unsupported as in no-can-do. Smart businesses hocked their NT4 servers a couple of years ago. I won’t say anything else, except that not every business is smart.

More downside: If a tape fills up and you can’t change it because the server is offsite and/or behind locked doors that require approval from 14 middle managers and a note from your mother to get to, under some circumstances Backup Exec 10 will hang indefinitely while cancelling the job. Version 9 had the same problem. Bouncing the services will usually relieve the hang, but sometimes you have to reboot.

It’s tempting to put Backup Exec and your tape drive on your biggest file server to get faster backups. But trust me, if you put it on a server that’s dedicated to backups–its day job can be as a domain controller or some other role that’s shared by multiple, redundant machines–you’ll thank yourself. It’s very nice to be able to reboot your Backup Exec server without giving your seven bosses something else besides the cover sheet on your TPS reports to grumble about.

If you must put Backup Exec on your file server, set up DFS and mirror the file shares to another server. It doesn’t have to be anything fancy–just something that can prop things up while the server’s rebooting. And run Windows 2003, because it boots fast.

The upside: I can make Backup Exec 9.1 die every time by creating a direct-to-tape job and running it concurrently with a disk-to-disk-to-tape job. The tape portion of the second job bombs without fail. Veritas technical support told me that bug was fixed in 9.1 SP1. It wasn’t. But it is fixed in 10.

There are some other features in 10, like synthetic backups, that promise to speed backups along. That would be very nice. It would also be nice if it were reliable.

I’m not going to put it in production yet–when I first deployed 9, it fixed a lot of problems but it made a whole bunch of new ones–but maybe, just maybe, Backup Exec 10 will do what it’s supposed to do well enough that I can work something close to regular hours again.

Otherwise I’ll look forward to Backup Exec 11 and hope that it features more changes than just a new Symantec black-and-gold color scheme and wizards featuring Peter Norton. We’ll see.

So, do you still think having Internet Explorer on your server is a good idea?

Microsoft is making its updates to IE available only for Windows XP.

To which I say, what about all of those servers out there? Surely they include Server 2003 in this. But that’s a problem. Upgrading to Server 2003 isn’t always an option. Some applications only run on Windows NT 4.0, or on Windows 2000.

Unfortunately, sometimes you have to have a web browser installed on a server to get updates, either from your vendor or from MS. Windows Update, of course, only works with Internet Explorer.

One option is to uninstall Internet Explorer using the tools from litepc.com. A potentially more conservative option is to keep IE installed, use it exclusively for Windows Update, and install another lightweight browser for searching knowledge bases and downloading patches from vendors. Offbyone is a good choice. It has no Java or Javascript, so in theory it should be very secure. It’s standalone, so it won’t add more muck to your system. To install it, copy the executable somewhere. To uninstall it, delete the executable.

An even better option is to run as few servers on Windows as possible, since Microsoft insists on installing unnecessary and potentially exploitable software on servers–Windows Media Player and DirectX are other glaring examples–but I seem to hold the minority opinion on that. Maybe now that they willfully and deliberately install security holes on servers and refuse to patch them unless you run the very newest versions, that will change.

But I’m not holding my breath.

Munich\’s unexpected migration costs prove nothing so far

I saw an article in the Toronto Star in which Steve Ballmer was, um, well, talking gleefully about the city of Munich’s highly publicized and controversial migration to Linux, server to desktop, costing more money than expected.

So I suppose Mr. Ballmer is prepared to reimburse one of my clients for its unexpected expenses in migrating from VMS to Windows then, eh?

Yeah, that’s what I thought. I wouldn’t call myself a migration specialist, per se, but it seems that during my career, just as often as not I’ve been involved in projects that are migrations to something or other, and more often than not, they’ve been migrations to Windows. I helped one of the first OS/2 networks outside of IBM itself migrate to Windows NT. I helped lots of smaller clients migrate from various versions of Mac OS to Windows NT. I’ve done a couple of small projects that migrated something Windows- or VMS-based to Linux. Last year I helped a client migrate from VMS to Windows 2003. Right now I’m working on a project that migrates another client from VMS to Windows 2000/2003.

I’m not trying to prove that I’m a migration expert, but I do think I’ve learned a few things along the way. And one of the first things I learned is that if you’re trying to migrate in order to save money right away, you’re migrating for the wrong reason and your project is probably going to fail very quickly. It’s very hard for a migration to save you that much money that quickly, and if it does, then that means its predecessor was so broken that somebody ought to be fired for not replacing it five years earlier.

The other thing I’ve learned is that a migration always, always has unexpected costs, for a very simple reason: it’s impossible to know everything that’s going on on your network. I don’t know everything that’s going on on my home network, and most of the time, I’m the only one using it.

You might say I’m scatterbrained. I say you might be right. But let me give you an example from a network other than mine. At my first job, they decommissioned DOS-based WordPerfect years before I started working there. But since the system didn’t prevent people from installing software, people just smuggled in their copies of WordPerfect from home, installed them, and went right on using the program, creating new data. Then I came along to migrate them to Windows NT, and they planned the same charade all over again. Only this time, they weren’t able to install their copies of WordPerfect. When told it was illegal to install and we weren’t going to do it, they said they needed that data in order to do their jobs.

That, my friend, is an unexpected expense.

The city of Munich undoubtedly has data in obsolete formats, being used every day by people, without anyone else knowing about it. I have a client still running something they rely on every day in dBASE II. Yes, TWO! Yes, when the account manager told me that, I made a joke about CP/M. For those of you who haven’t been around that long, dBASE II was obsoleted more than 20 years ago, although some people continued to use it after it was replaced by dBASE III. Some longer than others, it seems…

In this line of work, you find weird stuff. I know weird stuff is attracted to me, but I know I’m not the only one who finds this.

And weird stuff like that, my friend, can sometimes be an unexpected major expense.

The unexpected expenses my current client incurred in its migration paid for me to have a box full of my dad’s old Lionel trains fixed up better than new, and then to buy a bunch of new stuff. Trust me, it wasn’t cheap. And trust me, only a fraction of what my employer got trickled down to me.

I’m sure the city of Munich went into this knowing some or all of this. I’m also sure this wasn’t about money, even though Microsoft is gloating about money now.

What Steve Ballmer wants everyone to forget is that Microsoft came in with the lowest bid. Maybe not initially, but in the end they did. And Munich went with a Linux-based solution anyway.

Why? I’ll tell you why. New Microsoft Office releases every two years. New versions of operating systems every three to four years. Bloated service packs released every year that guarantee you’ll have to replace your hardware every three years. Annual antivirus subscription rates. Lost productivity when a virus slips through the cracks anyway. Lost productivity when spyware breaks some required business app.

MCSEs work cheap, and the software is inexpensive at first. But you get nickel-and-dimed to death.

Linux is more costly than expected this year. But the next four years will be less expensive than anticipated.

And Munich may be betting on that.

Optimizing Windows networks

My church’s IT czar came to me with a problem the other day. His network performance was erratic and Network Neighborhood was messed up. Some computers saw different views of the network, although if you manually connected to other computers, that usually worked.
There are probably 35 or so computers on the network now, so it’s no longer a small network. He asked a few good questions, and the tips that came out of the discussion bear repeating here.

1. Establish a master browser. There’s supposed to be one and only one keeper of the Network Neighborhood’s directory, if you will. Whenever a Windows computer comes online, it calls for an election. Usually the winner of the election makes sense. But sometimes a computer that has no business winning the election wins. Or sometimes the computers seem to get confused about who won the election.

Networks shouldn’t be like the U.S. political system.

Windows NT, 2000, and XP boxes run a service called Computer Browser. Ideally, you want one master browser and a couple of backups online all the time. So pick four computers that are likely to always be on, preferably running Windows 2000 or XP (since they’re likely to be newer computers). Then turn the Computer Browser service off on all but those four. Browser elections and related bureaucracy can chew up 30% of your network bandwidth in worst-case situations, so this can be worth doing even if you’re not yet experiencing the problem.
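
On Windows XP, turning the service off from a command prompt looks like this (on 2000, sc comes from the Resource Kit, or you can just use the Services control panel; the space after start= is a quirk sc insists on):

net stop Browser
sc config Browser start= disabled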

2. Use WINS. Unless you have an Active Directory domain and you’re running DNS on Windows 2000 or 2003 Server, Windows boxes have to broadcast because they don’t know the addresses of any other computers on the network. All that broadcast traffic chews up bandwidth and can cause other unusual behavior. WINS is basically like Windows-proprietary DNS. Set up WINS on one of your Windows servers, if you have one, or on a Linux box running Samba, and you’ll end up with a faster, more reliable network.
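
If you go the Samba route, making a Linux box a WINS server is a one-line job in the [global] section of smb.conf; then point each Windows machine at that box’s IP address in its TCP/IP settings, or hand it out via DHCP:

wins support = yes

Restart Samba and you have a WINS server.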

If you’re running a home network with fewer than 10 PCs, this probably isn’t worth the effort–especially the WINS server. The Computer Browser service might be worth disabling but more because it’ll save you a little bit of memory. If you’re a large enterprise with hundreds or thousands of computers running that service, the freeware PSTools suite from Sysinternals has some command-line utilities that can help you turn off services remotely, to avoid the daunting task of visiting every desk.
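
For instance, PsService from that suite can disable the service on a remote machine. The computer name here is made up, and you’ll need administrative rights on the target:

psservice \\officepc1 setconfig Browser disabled
psservice \\officepc1 stop Browser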

Easy and secure remote Linux/Unix file transfers with SCP

Sometimes you need to transfer files between Linux boxes, or between a Linux box and some other box, and setting up Samba or some other form of network file system may not be practical (maybe you only need to transfer a couple of files, or maybe it’s just a one-time thing) or possible (maybe there’s a firewall involved).
Well, you should already have SSH installed on your Linux boxes so you can remotely log in and administer them. On Debian, apt-get install ssh covers both the client and the server. If you’re running a distro based on Red Hat or UnitedLinux, you may have a little investigative work to do. (I’d help you, but I haven’t run anything but Debian for two or three years.)

The cool thing about SSH is that it not only does remote login, but it will also do remote file transfer. And unlike FTP, you don’t have to stumble around with a clumsy interface.

If you want to transfer files from a Windows box, just install PuTTY. I just downloaded the 240K PSCP.EXE file and copied it into my Windows directory. That way I don’t have to mess with paths, and it’s always available. Make sure you’re downloading the right version for your CPU. The Windows NT Alpha version won’t run on your Intel/AMD/VIA CPU. Incidentally, Putty.exe is a very good Telnet/SSH client and a must-have if you’re ever connecting remotely to Unix/Linux machines from Windows.

SSH includes a command called SCP. SCP works almost like the standard Unix CP command. All you have to do to access a remote file is prefix its path with a username, an @ sign, the IP address of the remote server, and a colon. SCP will then prompt you for a password.

Let’s say I want to move a file from my Linux workstation to my webserver:

scp logo.jpg root@192.168.1.2:/var/www/images

SCP will prompt me for my password. After I enter it, it’ll copy the file, complete with a nice progress bar and an ETA.

On a Windows machine with PuTTY installed, simply substitute the command pscp for scp.
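
So the upload from earlier becomes:

pscp logo.jpg root@192.168.1.2:/var/www/images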

I can copy the other way too:

scp root@192.168.1.2:/var/www/index.php .

This command will grab a file from my webserver and drop it in the current working directory.

To speed up the transfers, add the -C switch, which turns on compression.
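
For example, to repeat the earlier upload with compression turned on:

scp -C logo.jpg root@192.168.1.2:/var/www/images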

SCP is more secure than just about any other common means of file transfer, it’s probably easier (since you already need SSH anyway), and since it’ll do data compression, it’s probably faster too.