Don’t overlook thrift stores when looking for software

Need a cheap copy of Windows or Office? Don’t need the newest, buggiest, clunkiest version?

Visit your local Salvation Army Thrift Store.

I was flipping through CDs at a Salvation Army store over the weekend. The software was mixed in with the music. I found several copies of Windows 95 and Windows NT 4.0, and numerous copies of Office 97, all marked at $3.

Windows 98 is probably more useful, which may be why I didn’t find any copies of it. But NT4 is reasonably fast and stable (by Microsoft standards) as long as your hardware is supported.

Office 97, on the other hand, has all the major functionality of later versions but is a lot less CPU- and memory-intensive. Remember, when it came out, 133 MHz PCs were above average, and 32 MB of RAM was usually considered excessive.

Just make sure the disc is original, the right disc is in the case, and it includes the CD key. I found a number of odd things in Windows 95 CD cases–some more useful than Win95 and some a whole lot less. Not that it would have mattered either way, since those discs would have required a different CD key from the one on the jewel case.

And make sure that if you’re going to run this stuff and connect the computer to the Internet that you’re sitting behind a reasonably good firewall. A Linksys router or wireless access point is perfectly adequate. Microsoft no longer provides security fixes for this old software, so you could be more susceptible to attacks than someone running the latest and worst.

I was definitely glad to stumble across a source of legal and useful commercial software. I know it’s just a matter of time before I’ll need it, and I’d much rather pay $3 for Office 97 than $300 for a newer version that didn’t really add anything useful besides ribbon toolbars, new Clippy animations, and a soundtrack by Robert Fripp.

Are computer repair people all amateurs like this BBC reporter says?

I saw this link on Slashdot to a BBC story that calls all computer technician types “unqualified amateurs.”

I think I resent that.

I think I happen to be pretty good. Understand, I got that way by being very bad for a very long time. But I will admit I’ve met a lot of IT people, and very few have impressed me. Most are better at sounding like they know what they’re talking about than they are at actually accomplishing anything. I once worked with someone who had the longest resume I’ve ever seen. He claimed to be a budding Windows NT Server administrator with experience in every application you can think of. I got suspicious when he didn’t know how to use a mouse properly. I got severely torqued off when I wrote a whiteboard full of detailed instructions on how to Ghost a PC, left for an hour, and came back to find he had completed only three machines, two of those incorrectly.

But that’s not everyone.

I’m seldom impressed with in-store technicians either. But I can tell you why. The big-box stores have difficulty keeping their good technicians. Headhunters are constantly scouring those stores in search of talent, and it’s only a matter of time before anyone who’s good leaves for greener pastures–namely, a job with fixed or semi-fixed hours and benefits.

So, no, I don’t let my friends take their computers to those places.

I’ve thought about doing what the BBC author did: Posting a notice somewhere offering computer help to home users. I’ve done a bit of it on the side in years past. But there’s a problem. Generally, too many people call, and too often.

Sometimes people seem to think they’re entitled to free computer help for life because they paid you $40 once. Other times they just keep calling you. My biggest problem with it as a part-time gig is that it’s too easy to get buried in it. I work too many hours as it is to come home to three more hours of part-time work every night.

As a full-time gig it would be more tempting, but the problem there is self-employment. Thanks to self-employment taxes, the government is likely to take half of your earnings, so to make what you’d make in someone else’s employ, you really need to double the number.

That’s my deterrent. There are too many broken computers out there to do this part time, but are there enough broken computers nearby that I could fix eight of them a day, about 260 days a year, and earn enough to make it worth my while?

So that’s why I don’t operate a computer repair business out of my home. If someone bribes me enough, I’ll fix theirs, but I can think of better outlets for my entrepreneurial ability.

Just don’t call me an unqualified amateur.

If you’re concerned you might be talking to a hack in a store, here are some questions you can ask to gauge knowledge.

I’ve been messing around with Backup Exec 10

Veritas is trying mightily to unseat Microsoft as my least-favorite software company. I do believe Backup Exec to be the worst piece of software of any kind on the market. In fact, babysitting Backup Exec is the reason I haven’t been around much.

I’m looking to version 10 for some relief (and the much-needed 1.0 quality that Microsoft usually delivers around version 3–when Veritas will deliver it is an interesting calculus problem).

The downside to version 10: I’m told there’s no more Windows NT 4.0 support. Can’t back ’em up. I haven’t actually tried installing the remote agent on an NT4 box to see whether it’s unsupported as in we-won’t-help-when-it-breaks or unsupported as in no-can-do. Smart businesses hocked their NT4 servers a couple of years ago. I won’t say anything else, except that not every business is smart.

More downside: If a tape fills up and you can’t change it because the server is offsite and/or behind locked doors that require approval from 14 middle managers and a note from your mother to get to, under some circumstances Backup Exec 10 will hang indefinitely while cancelling the job. Version 9 had the same problem. Bouncing the services will usually relieve the hang, but sometimes you have to reboot.
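When it does hang, bouncing the services from a command prompt beats clicking through the Services console. A sketch–the display names below are the ones I see on a typical Backup Exec install, but they vary by version, so verify yours with net start first:

```
:: Stop the Backup Exec services (names vary by version; check "net start").
net stop "Backup Exec Job Engine"
net stop "Backup Exec Server"

:: Start them again in the reverse order.
net start "Backup Exec Server"
net start "Backup Exec Job Engine"
```

If the services refuse to stop, that’s usually when a reboot becomes unavoidable.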

It’s tempting to put Backup Exec and your tape drive on your biggest file server to get faster backups. But trust me, if you put it on a server that’s dedicated to backups–its day job can be as a domain controller or some other task that’s shared by multiple, redundant machines–you’ll thank yourself. It’s very nice to be able to reboot your Backup Exec server without giving your seven bosses something else besides the cover sheet on your TPS reports to grumble about.

If you must put Backup Exec on your file server, set up DFS and mirror the file shares to another server. It doesn’t have to be anything fancy–just something that can prop things up while the server’s rebooting. And run Windows 2003, because it boots fast.

The upside: I can make Backup Exec 9.1 die every time by creating a direct-to-tape job and running it concurrently with a disk-to-disk-to-tape job. The tape portion of the second job will bomb every time. Veritas technical support tells me that bug was fixed in 9.1SP1. It wasn’t. But it’s fixed in 10.

There are some other features in 10, like synthetic backups, that promise to speed backups along. That would be very nice. It would also be nice if it would be reliable.

I’m not going to put it in production yet–when I first deployed 9, it fixed a lot of problems but it made a whole bunch of new ones–but maybe, just maybe, Backup Exec 10 will do what it’s supposed to do well enough that I can work something close to regular hours again.

Otherwise I’ll look forward to Backup Exec 11 and hope that it features more changes than just a new Symantec black-and-gold color scheme and wizards featuring Peter Norton. We’ll see.

So, do you still think having Internet Explorer on your server is a good idea?

Microsoft is making its updates to IE available only for Windows XP.

To which I say, what about all of those servers out there?

Surely they include Server 2003 in this. But that’s a problem. Upgrading to Server 2003 isn’t always an option. Some applications only run on Windows NT 4.0, or on Windows 2000.

Unfortunately, sometimes you have to have a web browser installed on a server to get updates, either from your vendor or from MS. Windows Update, of course, only works with Internet Explorer.

One option is to uninstall Internet Explorer using the tools from litepc.com. A potentially more conservative option is to keep IE installed, use it exclusively for Windows Update, and install another lightweight browser for searching knowledge bases and downloading patches from vendors. Offbyone is a good choice. It has no Java or Javascript, so in theory it should be very secure. It’s standalone, so it won’t add more muck to your system. To install it, copy the executable somewhere. To uninstall it, delete the executable.

An even better option is just to run as few servers on Windows as possible, since they insist on installing unnecessary and potentially exploitable software on servers–Windows Media Player and DirectX are other glaring examples of this–but I seem to hold the minority opinion on that. Maybe now that they wilfully and deliberately install security holes on servers and refuse to patch them unless you run the very newest versions, that will change.

But I’m not holding my breath.

Munich’s unexpected migration costs prove nothing so far

I saw an article in the Toronto Star in which Steve Ballmer was, um, well, talking gleefully about the city of Munich’s highly publicized and controversial migration to Linux, server to desktop, costing more money than expected.

So I suppose Mr. Ballmer is prepared to reimburse one of my clients for its unexpected expenses in migrating from VMS to Windows then, eh?

Optimizing Windows networks

My church’s IT czar asked me a good question the other day. His network performance was erratic and Network Neighborhood was messed up. Some computers saw different views of the network, although if you manually connected to other computers, that usually worked.
There are probably 35 or so computers on the network now, so it’s no longer a small network. He asked a few good questions, and the tips that came out of the discussion bear repeating here.

1. Establish a master browser. There’s supposed to be one and only one keeper of the Network Neighborhood’s directory, if you will. Whenever a Windows computer comes online, it calls for an election. Usually the winner of the election makes sense. But sometimes a computer that has no business winning the election wins. Or sometimes the computers seem to get confused about who won the election.

Networks shouldn’t be like the U.S. political system.

Windows NT, 2000, and XP boxes run a service called Computer Browser. Ideally, you want one master browser and a couple of backups online all the time. So pick four computers that are likely to always be on, preferably running Windows 2000 or XP (since they’re likely to be newer computers). Then turn the Computer Browser service off on all but those four computers. Browser elections and related bureaucracy can chew up 30% of your network bandwidth in worst-case situations, so this can be worth doing even if you’re not yet experiencing the problem.
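On the machines that shouldn’t participate, disabling the service from a command prompt is quick. A minimal sketch for 2000/XP, where the Computer Browser service’s short name is Browser:

```
:: Stop the Computer Browser service now...
net stop Browser

:: ...and keep it from starting at the next boot.
:: (Note the required space after "start=" in sc.exe's syntax.)
sc config Browser start= disabled
```

Run it on everything except your designated master browser and its backups.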

2. Use WINS. Unless you have an Active Directory domain and you’re running DNS on Windows 2000 or 2003 Server, Windows boxes have to broadcast because they don’t know the addresses of any other computers on the network. All that broadcast traffic chews up bandwidth and can cause other unusual behavior. WINS is basically like Windows-proprietary DNS. Set up WINS on one of your Windows servers, if you have one, or on a Linux box running Samba, and you’ll end up with a faster, more reliable network.
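If you go the Samba route, turning the box into a WINS server is a one-line change. This is a minimal sketch of the relevant smb.conf settings–the rest of your [global] section stays whatever it already is. Afterward, point each Windows client at the Samba box’s IP address in its TCP/IP WINS settings (or hand it out via DHCP).

```
# /etc/samba/smb.conf (relevant [global] lines only)
[global]
   # Act as a WINS server for this network
   wins support = yes

   # Do NOT also set "wins server = ..." on this same box;
   # the two options are mutually exclusive in Samba.
```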

If you’re running a home network with fewer than 10 PCs, this probably isn’t worth the effort–especially the WINS server. The Computer Browser service might be worth disabling but more because it’ll save you a little bit of memory. If you’re a large enterprise with hundreds or thousands of computers running that service, the freeware PSTools suite from Sysinternals has some command-line utilities that can help you turn off services remotely, to avoid the daunting task of visiting every desk.
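For example, Sysinternals’ PsService (part of PsTools) can stop and reconfigure a service on a remote machine without leaving your desk. A sketch–the machine name and credentials here are made-up examples:

```
:: Stop the Computer Browser service on a remote machine.
psservice \\WKSTN-042 -u DOMAIN\admin -p password stop Browser

:: Set its startup type to disabled so it stays off after a reboot.
psservice \\WKSTN-042 -u DOMAIN\admin -p password setconfig Browser disabled
```

Wrap that in a for loop over a list of machine names and the daunting task becomes an afternoon job.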

Easy and secure remote Linux/Unix file transfers with SCP

Sometimes you need to transfer files between Linux boxes, or between a Linux box and some other box, and setting up Samba or some other form of network file system may not be practical (maybe you only need to transfer a couple of files, or maybe it’s just a one-time thing) or possible (maybe there’s a firewall involved).
Well, you should already have SSH installed on your Linux boxes so you can remotely log in and administer them. On Debian, apt-get install ssh. If you’re running a distro based on Red Hat or UnitedLinux, you may have a little investigative work to do. (I’d help you, but I haven’t run anything but Debian for two or three years.)

The cool thing about SSH is that it not only does remote login, but it will also do remote file transfer. And unlike FTP, you don’t have to stumble around with a clumsy interface.

If you want to transfer files from a Windows box, just install PuTTY. I just downloaded the 240K PSCP.EXE file and copied it into my Windows directory. That way I don’t have to mess with paths, and it’s always available. Make sure you’re downloading the right version for your CPU. The Windows NT Alpha version won’t run on your Intel/AMD/VIA CPU. Incidentally, Putty.exe is a very good Telnet/SSH client and a must-have if you’re ever connecting remotely to Unix/Linux machines from Windows.

SSH includes a command called SCP. SCP works almost like the standard Unix cp command. All you have to do to access a remote file is prefix its path with a username, the @ sign, and the IP address of the remote server, followed by a colon. SCP will then prompt you for a password.

Let’s say I want to move a file from my Linux workstation to my webserver:

scp logo.jpg root@192.168.1.2:/var/www/images

SCP will prompt me for my password. After I enter it, it’ll copy the file, displaying a nice progress bar and an ETA as it goes.

On a Windows machine with PuTTY installed, simply substitute the command pscp for scp.

I can copy the other way too:

scp root@192.168.1.2:/var/www/index.php .

This command will grab a file from my webserver and drop it in the current working directory.

To speed up the transfers, add the -C switch, which turns on compression.

SCP is more secure than most other common means of file transfer, it’s probably easier (since you already need SSH anyway), and since it’ll do data compression, it’s often faster too.
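One more trick: if you find yourself scripting these transfers, key-based authentication gets rid of the password prompt. A minimal sketch–the key path and the addresses are just examples:

```shell
# Generate a passwordless RSA keypair (-N "" = empty passphrase, -q = quiet).
rm -f /tmp/scpkey /tmp/scpkey.pub
ssh-keygen -q -t rsa -b 2048 -N "" -C "scp automation key" -f /tmp/scpkey

# The public half goes into ~/.ssh/authorized_keys on the remote box;
# ssh-copy-id (shipped with OpenSSH) automates that step:
#   ssh-copy-id -i /tmp/scpkey.pub root@192.168.1.2
# After that, scp -i skips the password prompt entirely:
#   scp -C -i /tmp/scpkey logo.jpg root@192.168.1.2:/var/www/images
ls -l /tmp/scpkey /tmp/scpkey.pub
```

Guard the private key file as carefully as you would the password it replaces.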

More on building under a small Linux environment

Well, I’ve been playing a little bit with Erik Anderson’s uClibc-based development environment mentioned in the previous two posts.
When I compile, I issue the command export CFLAGS='-Os -s -mcpu=i386 -march=i386' to create small-as-possible binaries. Using the default flags, the Links web browser balloons to nearly 2.6 megs on my dual Celeron, mostly due to the debug symbols. It drops to around 760K with those options. Specifying i386 binaries shrinks them down at the expense of some speed on some CPUs (especially 486s and first-generation Pentiums), so you have to set your priorities. It doesn’t matter nearly as much on newer CPUs. But I’m pretty sure if you’re interested in uClibc you’re not just running it on Pentium 4s.

For the record, Links compiles without warnings without doing anything special to its configuration and seems to run without incident (I immediately used it to locate and download more source code to compile). Samba’s more difficult, giving some warnings in various places. It may or may not require some special configuration in order to actually run (I didn’t have time tonight to test it), and of course that could result in some reduced functionality. The binaries total 9.3 meg, which isn’t bad considering it implements a complete Windows NT-compatible file server as well as some simple client utilities for connecting to NT shares on a network. The files themselves are about 20% smaller than on a stock Debian system.

Erik Anderson says the majority of Unix software will compile under uClibc, which is probably true. I see compiler warnings occasionally even when using a completely mainstream system.

Why my ramdisk techniques don’t work with XP

I got a question today in a roundabout way asking about ramdisks in Windows, specifically, where to find my instructions for loading Win98 into a ramdisk, and how to do the same in XP.
I haven’t thought about any of this kind of stuff for more than two years. It seems like two lifetimes.

The original instructions appeared in my book, Optimizing Windows (now in the half-price bin at Amazon.com), and instructions to use DriveSpace to compress the disk appear here. You can get the freeware xmsdisk utility this trick requires from simtel.

These techniques absolutely do not work with Windows NT4, 2000, or XP. Despite the similar name, Windows NT/2000/XP are very different operating systems from Windows 9x. Believe it or not, they’re much more closely related to IBM’s OS/2 than they are to Windows 98. Since there’s no DOS lying underneath it all, there’s no easy way to do the trickery the bootable ramdisk tricks rely on. What those tricks do is literally intercept the boot process, copy Windows into the ramdisk, then continue booting.

There’s a $99 piece of software called SuperSpeed that gives the NT-based operating systems this capability. I haven’t used it. I imagine it works using the same principle, hooking into the boot process and moving stuff around before booting continues.

The downside, no matter what OS you use, is the boot time. XP boots in seconds, and my book talks about the trickery necessary to get 95 and 98 to boot in 30 seconds or less. But any time you’re moving a few hundred megs or–yikes–a gig or two of data off a disk into a ramdisk, the boot process is going to end up taking minutes instead.

Is it worth it? For some people, yes. It’s nice to have applications load instantly. A lot of things aren’t CPU-intensive. You spend more time waiting for your productivity apps to load than you do waiting for them to do anything. Web browsing and e-mail are generally more bandwidth- and disk-intensive than they are CPU-intensive (although CSS seems determined to change that).

But a lot of games aren’t especially disk-intensive, with the possible exception of when they’re loading a new level. So loading the flavor-of-the-week FPS game into a ramdisk isn’t going to speed it up very much.

Of course, XP is far, far more stable than 98. Windows 9x’s lack of stability absolutely drives me up the wall, and for that matter, I don’t think 2000 or XP are as stable as they should be. Given the choice between XP or 98 in a ramdisk, I’d go for XP, with or without speedup utilities.

I’ve made my choice. As I write, I’m sitting in front of a laptop running 2000 (it’s VPNed into work so I can keep an eye on tape backup jobs) and a desktop PC running Linux. I have a 400 MHz Celeron with Windows 98 on it, but it’s the last Win9x box I have (I think I had 4 at one point when I was writing the aforementioned book). Sometimes I use it to play Baseball Mogul and Railroad Tycoon. Right now it doesn’t even have a keyboard or monitor connected to it.

I guess in a way it feels like hypocrisy, but I wrote the first couple of chapters of that book with a word processor running in Red Hat Linux 5.2 (much to my editor’s chagrin), so I started down that path a long, long time ago.

What large market for x86 Unix?

In a bizarre turn of events, SCO has sued IBM for not less than $1 billion, claiming IBM willfully destroyed SCO’s business by handing its intellectual property over to the Linux movement.
