Are Google’s corporate perks excessive?

Google’s corporate perks are the subject of a Fortune magazine article. I’m going to take what I suspect is a contrarian view on this. I think Google’s excessive spending on its employee perks is a good thing.

Why? Because I’ve seen what happens with the opposite. I know of one company whose ultimate goal is to use temporary contractors as much as possible. The reason is simple: overhead. Find a company that gives its contractors as little as possible to keep rates low, use those people, and then you don’t have to mess around with benefits like vacation and sick time, or paid holidays aside from Christmas and Thanksgiving.

Personally, I think the guy’s an idiot, and you can quote me on that. I once worked at a struggling company that used a ton of contractors. None of us had any of that messy and expensive sick time. So when a contractor got sick, rather than give up a week’s pay, he or she just sucked down Dayquil like it was water and showed up for work. The result? An epidemic. I’ve never seen so many sick people in September in my life. And guess what? The rest of cold/flu season wasn’t any better.

That particular company wasn’t profitable when I worked there, and it isn’t profitable today. I wonder if it’s because nothing gets done from September to February because everyone’s sick?

I worked someplace else that was paying me about $15,000 less than what the job search engines said I should be making. I was having a hard time paying my bills some months. Did it make it hard for me to concentrate on things at work? Absolutely. I knew from year to year I was only going to get a cost-of-living raise whether I did well or poorly, so I didn’t really try all that hard to excel.

Knowing what I know about that particular employer’s bottom line and customer satisfaction, I suspect they could really have used the results of a couple of my projects from the last year and a half or so.

So when I see that Google gives its employees free food and does their laundry for free and gives them $500 worth of takeout food when they have a baby–among other things–I don’t exactly think that’s a bad idea. When an employee doesn’t have to solve those kinds of personal problems, that’s that much more energy the employee has to devote to the company. And, hopefully, the company’s needs are more interesting to the employee than laundry.

Now I’m not sure that this is universal. A company like Google is going to have a higher rate of return on this kind of investment than, say, Radio Shack.

Let’s take a look at another company. Everybody knows eBay, and the company is always profitable because it doesn’t have to do a lot of work, and it makes money whether the stuff sells or not. It’s a nice situation to be in: Millions of people are working extremely hard to make sure eBay is profitable, simply in hopes of making lots of money themselves (and while some do, many don’t).

Yet eBay’s stock price is in the toilet. The problem is that eBay isn’t growing anymore. They have a monopoly on the online auction business, but they’ve pretty much expanded as much as they can, and the company hasn’t had a second great idea. They’ve had several lousy ideas in the past year, and they’re likely to have a bunch more and lose lots of money in the process of chasing the next great idea.

If Google wants to not be the next eBay, it needs to keep cranking out a steady stream of profitable ideas. Its market share in search keeps growing. Meanwhile, it’s turned advertising into a big cash cow. Maybe YouTube is Google’s next big cash cow. Maybe not, and maybe Google Base is the next one. Or maybe it’s something that hasn’t been publicly unveiled yet.

But the only reason Google got to where it is was because it had lots of brilliant people working for it, and they were free to try lots of wacky ideas. Those wacky ideas that succeeded have turned it into a juggernaut. So I think taking care of the basic needs of those fertile minds is a great idea. That means those minds have that much more energy to concentrate on coming up with great ideas. And if those minds are happy, they’re more likely to come up with great ideas for Google than profitable side projects for themselves.

The formula seems to be working. Google can pretty much hire anyone it wants at this point. The few exceptions I can think of, such as Bill Gates, probably don’t have much to offer Google anyway.

Meanwhile, people are leaving Microsoft like crazy. Whether this is a good thing or bad thing for Microsoft remains to be seen, but Google is able to retain the people it wants to retain, while Microsoft appears to be having trouble doing that.

I think the perks have a lot to do with it.

Of course, the perks won’t do much good if Google doesn’t hire the right people–I can think of some people I know and have known whose extra brainpower isn’t worth having–but Google finds itself in the position of being able to pick and choose its hires.

If Google tanks in five years, people will look back at today as a time when Google blew it by wasting revenue on excesses, but I don’t think Google will tank in five years. I think it’s more likely that in five years, everything that comes to mind when people think of the Internet will be something that Google owns.

It’ll be interesting to see.

Pay off a mortgage in five years

Thanks to some circumstances where somebody knew somebody who knew somebody, I found myself tonight at a seminar where John Cummuta was speaking. He’s the guy who you may have heard on the radio hawking a system called Transforming Your Debt into Wealth. From him, I learned how to pay off a mortgage in five years.

Hopefully I won’t get into too much trouble by presenting the simplified version of his plan. The secret of credit is that creditors will not extend you more credit than you can conceivably pay off in a fairly short length of time (less than a decade, say). The trick is to make that work for you, rather than for them.

His system is simple enough that you can plug it into an Excel worksheet. Mine has three equations in it. Here’s what you do.

Take 10 percent of your monthly income and use it to pay down debt. Pick the debt you can pay off the fastest. Forget interest. Pay the minimum monthly payment on all of your debts except the one you can pay the fastest. Add that 10 percent of your monthly income to the debt you’re working on. So if it’s a credit card balance with a minimum payment of $22, and you make $2,000 a month, you pay $222 towards that credit card.

Then, when that credit card balance is paid off, you take the debt you can pay off second, add its minimum monthly payment to that $222. Keep cascading the payments until you’ve paid everything off.
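The cascade is simple enough to sketch in a few lines of code. This is only a rough illustration of the idea, with made-up debts and income; like the seminar’s formula, it ignores interest and the shrinking balances of the other debts.

```python
# Minimal sketch of the debt-cascade idea described above.
# The debts and income below are hypothetical; interest is ignored,
# as the plan suggests, so this only approximates real payoff dates.

def payoff_months(debts, monthly_income, extra_rate=0.10):
    """debts: list of (name, balance, minimum_payment), ordered
    fastest-payoff-first. Returns {name: month_paid_off}."""
    balances = {name: bal for name, bal, _ in debts}
    minimums = {name: m for name, _, m in debts}
    schedule, month = {}, 0
    snowball = monthly_income * extra_rate   # the 10 percent kicker
    for name, _, _ in debts:
        # This debt gets its minimum payment plus the snowball each month.
        payment = minimums[name] + snowball
        while balances[name] > 0:
            month += 1
            balances[name] -= payment
        schedule[name] = month
        snowball = payment                   # roll the whole payment forward
    return schedule

# The $22-minimum credit card from the example, plus a hypothetical car loan.
schedule = payoff_months(
    [("credit card", 2000, 22), ("car", 9000, 250)],
    monthly_income=2000)
# The card gets $222 a month; once it's gone, the car gets $472 a month.
```

With those numbers, the card is gone in month 10 and the car twenty months later, which is the cascading effect he describes.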

Using that formula, I can have my car paid off in a year and two months, and my house paid off in five years and two months after that.

The more money you can plow into paying off debts, the faster it goes.

He said the interest rates are pretty much irrelevant because you are paying the debts off so quickly. So it doesn’t make sense to refinance or consolidate debts or anything like that because you won’t recoup the closing costs.

The formula is a bit crude because it doesn’t take into account the minimum monthly payments you are making, nor the accumulated interest on the debts on which you’re making minimal progress. But he said those numbers pretty much end up a wash. Following this crude formula, you’ll be within a month or two of the real payoff date.

Also, he suggested putting off investments until you have your debt eliminated. The exception is a 401(k) or similar plan where your employer matches your contributions. The logic is that the compound interest on your debts will almost always be larger than the compound interest your investments can earn.
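That logic is easy to check with back-of-the-envelope arithmetic. The rates here are my own assumptions for illustration (an 18 percent card APR against a 7 percent investment return), not figures from the seminar.

```python
# Hypothetical numbers to illustrate the logic: a dollar used to retire
# an 18% credit-card balance "earns" 18%, versus an assumed 7% return
# if invested instead. Both rates are assumptions, not quoted figures.
balance = 5000.0
card_rate = 0.18      # assumed credit-card APR
invest_rate = 0.07    # assumed investment return

interest_saved = balance * card_rate     # interest you no longer owe
interest_earned = balance * invest_rate  # interest you'd have gained

advantage = interest_saved - interest_earned  # payoff wins by this much/year
```

On those assumptions, every $5,000 of card debt retired comes out about $550 a year ahead of investing the same money.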

However, he did not say you should empty your bank accounts to pay debt. If you have enough money in the bank to be able to take half of it and pay your smallest debt, go ahead and do it, but otherwise leave your existing bank accounts and investments alone, suspend contributing to them (or do the minimum), and then, when you have the debt paid off, you can afford to contribute to them very aggressively. Remember, at the end of the plan, you no longer have those monthly house and car payments to make.

Someone who makes $40,000 a year and works 40 years will make $1.6 million over the course of that career. The idea is to pay as little of it as possible in interest, so that money is working for you instead of your creditors.

It seems to me that debt ought to be like college. It ought to be something we do for a few years in order to get something we need, but after a few years, it’s over. And if we have to make a few sacrifices along the way, just like we did for college, we ought to do them.

Update: It worked. Thanks to finding better-paying jobs and applying the extra income, we were able to pay the mortgage off ahead of schedule.

So there is a benefit to running Windows Server 2003 and XP

One of the reasons Windows Server 2003 and XP haven’t caught on in corporate network environments is that Microsoft has yet to demonstrate any real benefit to either one of them over Windows 2000.

Believe it or not, there actually is one benefit. It may or may not be worth the cost of upgrading, but if you’re buying licenses now and installing 2000, this information might convince you it’s worth it to install the current versions instead. The benefit: NTFS compression.

Hang on there, Dave, I hear you saying. NTFS compression has been around since 1994, and hard drives are bigger and cheaper now than ever before. So why would I want to mess around with risky data compression?

Well, data compression isn’t fundamentally risky–this site uses data compression, and I’ve got the server logs that prove it works just fine–it just got a bad rap in the early 90s when Microsoft released the disastrous Doublespace with DOS 6.0. And when your I/O bus is slow and your CPU is really fast, data compression actually speeds things up, as people who installed DR DOS on their 386DX-40s with a pokey 8 MHz ISA bus found out in 1991.

So, here’s the rub with NTFS compression when it’s used on Windows Server 2003 with XP clients: the data is transferred from the server to the clients in compressed form.

If budget cuts still have you saddled with a 100 Mb or, worse yet, a 10 Mb network, that data compression will speed things up mightily. It won’t help you move jpegs around your network any faster, but Word and Excel documents sure will zoom around a lot quicker, because those types of documents pack down mightily.
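Some back-of-the-envelope arithmetic shows why this matters on a slow network. The file size and the 2:1 compression ratio here are assumptions for illustration; real Office documents often compress even better.

```python
# Rough arithmetic, not a benchmark: how long a 2 MB Word file takes to
# cross a 10 Mb/s link raw, versus travelling in compressed form.
# The 2:1 ratio for Office-style files is an assumption; ratios vary.

file_bytes = 2 * 1024 * 1024          # a 2 MB .doc
link_bits_per_sec = 10 * 1_000_000    # a 10 Mb network, ignoring overhead
compression_ratio = 2.0               # assumed for Office documents

raw_seconds = file_bytes * 8 / link_bits_per_sec
compressed_seconds = raw_seconds / compression_ratio
# Halving the bytes on the wire halves the transfer time -- and already-
# compressed formats like JPEG see no such gain.
```

The slower the link, the bigger the payoff, which is why a 10 Mb network benefits far more than a gigabit one.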

The faster the computers are on both ends, the better this works. And if the server has one or more multi-GHz CPUs, you won’t slow down disk writes much. You can also use this strategically. Don’t compress the shares belonging to your graphic artists and web developers, for instance. Their stuff tends not to compress, and if any of them are using Macintoshes, the server will have to decompress it to send it to the Macs anyway.

But for shares that are primarily made up of files created by MS Office, compress away and enjoy your newfound network speed.

Floppies, meet your replacement

I must be the next-to-last person in the world to spend significant lengths of time experimenting with these, but for the benefit of the last person in the world, I’d like to talk about USB flash drives, also known as thumb drives (for a brand name), pen drives, or keychain drives, because they’re small enough to fit on a keychain. They are, as that popular brand name suggests, about the size of your thumb.

It’s possible to buy one that holds as little as 64 megabytes of data, which is still a lot of Word and Excel files, but currently the sweet spot seems to be 512 megabytes or 1 GB. This is, of course, always a moving target, but as I write, it’s entirely possible to find a 512-meg drive for around $40, although sometimes you have to deal with rebates to get the price that low. It’s harder, but still possible, to get a 1 GB drive for under $90. That will change. Currently a 2 GB drive is more than $200.

I remember when people went ga-ga over a 1 GB hard drive priced at an astounding $399. That price was astoundingly low, and that was only 10 years ago. Progress marches on, and sometimes progress really is an improvement.

The drives are so small because they use flash memory–a type of readable/writable memory chip that doesn’t lose its contents when it loses power. It’s not as fast as RAM, it’s a lot more expensive, and its lifespan is much more finite, so you won’t see flash memory replacing your computer’s RAM any time soon. But as a replacement for the floppy disk, it’s ideal. It’s fast, it’s compatible, and unlike writable CDs and DVDs, it requires no special software or hardware to write.

The drive plugs into a USB port, which is present on nearly every computer made since about 1997. Use with Windows 98 will almost certainly require the installation of a driver (hopefully your drive comes with either a driver or a web site you can use to download a driver–check compatibility before you buy one for Win98), but with Windows 2000, XP, and Mac OS X, these devices should just plug in and work, for the most part. With one Windows 2000 box, I had to reboot after plugging the drive in the first time.

From then on, it just looks like a hard drive. You can edit files from it, or drag files onto it. If the computer has USB 2.0 ports, its speed rivals that of a hard drive. It’s pokier on the older, more common USB 1.1 ports, but still very tolerable.

The only thing you have to remember is to stop the device before you yank it out of the USB port, to avoid data loss. Windows 2000 and XP provide an icon in the system tray for this.

These are great as a personal backup device. They’re small enough to carry with you anywhere–the small flashlight I keep on my keychain is bigger than most of these drives–and it only takes a few minutes to copy your files, so you can also copy them to computers belonging to friends or relatives for safekeeping.

If your only interest in a laptop is carrying work with you–as opposed to being able to cruise the net in trendy coffee shops while you drink a $5 cup of coffee–a pen drive makes a very affordable alternative to a laptop. Plug one into your work computer, copy your files, and take work home with you. Take it on the road and you can plug it into any available computer to do work. It’s not the same as having your computer with you all the time, but for many people, it’s more than good enough, and the drives make a Palm Pilot look portly, let alone a laptop.

So how do you maximize the usable space on these devices? The ubiquitous Zip and Unzip work well, and you can download small command-line versions from info-zip.org. If you want something more transparent, there’s an old PC Magazine utility from 1997, confusingly named UnFrag, that reduces the size of many Word and Excel files. Saving in older file formats can also reduce the size, and it increases the possibility of being able to work elsewhere. Some computers still only have Office 97.

You may be tempted to reformat the drive as NTFS and turn on compression. Don’t. Some drives respond well to NTFS and others stop working. But beyond that, NTFS’s overhead makes it impractical for drives smaller than a couple of gigs (like most flash drives), and you probably want your drive to be readable in as many computers as possible. So FAT is the best option, being the lowest common denominator.

To maximize the lifespan of these drives, reduce the number of times you write to them. It’s better to copy your files to a local hard drive, edit them there, then copy them back to the flash drive. But in practice, their life expectancy is much longer than that of a Zip drive, a floppy, or a CD-RW. Most people are going to find the device is obsolete before it fails.

The technologically savvy can even install Linux on one of these drives. As long as a computer is capable of booting off a USB device, then these drives can be used either as a data recovery tool, or as a means to run Linux on any available computer. 512 megabytes is enough to hold a very usable Linux distribution and still leave some space for data.

Make Linux look like Windows XP

I can’t say I discovered this–I saw a reference to it in User Friendly this past week–but there’s now an XP-lookalike window manager for Linux called XPDE.

A quick look at the screenshots shows it’s a pretty convincing clone. But is it legal? The authors maintain its legality, because it uses no Microsoft code, mentions no Microsoft trademarks, and uses no Microsoft icons. I wish them well, but there is precedent for a copyright infringement finding anyway.

Some 20 years ago, the best-selling spreadsheet (and perhaps best-known piece of software in the world) was Lotus 1-2-3. It was expensive. In 1985, microcomputer pioneer Adam Osborne began predicting the emergence of Lotus 1-2-3 clones priced under $100. The theory was, if one could clone the IBM PC and undercut IBM’s price, why couldn’t the same technique be used to clone expensive software and undercut it in price as well?

Osborne had insider knowledge, being the president of his own software company. He released a Lotus 1-2-3 clone himself, and in 1987, Lotus sued him. Borland also incorporated Lotus 1-2-3’s menu structure into its own spreadsheet product, Quattro Pro. Lotus won its case against Osborne’s Paperback Software, with a court finding Paperback in violation of Lotus’ copyright, and Osborne disappeared into obscurity in disgust. Borland was more successful, winning its case against Lotus on appeal. But it took six years to do it, during which both companies’ products were eclipsed in the marketplace by Microsoft Excel.

So while XPDE may technically be legal, if I were involved in the project, I would be afraid of being litigated into oblivion.

But in the meantime, if you want or need a Windows-like interface for your Linux box, you can download XPDE.

What needs to happen for Linux to make it on the desktop

I saw an editorial at Freshmeat that argued that there’s actually too much software for Linux. And you know what? It has a point.
I’m sure some people will be taken aback by that. The number of titles that run under Windows must run into six digits, and it’s hard to walk into a computer store and buy Linux software.

But I agree with his argument, or at least most of it. Back in my Amiga days, the first thing people used to ask me was, “What, do you not like software?” Then I asked why they felt the need to have their choice of 10 different word processors, especially when they’d just pirate Microsoft Word or WordPerfect anyway. (Let’s face it: One reason large numbers of people chose PCs in the early 90s over superior architectures was that they could pirate software from work. Not everyone. Maybe not even the majority. But a lot.) I argued that one competent software title in each category I needed was all I wanted or needed. And for the most part, the Amiga had that, and the software was usually cheaper than the Mac or PC equivalent.

Linux is the new Amiga. Mozilla is a far better Web browser than IE, and OpenOffice provides most of the functionality of Microsoft Office XP–it provides more functionality than most people use, and while it doesn’t always load the most complex MS Office documents correctly, it does a much better job of opening slightly corrupt documents and most people don’t create very complex documents anyway. But let’s face it: Its biggest problem is it takes an eternity to load no matter how fast your computer is. If it would load faster, people would be very happy with it.

But there is nothing that provides an equivalent to a simple database like Access or Filemaker. I know, they’re toys, and MySQL is far more powerful. But end users like dumb, brain-dead databases with clicky GUI interfaces on them that they can migrate to once they realize a spreadsheet isn’t intended to do what they’re trying to do with it. Everyone’s first spreadsheet is Excel. Then someday they realize Excel wasn’t intended to do what they’re using it for. But you don’t instantly dive into Oracle. You need something in between, and Linux doesn’t really have anything for that niche.

People are constantly asking me about a WYSIWYG HTML editor for Linux as well. I stumbled across one. Its name is GINF. Yes, another stupid recursive-acronym name. GINF stands for “GINF is not Frontpage.” How helpful. What’s wrong with a descriptive name like Webpage-edit?

More importantly, what was the first non-game application that caught your fancy? For most people I know, it was Print Shop, or one of the many knockoffs of Print Shop. People love to give and receive greeting cards, and when they can pick their own fonts and graphics and write their own messages, they love it even more. Not having to drive to the store and fork over $3.95 is just a bonus. Most IT professionals have no use for Print Shop, but Linux’s lack of alternatives in that department is hurting it.

Take a computer with a CPU on the brink of obsolescence, a so-so video chipset, 128 megs of RAM and the smallest hard drive on the market; preload Linux on it along with a fast word processor that works (AbiWord, or OpenOffice Writer, except it’s not fast), a nice e-mail client/PIM (Evolution), a nice Web browser (Mozilla), a Print Shop equivalent (bzzzt!), and a couple of card games (check Freshmeat), and you’d have a computer for the masses.

The masses do not need 385 text editors. Sysadmin types will war over vi and emacs until the end of time; one or two simple text-mode editors as alternatives will suffice, and one or two equivalents of Notepad for X will suffice.

Linux’s day will eventually arrive regardless, if only because Microsoft is learning what every monopolist eventually learns: Predatory pricing stops working once you corner the market. Then you have to raise prices or find new markets. Eventually you run out of worthwhile markets. So in order to sustain growth, you have to raise prices. Microsoft is running out of markets, so it’s going to have to raise prices. Then it will be vulnerable again, just like Apple and CP/M were vulnerable to Microsoft because their offerings cost more than Microsoft was willing to charge. And, as Microsoft showed Netscape, you can’t undercut free.

But that day will arrive sooner if it doesn’t take a week to figure out the name of the Linux equivalent of Notepad because there are 385 icons that vaguely resemble a notepad and most of them have meaningless names.

Encryption on the cheap

Disspam cruises along. It’s not often that I gush about a program, let alone a 4.5K Perl script, but Disspam continues to make my life easier. Granted, it simply takes advantage of existing network resources, but they’re resources that were previously (to my knowledge) limited to the mail administrator. Literally half my e-mail at home today was spam. Disspam caught every last piece.
A little scripting of my own. I’ve got a client at work who wants absolute privacy guaranteed. He and his assistant have some files they don’t want anyone else to be able to read, period. Well, there’s no way to guarantee that under NT, Unix, or VMS. Under NT, we can take away anyone else’s rights to read the file, but an administrator can give himself rights to read the file once again. We can make it set off all kinds of sirens if he does it, but that security isn’t good enough.

Well, the only way we can guarantee what they want is with encryption. But we’re nervous about making files that one and only one person can read, because last year, one of our executives went on vacation in Florida, fell ill, and died. We don’t want to be in a situation where critical information that a successor would need can’t be unlocked under any circumstance. So we need to encrypt in such a fashion that two people can unlock it, but only two. So the client’s backup is his assistant, and the assistant’s backup is the client. That way, if something ever happens to one of them, the other can unlock the files.

Password-protected Zip files are inadequate, because any computer manufactured within the past couple of years is more than fast enough to break the password through brute force in minutes, if not seconds. The same goes for password-protected Word and Excel documents. Windows 2000’s encryption makes it painfully easy to lock yourself out of your own files.

So I spent some time this afternoon trying to perfect a batch file that’ll take a directory, Zip it up with Info-Zip, then encrypt it with GnuPG. I chose those two programs because they’re platform-independent and open source, so there’s likely to always be some kind of support available for them, and this way we’re not subject to the whims of companies like NAI and PKWare. We’d be willing to pay for this capability, but this combination plus a little skullwork on my part is a better solution. For one, the results are compressed and encrypted, which commercial solutions usually aren’t. Since they may sometimes transfer the encrypted package over a dialup connection, the compression is important.

Plus, it’s really nice to not have to bother with procurement and license tracking. If 40 people decide they want this, we can just give it to them.

The biggest problem I ran into was that not all of the tools I had to use interpreted long filenames properly. Life would have been much easier if Windows 2000 had move and deltree commands as well. Essentially, here’s the algorithm I came up with:

Encrypt:
Zip up Private Documents subdirectory on user’s desktop
Encrypt resulting Zip file, dump file into My Documents
Back up My Documents to a network share

Decrypt and Restore:
Decrypt Zip file
Unzip file to C:\Temp (I couldn’t get Unzip to go to %temp% properly)
Move files into Restored subdirectory on user’s desktop
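To make the encrypt half of that algorithm concrete, here is a Python sketch of the same steps. This is not the author’s batch file: the paths and the recipient address are placeholders, and the GnuPG command is built but shown rather than executed, since the recipient’s key is hypothetical.

```python
# A Python sketch of the encrypt-and-stage steps listed above, mirroring
# the batch-file algorithm (Info-Zip + GnuPG). Paths and the recipient
# address are placeholders for illustration only.
import tempfile, zipfile
from pathlib import Path

def encrypt_and_stage(private_dir, out_dir, recipient="user@example.com"):
    """Zip private_dir into out_dir and build the GnuPG command that
    would encrypt the archive to the recipient's public key."""
    archive = Path(out_dir) / "private.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(Path(private_dir).rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(private_dir))
    # The encryption step as the batch file would run it; shown rather
    # than executed here because the recipient key is hypothetical.
    gpg_cmd = ["gpg", "--yes", "--recipient", recipient,
               "--encrypt", str(archive)]
    return archive, gpg_cmd

# Demo with throwaway directories standing in for the user's desktop
# "Private Documents" folder and My Documents.
src, dst = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
(src / "memo.txt").write_text("confidential")
archive, gpg_cmd = encrypt_and_stage(src, dst)
```

Decryption mirrors this in reverse: run gpg to recover the Zip file, then unzip it into a Restored directory. Using Info-Zip-compatible archives and GnuPG keeps the result readable on any platform, as the original design intended.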

I’m not presenting the batch files here yet, because I’m not completely certain they work the right way every time.

They don’t quite have absolute security with this setup, but that’s where NTFS encryption comes in. If these guys are going to run this script every night to back the documents up, it’s no problem if they accidentally lock themselves out of those files. If their laptops get stolen, all local copies of the documents are encrypted so the thief won’t be able to read them. And the other user will be able to decrypt the copy stored on the server or on a backup tape. Or, I can be really slick and copy their GPG keys up onto the same network drive.

This job would be much easier with Linux and shell scripts–the language is far less clunky, and file naming is far less kludgy–but I have to make do. I guess in a pinch I could install the NT version of bash and the GNU utilities to give myself a Unixish environment to run the job, but that’s a lot more junk to install for a single purpose. That goes against my anti-bloat philosophy. I don’t believe in planning obsolescence. Besides, doing that would severely limit who could support this, and I don’t have to try to plant job security. I always get suspicious when people do things like that.

Another entry from the Clueless Dept.

Someone else who needs to buy a clue. I normally don’t have a problem with John Dvorak, and frequently I actually like his stuff. He’s not as clueless as some people make him out to be. Dvorak’s not as smart as he thinks he is, but one thing I’ve noticed about his critics is that they usually aren’t as smart as they think they are either.
Dvorak’s most recent Modest Proposal is that we fire all the technology ignorami out there and then, essentially, throw away corporate standards, let end-users run anything they bloody well want, and basically make them administrators of their own machines.

I’ve got a real problem with that. Case in point: One of my employer’s executives recently brought in his home PC and insisted we get it running with remote access. Only one problem with that: He has Windows XP Home. XP Home’s networking is deliberately crippled, so businesses don’t try to save money by buying it. A sleazy move, but a reality we have to live with. We got it to work somewhat, but not to his satisfaction. He’s mad, but mostly because he has no idea what changes went on under the hood in XP and doesn’t know he’s asking the impossible. But he’s perfectly competent using Word, Excel, PowerPoint and Outlook. He’s also very comfortable ripping his CDs to MP3 format–he’s got one of the largest MP3 collections in the company. He’s competent technologically. But he has no business with admin rights on his computer.

The same goes for a lot of our users. The record I’ve found for the most spyware-related files installed on a work PC is 87. These aren’t the technical ignorami who are installing this garbage. It’s the people who know how to use their stuff, but they love shareware and freeware. Maybe some of it helps them get their work done. But these people are the first to complain when their system crashes inexplicably. And not only am I expected to keep the corporate standard apps like M$ Office running, I’m also expected to support RealPlayer, Webshots, Go!Zilla, Gator, WinAmp, RealJukebox, AOL, and other programs that run roughshod all over the system and frequently break one another (or the apps I’m supposed to support).

If the users were completely responsible for keeping their systems running, that would be one thing. But install all that stuff on one computer and try to keep it running. You won’t have enough time to do your job.

Dvorak argues that people like me should solely be concerned with keeping the network working. That’s fine, but what about when some Luddite decides to ditch all modern apps and bring in an IBM PS/2 running DOS 5.0 and compatible versions of Lotus 1-2-3 and WordPerfect and dBASE? Unless there’s already an Ethernet card in that machine, I won’t be able to network it. And the person who decides a Macintosh SE/30 running System 6.0.8 is where it’s at will have a very difficult time getting on the network and won’t be able to exchange data with anyone else either.

Those scenarios are a bit ridiculous, but I’ve had users who would have done that if they could have. And someone wanting to run XP Home absolutely is not ridiculous, nor uncommon. If my job is to network every known operating system and make those users able to work together in this anarchy, my job has just become impossible.

As much as I would love for people to use Linux in my workplace and something other than Word and Outlook, the anarchy Dvorak is proposing is completely unworkable. It’s many orders of magnitude worse than the current situation.

This is just wrong too. Yes, New Englanders, I know about heartbreak. I’m from Kansas City. At least your Red Sox have posted more than one winning record in the past 10 years.

Anyway, not only are the Royals’ glory years over, they’ve forgotten where their glory years came from. They’ve once again denied Mark Gubicza entry into their Hall of Fame. Who? In the late 1980s, Mark Gubicza was the Royals’ second-best pitcher, behind Bret Saberhagen. Injuries did him in the same as Saberhagen (only a little sooner) but he’s still among their career leaders in wins and strikeouts.

And after Gubicza had spent 13 seasons in a Royals uniform, the team had a chance to trade him for hard-hitting DH Chili Davis. But you don’t trade a guy who’s poured his heart and soul into the team for 13 years and stayed completely and totally loyal to it no matter how much it hurt, right? Gubicza said yes. He went to the GM and told him that if he could make the Royals a better team by trading him, to trade him.

Chili Davis hit 30 home runs for the Royals in 1997. Then he bolted for the Yankees.

Meanwhile, Gubicza pitched two games for the Angels, blew out his arm for good, and they released him.

It takes a great man to tell the team he loves that the best thing he can do for them is to get traded for someone who can help the team more. That was Mark Gubicza. They don’t make ’em like him anymore.

But even more importantly, the immortal Charley Lau was once again denied entry. Who’s he? He was a journeyman catcher who spent his entire career as a backup, and his career batting average was just .255. That’s because he had about zero natural ability. He was a genius with the bat, which was how he managed to hit even .255.

More importantly, Lau was the Royals’ hitting coach in the early 1970s. He spotted some skinny guy who was playing third base because Paul Schaal couldn’t play third base on artificial turf, and because their first choice to replace him, Frank White, couldn’t play third base at all. This skinny blond fielded just fine, but he was hitting terribly. Lau asked him what he was doing over the All-Star break. The kid said he was going fishing with Buck Martinez. Lau put his foot down: he told him he was going to stay in Kansas City and learn how to hit.

“He changed my stance. I had been standing up there like Carl Yastrzemski, but the next thing I knew I looked like Joe Rudi,” the kid recalled. But he started hitting. By the end of the year, he’d pulled his average up to a very respectable .282.

Soon Lau had every player on the Royals standing at the plate like Joe Rudi, and taking the top hand off the bat after contact with the ball. And the Royals created a mini-dynasty in the American League Western Division.

What was the name of that kid, anyway?

George Brett.

If it hadn’t been for Charley Lau, George Brett would have been nothing. The Royals probably would have never won anything. And they probably wouldn’t be in Kansas City anymore either. Who puts up with 30 years of losing, besides Cubs fans?

Charley Lau belongs in their Hall of Fame. Even if nobody besides George Brett and me remembers who he was.

Ho-hum.

Another day, another Outlook worm. Tell me again why I continue to use Outlook? Not that I ever open unexpected attachments. For that matter, I rarely open expected ones–I think it’s rude. Ever heard of cut and paste? It’s bad enough that I have to keep one resource hog open to read e-mail, so why are you going to make me load another resource hog, like Word or Excel, to read a message where the formatting doesn’t matter?
The last couple of times I received Word attachments that were important, I converted them to PDFs for grins. Would you believe the PDFs were considerably smaller? I was shocked too. Chances are there was a whole lot of revision data left in those documents–and it probably included speculative stuff that underlings like me shouldn’t see. Hmm. I guess that’s another selling point for that PDF printer we whipped up as a proof of concept a couple of weeks ago, isn’t it? I’d better see if I can get that working again. I never did get it printing from the Mac, but seeing as all the decision-makers who’d be using it for security purposes use PCs, that’s no problem.

I spent the day learning a commercial firewall program. (Nope, sorry, won’t tell you which one.) My testbed for this thing will be an old Gateway 2000 box whose factory motherboard was replaced by an Asus SP97 at some point in the past. It’s got 72 megs of RAM. I put in an Intel Etherexpress Pro NIC today. I have another Etherexpress Pro card here that I’m bringing in, so I’ll have dual EEPros in the machine. The firewall has to run under Red Hat, so I started downloading Red Hat 7.2. I learned a neat trick.

First, an old trick. Never download with a web browser. Use the command-line app wget instead. It’s faster. The syntax is really simple: wget url. Example: wget http://www.linuxiso.org/download/rh7.2-i386-disc1.iso

Second trick: Download your ISOs from linuxiso.org. It uses some kind of round-robin approach to try to give you the least busy of several mirrors. It doesn’t always work so well on the first try. The mirror it sent me to first gave me throughput that topped out at 200KB/sec but frequently dropped as low as 3KB/sec; usually it stayed in the 15KB/sec range. I cancelled the transfer (Ctrl-C) and tried again. I got a mirror that didn’t fluctuate as wildly, but it rarely went above the 20KB/sec range. I cancelled the transfer again and got a mirror that rarely dropped below 50KB/sec and occasionally spiked as high as 120KB/sec. Much better.

Third trick (the one I learned today): Use wget’s -c option. That allows wget to resume transfers. Yep, you can get the most important functionality of a download manager in a 147K binary. It doesn’t spy on you either. That allowed me to switch mirrors several times without wasting the little bit I’d managed to pull off the slow sites.

Fourth trick: Verify your ISOs after you download them. LinuxISO provides MD5 sums for its wares. Just run md5sum enigma-i386-disc1.iso to get a long 32-character checksum for what you just downloaded. If it doesn’t match the checksum on the site, don’t bother burning it. It might work, but you don’t want some key archive file (like, say, the kernel) to come up corrupt. Even though CD-Rs are dirt cheap these days and high-speed burners make quick work of them, there’s still no point in unnecessarily wasting 99 cents and five minutes on the disc and half an hour on a questionable install.
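If you download ISOs often, the checksum comparison is worth wrapping in a little shell function so you’re not eyeballing 32 hex characters. This is just a sketch of my own; the verify_iso name and its usage are mine, not anything standard:

```shell
# verify_iso: compare a file's MD5 checksum against the value
# published on the download site.
# Usage: verify_iso FILE EXPECTED_MD5
verify_iso() {
    actual=$(md5sum "$1" | cut -d' ' -f1)
    if [ "$actual" = "$2" ]; then
        echo "OK: $1"
    else
        echo "MISMATCH: $1 (got $actual, expected $2)" >&2
        return 1
    fi
}
```

Then run, say, verify_iso enigma-i386-disc1.iso followed by the checksum copied from the site, and only burn the disc if it prints OK.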

As for downloading the file in separate pieces like Go!Zilla does, there’s a command-line Linux program called mget that does it, but it doesn’t follow redirection and it doesn’t do FTP except through a proxy server, so I have a hard time recommending it as a general-purpose tool. When it works, it seems to work just fine. You might try mget, but chances are decent you’ll end up falling back on wget.

04/08/2001

How far we’ve come… While I was hunting down tax paperwork yesterday (found it!), I ran across a stash of ancient computer magazines. For grins, I pulled out the May 1992 issue of Compute, which celebrated the release of Windows 3.1. I would have received this magazine nine years ago this month.

Some tidbits I liked:

“Windows 3.0… entered a hostile world. OS/2 loomed on the horizon like a dragon ready to devour us, and MS-DOS, stuck in version 4.0, had lost its momentum. It looked as if Digital Research…was the only company trying to make DOS better.” –Clifton Karnes, pg 4

That’s what happens when there’s no strong competition. I didn’t get the OS/2-as-dragon metaphor at the time, and I still don’t. What, people didn’t want a computer that worked right? I had an Amiga, which at the time already offered OS/2-class features and a good software library.

“Some people even started talking about Unix.” Ibid.

Some things never change.

“The masses are happy, and nobody talks about Unix much anymore.” Ibid.

That certainly changed.

“You can now buy a 200 MB drive for just $500.” –Mark Minasi, pg 58

That now-laughable line was from a Mark Minasi column that talked about strategies for getting drives larger than 512 MB working. Strangely, that problem still rears its ugly head more often than it should, and its descendant problem, getting a drive bigger than 8 gigs working, is even more common.

“A 286-based notebook is a very capable machine; with a decent-size hard disk and a portable mouse, you could even run Windows applications on one (except for those requiring enhanced mode performance such as Excel).” –Peter Scisco, pg 72

That’s funny. Don’t let any of the end users I support read that line. Later in the same article, Scisco discusses the problem of battery life, a struggle we still live with.

“The last dozen modems I’ve installed here at Compute have been compact models. It’s almost like the manufacturers are trying to get better mileage by leaving out parts and making the cards smaller. These modems don’t reject line noise very well.” –Richard Leinecker, pg. 106

Now there’s a problem that only got worse with time.

An ad from Computer Direct on page 53 offered a 16 MHz 386SX with a meg of RAM and dual floppy drives (no hard drive) for $399. Your $399 gets you a lot more these days, but at the time that price merited a second look. A complete system with a 14-inch VGA monitor and a 40-meg hard drive ran $939. The same vendor offered an external CD-ROM drive (everything was 1X in those days) for $399.

An ad on page 63 proclaimed the availability of the epic game Civilization, for “IBM-PC/Tandy/Compatibles.” Yes, these were the days when you could still buy a PC at Radio Shack and expect to be taken seriously.