So there is a benefit to running Windows Server 2003 and XP

One of the reasons Windows Server 2003 and XP haven’t caught on in corporate network environments is that Microsoft has yet to demonstrate any real benefit to either one of them over Windows 2000.

Believe it or not, there actually is one benefit. It may or may not be worth the cost of upgrading, but if you’re buying licenses now and installing 2000, this information might convince you it’s worth it to install the current versions instead. The benefit: NTFS compression.

Hang on there, Dave, I hear you saying. NTFS compression has been around since 1994, and hard drives are bigger and cheaper now than ever before. So why do I want to mess around with risky data compression?

Well, data compression isn’t fundamentally risky–this site uses data compression, and I’ve got the server logs that prove it works just fine–it just got a bad rap in the early 90s when Microsoft released the disastrous Doublespace with DOS 6.0. And when your I/O bus is slow and your CPU is really fast, data compression actually speeds things up, as people who installed DR DOS on their 386DX-40s with a pokey 8 MHz ISA bus found out in 1991.

So, here’s the rub with NTFS compression when it’s used on Windows Server 2003 with XP clients: the data is transferred from the server to the clients in compressed form.

If budget cuts still have you saddled with a 100-megabit or, worse yet, a 10-megabit network, that compression will speed things up mightily. It won’t help you move JPEGs around your network any faster, but Word and Excel documents will zoom around a lot quicker, because those types of documents pack down dramatically.
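The difference is easy to demonstrate. The sketch below uses Python’s zlib as a stand-in for NTFS’s LZ-based compression (the data is made up for illustration): repetitive, document-like data shrinks dramatically, while JPEG-like data, which is already compressed and looks statistically random, barely shrinks at all.

```python
# Why Office documents compress well and JPEGs don't: a rough demo.
# zlib stands in for NTFS's LZ-based scheme; the data is synthetic.
import os
import zlib

# Stand-in for a word-processing document: lots of repeated structure.
document = b"Quarterly report: revenue, expenses, projections. " * 200

# Stand-in for a JPEG: already-compressed image data looks like random bytes.
image_like = os.urandom(10_000)

doc_ratio = len(zlib.compress(document)) / len(document)
img_ratio = len(zlib.compress(image_like)) / len(image_like)

print(f"document-like data compresses to {doc_ratio:.0%} of original size")
print(f"image-like data compresses to {img_ratio:.0%} of original size")
```

The document-like data ends up a tiny fraction of its original size; the random data stays essentially the same size, which is why compressing a share full of JPEGs buys you nothing.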

The faster the computers are on both ends, the better this works. And if the server has one or more multi-GHz CPUs, you won’t slow down disk writes much either. You can use this strategically: don’t compress the shares belonging to your graphic artists and web developers, for instance. Their files tend not to compress well, and if any of them are using Macintoshes, the server will have to decompress the data to send it to the Macs anyway.

But for shares that are primarily made up of files created by MS Office, compress away and enjoy your newfound network speed.

The big question: PC or Mac?

I haven’t stirred the pot in a while, so to prove that I am a professional writer after all, I’ll go tackle the most inflammatory question I can imagine, something that makes Bush vs. Kerry look like a game of patty-cake.

What’s the better computer, a PC or a Macintosh?

OS X closely follows the history of the first Macintosh in that the first version showed lots of promise, but had lots of problems, probably shipped too soon, and lacked some important capabilities. But Apple, to its credit, washed its dirty laundry in public, fixing the problems and adding capabilities. And now, OS X has a reputation as something that “just works.” And it has something to back it up with.

Windows XP, well, that joke about 32-bit extensions to a 16-bit graphical interface on top of an 8-bit operating system originally written for 4-bit computers by a 2-bit corporation that can’t stand 1 bit of competition is almost true. Microsoft bought the 8-bit OS from a company that may have stolen it. And while Gary Kildall‘s first operating system was 4-bit, he may have written CP/M from scratch. But I digress.

Unlike Apple, Windows XP tries really hard for backward compatibility. And for all the stink about the things SP2 breaks, I’ll bet you a dollar you can go download the 1981 edition of VisiCalc for MS-DOS and it’ll run just as well on your three-point-whatever gig Pentium 4 running XP as it did on the first IBM PC. And if you can find old copies of WordStar and dBASE II and Turbo Pascal, chances are they’ll run too. Old programs that break are at least as likely to break because of timing problems with CPUs that are almost a thousand times faster than they expect as they are because of Windows. Probably more.

Sure, you’ll find programs that break, but you’ll probably find a thousand that work for every one that breaks. Especially if you limit yourself to titles that aren’t games.

This is a blessing and a curse. The blessing is that software you bought almost a quarter century ago still runs if you need it. If you think that isn’t important, I’ll introduce you to one of my clients who’s still using dBASE II. It sure is important to him. The curse is all that spaghetti code you need to keep those billions and billions of old programs running.

I have a little bit more sympathy for Microsoft when I remember that Windows XP is really OS/2 1.3 with DOS bolted on, and Windows 3.1 and 98 bolted on next door.

Just a little.

When you look at it that way, is it any wonder that sometimes when you plug in your digital camera it acts goofy?

But truth be told, more often than not, your mouse and your digital camera and all your other stuff works, whether you plug it into a Windows box or a Mac. And when it doesn’t work, it’s every bit as infuriating on a Mac as it is on a Windows PC. When Windows has an error code, it spits one out in hexadecimal. The Mac spits out an error code in decimal. I guess that makes the Mac friendlier.

But I guess it doesn’t matter whether I say “deleterious” in English or in Pig Latin. It’s still not going to be a word you’re likely to have heard today, either way. And there’s a decent chance it’ll send you reaching for a dictionary (or Google).

I’ll be frank: I hated OS 9 and OS 8 and everything else that came before it. I tried to get the Mac Toss turned into an official Olympic sport. If there are any old Macintoshes in the pond in front of the office building where I used to fix Macintoshes, I know nothing about them.

But Apple knew it was b0rken and threw it away and bought something better. I still think they bought the wrong something better and would have gotten here a lot sooner if they’d bought BeOS, but they bought NeXT and got Steve Jobs back, so here they are.

All things being equal, I’d go with a Mac, if only because it’s got a Unix layer underneath it.

But all things aren’t equal. Macintoshes cost a lot of money. And when you’re 2 percent of the market, you don’t have a lot of software to choose from. I know. I had long love affairs with Amiga and with OS/2 before I threw in the towel and installed Windows. And it wasn’t until 1997 that I actually used Windows as my everyday OS.

When someone hands me a disk, I can read it. When someone tells me I’ve gotta try out this new program, it runs.

On the other hand, there’s virtually no problem with viruses and spyware on the Macintosh. If I want to spy on people or cause enough damage to make the front page of USA Today, I’m going to set my sights on 90+% of the market instead of the Macintosh’s 2%. Being a minority can have its advantages.

But, after living for years with good computers and operating systems that were years or even decades ahead of their time but had no software availability, I run Windows most of the time and exercise caution to keep my system clean. I don’t use Internet Explorer, I keep my virus definitions up to date, I don’t read e-mail from strangers and don’t open unexpected attachments, and I don’t install freeware unless it’s open source.

And guess what? I don’t have any problems with my computer either.

I know and respect other people who’ve gone the other way. For me, there never was much choice other than PC hardware. I can afford a Macintosh, but that’s money I really need to be putting towards paying off my car and my house sooner, or saving for retirement. Or any number of other things. I’m a legendary tightwad.

Other people may have had their own other reasons for making the same decision.

Open sourcing code doesn’t necessarily mean people will rush to it


John C. Dvorak wrote a nice layman’s introduction to open source on PCMag.com. But he makes at least one big false assumption.

Dvorak says he’d love to see old code open sourced. Some examples he sought, such as CP/M, CP/M-86, and GEM, have already been open source for years. Caldera, after buying the intellectual property of the former Digital Research from Novell, released just about everything that wasn’t directly related to DR-DOS, some of it as GPL, and some under other licenses. The results have hardly been earth shattering.

Read more

Go get ’em, SCO!

I’m sure you’ve read it 4.3 billion other places already, but Microsoft has been granted a patent on double-clicking.

Well, there’s something you probably have only read a few hundred other places. Apple obviously had people double-clicking more than a year before Microsoft did, seeing as Windows 1.0 was released in November 1985 and the first Macintosh shipped in early 1984. Commodore had Amigans double-clicking by the summer of 1985. So did Atari.

Guess who supplied Atari with its operating system, since Jack Tramiel failed to swindle his way into ownership of the Amiga?

Digital Research, that’s who. DR provided Atari with a version of CP/M-68K, with its GEM GUI running on top of it. Atari marketed the bundle as TOS, for Tramiel OS.

Digital Research got crushed by the Microsoft juggernaut a few years later and eventually sold out to Novell. Novell then attempted to compete head-on with Microsoft (buying up its Utah neighbor, WordPerfect, and part of Borland in the process) and failed spectacularly. Smelling a rat–Novell believed Microsoft sabotaged some of its applications so they would not run under DR-DOS–it then pawned the Digital Research portfolio off on Caldera, a Linux company run by former Novell executives. The catch? Caldera had to turn around and sue Microsoft. Which they did, successfully.

A few more years later, The Santa Cruz Operation, a small Unix firm, wanted out. It sold its Unix-on-Intel business, as well as the rights to the old AT&T Unix (purchased from Novell, ironically) to Caldera, who soon changed its name to The SCO Group to reflect this business.

Yes, this is the same SCO who is now on a legal rampage, suing anything that moves.

Now, whether Novell or SCO is the more rightful owner of the double-click “innovation” is arguable. But such details never seem to matter to SCO. It’s a frivolous lawsuit, but Darl McBride and Co. have made frivolous and baseless lawsuits into an art form.

Go get ’em, Darl.

The problem with online streaming video

I think we may have lost a project at work today: a project to do streaming video. It’s not really our fault; our offering looked just like everyone else’s streaming video.

The problem is that our competition isn’t everyone else’s streaming video.

First let’s look at the hurdles. No matter which option you pick, some percentage of your audience is going to have to download or install something. That all but eliminates Real, since I don’t think even Woodward and Bernstein could successfully track down the link to their free player every time.

Windows Media Player is easier, but the current version won’t run on older versions of Windows. An overwhelming number of people have Windows XP now, but not everyone does. How many hundreds of millions of copies of Windows 98 did Microsoft sell? Do you think all of those people have thrown them away yet? No. Those people will have to download and install something.

But Media Player will leave some Macintoshes in the cold. Do you want to do that if your target audience might include schools?

QuickTime is the best cross-platform solution, but again, Windows users will have to download and install something.

OK, so you got it installed. Prepare thyself for thrilling, 15 frame-per-second 160×120 video!

Translation: Video the size of a postage stamp that moves about as fast as your mailman.

Theoretically you can stream bigger and faster video, but it’s going to be jerkier if you do. There’ll be dropped frames, artifacts, and the audio may drop out. And what’s it look like when you send DVD-sized 720×480 video? Well, considering a lot of people run their monitors at 1024×768, it makes letterboxing look good. It’s not full-screen like it is when you pop a DVD into your DVD drive.
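Some rough arithmetic shows why full-size streaming was out of reach. The bitrates below are ballpark figures, not measurements, but they make the gap obvious: even DVD-class compression needs more bandwidth than a typical consumer connection of the era had.

```python
# Back-of-the-envelope bandwidth math for 720x480 video.
# All figures are approximate, for illustration only.

# Uncompressed: 720 x 480 pixels, 24 bits per pixel, ~30 frames/sec
uncompressed_bps = 720 * 480 * 24 * 30
print(f"uncompressed: {uncompressed_bps / 1e6:.0f} Mbps")  # roughly 249 Mbps

# MPEG-2 as used on DVDs averages roughly 5 Mbps
dvd_bps = 5_000_000

# Typical connections of the period
dsl_bps = 1_500_000      # 1.5 Mbps consumer DSL
lan_bps = 100_000_000    # 100 Mbps office LAN

print(f"DVD-quality stream fits in consumer DSL: {dvd_bps <= dsl_bps}")
print(f"DVD-quality stream fits in a 100 Mbps LAN: {dvd_bps <= lan_bps}")
```

A DVD-quality stream would saturate a consumer DSL line more than three times over, which is why streaming at the time meant postage-stamp resolutions and aggressive compression artifacts.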

And that’s precisely the problem. The competition isn’t other people who stream video. The competition is DVDs. Computers are digital, right? So why does its video look worse than the oldest, most worn-out VHS tape at the video rental place? And why do I have to jump through so many hoops in order to play it? On a DVD, I hit the "menu" button and then I hit "enter" or "play." (Also keep in mind that some people can’t even figure out how to do that. I’m serious. I dated a girl once whose parents couldn’t figure out a DVD player, so they had to get their 15-year-old son to come hit the buttons for them.)

And that, I think, is the reason you still don’t see tons and tons of streaming video on the Web, in spite of the high availability of DSL and cable modems in the United States, the abundance of cheap bandwidth, and the cheapness of the server software (free, in the case of QuickTime, and included with Windows Server in the case of Media Player).

MyDoom/Novarg Gloom

Just in case anybody is curious, my employer’s virus scanners filtered roughly 3,000 copies of Novarg (a.k.a. MyDoom) during working hours yesterday. If that’s not a record for us, it approaches it. I know we weren’t the only one.

I’ve heard Novarg/MyDoom called the fastest-spreading virus yet. I don’t have statistics on prior viruses with me, but suffice it to say, its impact certainly felt similar to the big names from the past.

Although SCO would like people to believe it was written by a Linux zealot, I’m more inclined to believe it was created by organized crime. Maybe the creators hate SCO, or maybe the anti-SCO DDoS was just an added touch to throw investigators off.

LoveLetter was the first virus outbreak to really have much impact on my professional career, and I noticed something about it. Prior to LoveLetter, I never, ever got spam at work. Not once. After LoveLetter, I started getting lots of it. I don’t believe LoveLetter’s intent was to gather e-mail addresses for spammers, but I do believe that more than one spammer, probably independently, noticed that viruses were a very efficient way to gather a large number of e-mail addresses.

I got spam before LoveLetter, and I saw viruses before LoveLetter. But I started seeing a lot more of both very soon after LoveLetter.

I don’t buy any giant conspiracy to sell anti-virus software, nor do I buy any giant conspiracy against SCO. I do believe in bored people with nothing better to do than to write viruses, and I also believe in people who can profit off their side effects.

I’ve said it once and I’ll say it again. If you run Windows, you must run anti-virus software. You can download Grisoft AVG anti-virus software for free. Don’t open unexpected e-mail attachments, even from people you know. Even if it looks safe. Don’t send unexpected e-mail attachments either–you don’t want anyone to get the idea that’s normal. Quite frankly, in this day and age, there’s no reason to open any piece of e-mail that looks suspicious for any reason. I told someone yesterday that this is war. And I think that’s pretty accurate.

If you’re an intrepid pioneer, there’s something else you can do too, in order to be part of the solution. If you join the Linux revolution, you can pretty much consider that computer immune. Macintoshes are slightly less immune, but certainly much less vulnerable than Windows. Amiga… Well, I haven’t seen the words “Amiga” and “virus” in the same sentence since 1991 or 1992. But one thing is certain: a less homogeneous field is less susceptible to things like this.


The first PC I ever built

I’ve noticed a disturbing trend lately: Everyone who built his own PC knows everything. Just ask him.
Now, don’t get me wrong: It’s admirable to build your own PC rather than just buying Dell’s special of the week (although some people would be better off doing exactly that), and it does at least require some skill with a screwdriver. But it’s not what it used to be. Today, building a PC means you know something. It no longer makes you an expert.

Read more

If I had my own Linux distribution

I found an interesting editorial called If I had my own Linux Distro. He’s got some good ideas but I wish he’d known what he was talking about on some others.
He says it should be based on FreeBSD because it boots faster than Linux. I thought everyone knew that Unix boot time has very little to do with the kernel. A kernel will boot more slowly if it’s trying to detect too much hardware, but the big factor in boot time is init, not the kernel. BSD’s init is much faster than SysV-style init. Linux distros that use BSD-style inits (Slackware, optionally Debian, and, as I understand it, Gentoo) boot much faster than systems that use a traditional System V-style init. I recently converted a Debian box to use runit, and the decrease in boot time and the increase in available memory at boot were noticeable. Unfortunately the system no longer shuts down properly. But it proves the concept.
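Part of runit’s speed comes from its simplicity: a service is just a directory containing an executable `run` script, which runit launches and supervises directly. A minimal sketch (the path and daemon are assumed for illustration; your setup may use a different service directory):

```shell
#!/bin/sh
# /etc/service/sshd/run -- a hypothetical runit service script.
# runit executes and supervises this script directly; exec replaces
# the shell so runit supervises the daemon itself, and -D keeps sshd
# in the foreground, which runit requires.
exec /usr/sbin/sshd -D 2>&1
```

There are no runlevels, dependency ordering scripts, or lock files to process at boot, which is where much of the SysV init time goes.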

He talks about installing every possible library to eliminate dependency problems. Better idea: scrap RPM and use apt (like Debian and its derivatives) or a ports-style system like Gentoo’s. The only time I’ve seen dependency issues crop up in Debian was on a system that had an out-of-date glibc installed, in which case you solve the issue by either keeping the distribution up to date or updating glibc before installing the package that fails. These problems are exceedingly rare, by the way. In systems like Gentoo, they don’t happen because the installation script downloads and compiles everything necessary.

Debian’s and Gentoo’s solution is far more elegant than his proposal: Installing everything possible isn’t going to solve your issue when glibc is the problem. Blindly replacing glibc was a problem in the past. The problems that caused that are hopefully solved now, but they’re beyond the control of any single distribution, and given the choice between having a new install stomp on glibc and break something old or an error message, I’ll take the error message. Especially since I can clear the issue with an apt-get install glibc. (Then when an old application breaks, it’s my fault, not the operating system’s.)

In all fairness, dependency issues crop up in Windows all the time: When people talk about DLL Hell, they’re talking about dependency problems. It’s a different name for the same problem. On Macintoshes, the equivalent problem was extensions conflicts. For some reason, people don’t hold Linux to the same standard they hold Windows and Macs to. People complain, but when was the last time you heard someone say Windows or Mac OS wasn’t ready for the desktop, or the server room, or the enterprise, or your widowed great aunt?

He also talks about not worrying about bloat. I take issue with that. When it’s possible to make a graphical Linux distribution that fits on a handful of floppies, there’s no reason not to make a system smooth and fast. That means you do a lot of things. Compile for an advanced architecture and use the -O3 option. Use an advanced compiler like GCC 3.2 or Intel’s ICC 7.0 while you’re at it. Prelink the binaries. Use a fast-booting init and a high-performance system logger. Mount filesystems with the highest-performing options by default. Partition off /var and /tmp so those directories don’t fragment the rest of your filesystem. Linux can outperform other operating systems on like hardware, so it should.
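On Gentoo, for example, the compile-for-your-architecture part is a one-time setting: global compiler flags live in /etc/make.conf and apply to every package the ports system builds. A sketch of the kind of tuning described above (the -march value is an example; pick the one matching your CPU):

```shell
# /etc/make.conf (Gentoo) -- illustrative flags only.
# -O3 enables aggressive optimization; -march tunes code generation
# for a specific CPU; -pipe just speeds up compilation itself.
CFLAGS="-O3 -march=pentium4 -pipe"
CXXFLAGS="${CFLAGS}"
```

The tradeoff is that binaries built this way won’t run on older CPUs, which is exactly the tension with marginal hardware described below.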

But when you do those things, then it necessarily follows that people are going to want to run your distribution on marginal hardware, and you can’t count on marginal hardware having a 20-gig hard drive. It’s possible to give people the basic utilities, XFree86, a reasonably slick window manager or environment, and the apps everyone wants (word processing, e-mail, personal finance, a web browser, instant messaging, a media player, a graphics viewer, a few card games, and–I’ll say it–file sharing) in a few hundred megabytes. So why not give it to them?

I guess all of this brings up the nicest thing about Linux. All the source code to anything desirable and all the tools are out there, so a person with vision can take them and build the ultimate distribution with it.

Yes, the idea is tempting.

Linux gets more attractive on the Xbox

There’s been another milestone in getting Linux running on Microsoft’s Xbox game console. It’s now possible to get it going if you bridge a couple of solder points on the motherboard to enable flashing the unit’s BIOS, then exploit a buffer overflow using the James Bond 007 game and a specially crafted save game. With a few more tricks, you can unlock the hard drive, put it in a Linux PC, install Linux, then move the drive back to the Xbox and turn it into a cheap Linux box.

Read more

A tribute to Adam Osborne


One of my Wikipedia entries has been doing some time on the front page. Computer pioneer Adam Osborne (the “Osborne” in Osborne/McGraw-Hill and in the Osborne 1 portable computer) died last week after an 11-year illness. Read more