More on building under a small Linux environment

Well, I’ve been playing a little bit with Erik Andersen’s uClibc-based development environment mentioned in the previous two posts.
When I compile, I issue the command export CFLAGS='-Os -s -mcpu=i386 -march=i386' to create small-as-possible binaries. Using the default flags, the Links web browser balloons to nearly 2.6 megs on my dual Celeron, mostly due to the debug symbols. It drops to around 760K with those options. Specifying i386 binaries shrinks them down at the expense of some speed on some CPUs (especially 486s and first-generation Pentiums), so you have to set your priorities. It doesn’t matter nearly as much on newer CPUs. But I’m pretty sure if you’re interested in uClibc you’re not just running it on Pentium 4s.
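For the record, here’s the whole recipe as a sketch; the configure/make step is illustrative and commented out, since every package’s build differs:

```shell
# Size-focused build flags, as described above:
#   -Os            optimize for size rather than speed
#   -s             strip symbols at link time
#   -mcpu/-march   target the lowest-common-denominator i386
export CFLAGS='-Os -s -mcpu=i386 -march=i386'
export CXXFLAGS="$CFLAGS"

# A typical autoconf-style build then picks the flags up automatically:
# ./configure && make
```

If a package’s build system ignores CFLAGS, you can usually still shrink the finished binary afterward by running strip on it.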

For the record, Links compiles without warnings without doing anything special to its configuration and seems to run without incident (I immediately used it to locate and download more source code to compile). Samba’s more difficult, giving some warnings in various places. It may or may not require some special configuration in order to actually run (I didn’t have time tonight to test it), and of course that could result in some reduced functionality. The binaries total 9.3 meg, which isn’t bad considering it implements a complete Windows NT-compatible file server as well as some simple client utilities for connecting to NT shares on a network. The files themselves are about 20% smaller than on a stock Debian system.

Erik Andersen says the majority of Unix software will compile under uClibc, which is probably true. Even on a completely mainstream system, I see the occasional compiler warning.

Why my ramdisk techniques don’t work with XP

I got a question today in a roundabout way asking about ramdisks in Windows, specifically, where to find my instructions for loading Win98 into a ramdisk, and how to do the same in XP.
I haven’t thought about any of this kind of stuff for more than two years. It seems like two lifetimes.

The original instructions appeared in my book, Optimizing Windows (now in the half-price bin at Amazon.com), and instructions to use DriveSpace to compress the disk appear here. You can get the freeware xmsdisk utility this trick requires from Simtel.

These techniques absolutely do not work with Windows NT4, 2000, or XP. Despite the similar name, Windows NT/2000/XP are very different operating systems from Windows 9x. Believe it or not, they’re much more closely related to IBM’s OS/2 than they are to Windows 98. Since there is no DOS lying underneath it all, there’s no easy way to do the trickery that the bootable ramdisk tricks use. What these two tricks do is literally intercept the boot process, copy Windows into the ramdisk, then continue booting.

There’s a $99 piece of software called SuperSpeed that gives the NT-based operating systems this capability. I haven’t used it. I imagine it works using the same principle, hooking into the boot process and moving stuff around before booting continues.

The downside, no matter what OS you use, is the boot time. XP boots in seconds, and my book talks about the trickery necessary to get 95 and 98 to boot in 30 seconds or less. But any time you’re moving a few hundred megs or–yikes–a gig or two of data off a disk into a ramdisk, the boot process is going to end up taking minutes instead.

Is it worth it? For some people, yes. It’s nice to have applications load instantly. A lot of things aren’t CPU intensive. You spend more time waiting for your productivity apps to load than you do waiting for them to do anything. Web browsing and e-mail are generally more bandwidth- and disk-intensive than they are CPU-intensive (although CSS seems determined to change that).

But a lot of games aren’t especially disk-intensive, with the possible exception of when they’re loading a new level. So loading the flavor-of-the-week FPS game into a ramdisk isn’t going to speed it up very much.

Of course, XP is far, far more stable than 98. Windows 9x’s lack of stability absolutely drives me up the wall, and for that matter, I don’t think 2000 or XP are as stable as they should be. Given the choice between XP or 98 in a ramdisk, I’d go for XP, with or without speedup utilities.

I’ve made my choice. As I write, I’m sitting in front of a laptop running 2000 (it’s VPNed into work so I can keep an eye on tape backup jobs) and a desktop PC running Linux. I have a 400 MHz Celeron with Windows 98 on it, but it’s the last Win9x box I have (I think I had 4 at one point when I was writing the aforementioned book). Sometimes I use it to play Baseball Mogul and Railroad Tycoon. Right now it doesn’t even have a keyboard or monitor connected to it.

I guess in a way it feels like hypocrisy, but I wrote the first couple of chapters of that book with a word processor running in Red Hat Linux 5.2 (much to my editor’s chagrin), so I started down that path a long, long time ago.

What large market for x86 Unix?

In a bizarre turn of events, SCO has sued IBM for no less than $1 billion, claiming IBM willfully destroyed SCO’s business by handing its intellectual property over to the Linux movement.

Why I dislike Microsoft

“Windows 2000,” I muttered as one of my computers fired up so my girlfriend could use it. “Must mean something about the number of bugs that’ll be discovered tomorrow.”
She told me she liked Windows and asked me why I hated Microsoft so much.

It’s been a while since I thought about that. She speculated that I was annoyed that Bill Gates is smarter than me. (Which he probably is, but aside from a couple more books in print, it hasn’t gotten him anything I don’t have that I want.) There’s more to it than that.

I’m still annoyed about the foundation Microsoft built its evil empire upon. In the ’70s, Microsoft was a languages company, and they specialized in the language Basic. Microsoft Basic wasn’t the best Basic on the market, but it was the standard. And when IBM decided it wanted to enter the personal computer market, IBM wanted Microsoft Basic because nobody would take them seriously if they didn’t. So they started talking to Microsoft.

IBM also wanted the CP/M operating system. CP/M wasn’t the best operating system either, but it was the standard. IBM was getting ready to negotiate with Gary Kildall, owner of Digital Research and primary author of the OS, and ran into snags. Gates’ account was that Kildall went flying and kept the IBM suits waiting and then refused to work with them. More likely, the free-spirited and rebellious Kildall didn’t want to sign all the NDAs IBM wanted him to sign.

Microsoft was, at the time, a CP/M subcontractor. Microsoft sold a plug-in board for Apple II computers that made them CP/M-compatible. So IBM approached Microsoft about re-selling CP/M. Microsoft couldn’t do it. And that bothered Gates.

But another Microsoft employee had a friend named Tim Patterson. Tim Patterson was an employee of Seattle Computer Products, a company that sold an 8086-based personal computer similar to the computer IBM was developing. CP/M was designed for computers based on the earlier 8080 and 8085 CPUs. Patterson, tired of waiting for a version of CP/M for the 8086, cloned it.

So Seattle Computer Products had something IBM wanted, and Microsoft was the only one who knew it. So Microsoft worked out a secret deal. For $50,000, they got Patterson and his operating system, which they then licensed to IBM. Patterson’s operating system became PC DOS 1.0.

Back in the mid-1990s, PC Magazine columnist John C. Dvorak wrote something curious about this operating system. He said he knew of an easter egg present in CP/M in the late 1970s that caused Kildall’s name and a copyright notice to be printed. Very early versions (presumably before the 1.0 release) of DOS had this same easter egg. This of course screams copyright violation.

Copyright violation or none, Kildall was enraged the first time he saw DOS 1.0 because it was little more than a second-rate copy of his life’s work. And while Digital Research easily could have taken on Microsoft (it was the bigger company at the time), the company didn’t stand a prayer in court against the mighty IBM. So the three companies made some secret deals. The big winner was Microsoft, who got to keep its (possibly illegal) operating system.

Digital Research eventually released CP/M-86, but since IBM sold CP/M-86 for $240 and DOS for $60, it’s easy to see which one gained market share, especially since the two systems weren’t completely compatible. Digital Research even added multiuser and multitasking abilities to it, but they were ignored. In 1988, DR-DOS was released. It was nearly 100% compatible with MS-DOS, faster, less expensive, and had more features. Microsoft strong-armed computer manufacturers into not using it and even put cryptic error messages in Windows to discourage the end users who had purchased DR-DOS as an upgrade from using it. During 1992, DR-DOS sales collapsed by nearly 90%, declining from $15.5 million in the first quarter to just $1.4 million in the fourth.

Digital Research atrophied away and was eventually bought out by Novell in 1991. Novell, although the larger company, fared no better in the DOS battle. They released Novell DOS 7, based on DR-DOS, in 1993, but it was mostly ignored. Novell pulled it from the market within months. Novell eventually sold the remnants of Digital Research to Caldera Inc., who created a spinoff company with the primary purpose of suing Microsoft for predatory behavior that locked a potential competitor out of the marketplace.

Caldera and Microsoft settled out of court in January 2000. The exact terms were never disclosed.

Interestingly, even though it was its partnership with IBM that protected Microsoft from the wrath of Gary Kildall in 1981, Microsoft didn’t hesitate to backstab IBM when it got the chance. By 1982, clones of IBM’s PC were beginning to appear on the market. Microsoft sold the companies MS-DOS, and even developed a custom version of Basic for them that worked around a ROM compatibility issue. While there was nothing illegal about turning around and selling DOS to its partner’s competitors, it’s certainly nobody’s idea of a thank-you.

Microsoft’s predatory behavior in the 1980s and early ’90s wasn’t limited to DOS. History is littered with other operating systems that tried to take on DOS and Windows and lost: GeoWorks. BeOS. OS/2. GeoWorks was an early GUI programmed in assembly language by a bunch of former videogame programmers. It was lightning fast and multitasked, even on 10 MHz XTs and 286s. It was the most successful of the bunch in getting OEM deals, but you’ve probably never heard of it. OS/2 was a superfast and stable 32-bit operating system that ran DOS and Windows software as well as its own, a lot like Windows NT. By Gates’ own admission it was better than anything Microsoft had in the 1990s. But it never really took off, partly because of IBM’s terrible marketing, but partly because Microsoft’s strong-arm tactics kept even IBM’s PC division from shipping PCs with it much of the time. BeOS was a completely new operating system, written from scratch, that was highly regarded for its speed. It never got off the ground because Microsoft completely locked it out of new computer bundles.

Microsoft used its leverage in operating systems to help it gain ground in applications as well. In the 1980s, the market-leading spreadsheet was Lotus 1-2-3. There was an alleged saying inside Microsoft’s DOS development group: DOS ain’t done ’til Lotus won’t run. Each new DOS revision, from version 3 onward, broke third-party applications. Lotus 1-2-3, although once highly regarded, is a noncontender in today’s marketplace.

Once Windows came into being, things only got worse. Microsoft’s treatment of Netscape was deplorable. For all intents and purposes, Microsoft had a monopoly on operating systems by 1996, and Netscape had a monopoly on Web browsers. Netscape was a commercial product, sold in retail stores for about $40, but most of its distribution came through ISPs, who bought it at a reduced rate and provided it to their subscribers. Students could use it for free. Since the Web was becoming a killer app, Netscape had a booming business. Microsoft saw this as a threat to its Windows franchise, since Netscape ran well not only on Windows, but also on the Mac, on OS/2, and on a number of flavors of Unix. So Microsoft started bundling Internet Explorer with Windows and offering it as a free download for those who already had Windows, or had an operating system other than Windows, such as Mac OS. In other industries, this is called tying or dumping, and it’s illegal. Netscape, once the darling of Wall Street, was bought for pennies on the dollar by AOL, and AOL Time Warner is still trying to figure out what to do with it. Once Microsoft attained a monopoly on Web browsers, innovation in that space stopped. Internet Explorer has gotten a little bit faster and more standards-compliant since IE4, but Microsoft hasn’t put any innovation in the browser for five years. Want popup blocking or tabs? You won’t find either in IE. All of the innovation in that space has come in browsers with a tiny piece of the market.

One could argue that consumers now get Web browsers for free, where they didn’t before. Except every new computer came with a Web browser, and most ISPs provided a browser when you signed up. So there were lots of ways to get a Web browser for free in the mid-’90s.

And when it came to the excesses of the dotcom era, Netscape was among the worst. But whether Netscape could have kept up its perks given its business model is irrelevant when a predator comes in and overnight renders unsalable the product that accounts for 90% of your revenue.

Allegations popped up again after Windows 95’s release that Win95 sabotaged competitors’ office software, such as WordPerfect and Lotus 1-2-3. Within a couple of years, Microsoft Office was a virtual monopoly, with Lotus SmartSuite existing almost exclusively as a budget throw-in with new PCs and WordPerfect Office being slightly more common on new PCs and an also-ran in the marketplace. It’s been five years since any compelling new feature has appeared in Microsoft Office. The most glaring example is spam filtering. Innovative e-mail clients today have some form of automatic spam filtering, either present or in development. Outlook doesn’t. “Microsoft Innovation” today means cartoon characters telling you how to indent paragraphs.

And the pricing hasn’t really come down either. When office suites first appeared in 1994, they cost around $500. A complete, non-upgrade retail copy of Microsoft Office XP still costs about $500.

Pricing hasn’t come down on Windows either. In the early 90s, the DOS/Windows bundle cost PC manufacturers about $75. Today, Windows XP Home costs PC manufacturers about $100. The justification is that Windows XP Home is more stable and has more features than Windows 3.1. Of course, the Pentium 4 is faster and less buggy than the original Pentium of 1994, but it costs a lot less. Neither chip can touch Windows’ 85% profit margin.

And when Microsoft wasn’t busy sabotaging competitors’ apps, it was raiding their personnel. Microsoft’s only really big rival in the languages business in the ’80s and early ’90s was Borland, a company founded by the flamboyant Philippe Kahn. Gates had a nasty habit of raiding Borland’s staff and picking off their stars. It didn’t go both ways. If a Microsoft employee defected, the employee could expect a lawsuit.

Well, Kahn decided to play the game once. He warmed up to a Microsoft staffer whose talents he believed weren’t being fully utilized. The employee didn’t want to jump ship because Microsoft would sue him. Kahn said fine, let Microsoft sue, and Borland would pay whatever was necessary. So he defected. As expected, Gates was enraged and Microsoft sued.

Soon afterward, Kahn and his new hire were in an airport when a Hare Krishna solicited a donation. Kahn handed him $100 on the spot and told him there was a whole lot more in it for him if he’d deliver a message to Bill Gates: “Philippe just gave us $100 for hot food because he suspects after this lawsuit, your employees are going to need it.”

He delivered the message. Gates wasn’t amused.

It was a bold, brash move. And I think it was pretty darn funny too. But smart? Not really. Borland’s glory days were pretty much over 10 years ago. For every star Borland could lure away, Microsoft could lure away three. Borland’s still in business today, which makes it a rarity among companies that have taken Microsoft head-on, but only after several reorganizations and major asset selloffs.

The only notable company that’s taken on Microsoft in the marketplace directly and won has been Intuit, maker of Quicken. Microsoft even gave away its Quicken competitor, Microsoft Money, for a time, a la Internet Explorer, in an effort to gain market share. When that failed, Microsoft moved to buy Intuit outright. The Justice Department stepped in and the deal collapsed.

The thanks Microsoft has given the world for making it the world’s largest software company has been to sell buggy software and do everything it could to force companies and individuals to buy upgrades every couple of years, even when existing software is adequate for the task. While hardware manufacturers scrape for tiny margins, Microsoft enjoys 85% profit margins on its product. But Microsoft mostly sits on its cash, or uses it to buy companies or products since it has a terrible track record of coming up with ideas on its own. The company has never paid dividends, so it’s not even all that much of a friend to its own investors.

For me, the question isn’t why I dislike Microsoft. The question for me is why Microsoft has any friends left.

News flash: Windows is cheaper than Linux!

Lots of people asked me today what I thought about the IDC study that says Windows is cheaper than Linux. I yawned.
Consider the source. Microsoft paid for the thing. You think IDC was going to come back and say Linux is cheaper all around if Microsoft was paying the bill?

Yes, sometimes it’s cheaper. If all your sysadmins know NT and don’t know Unix well, then yes, Windows is going to be cheaper.

But I can think of some times when it’s not. Like if downtime means anything to you at all. My clients scream when I have to reboot an NT server. But I can count on having to reboot a busy NT server once a year due to a lockup or general server stupidity. And virtually every security update is going to require a reboot. I can slipstream a Linux security update almost every time without a reboot–unless it’s a patch to the kernel, which is rare. With the right distribution, I can even upgrade distributions without a reboot. Try that when going from Windows NT or 2000 to something else.

I saw a story on DebianPlanet today about someone bragging he’d done a server migration in 3 hours. You’ll never do that with Windows. But you can do a migration even faster than that–copy everything over somehow to the new server, either through a tape backup or disk cloning, then adjust /etc/fstab as necessary, plop down a generic kernel straight from a distribution, configure the NIC if it’s not a close relative of the old one, and reboot. If you want to get fancy, compile a custom kernel tuned to the new server’s hardware. You can do it all in an hour. We dread the day any of our Windows servers is destroyed by some kind of accident and we can’t find an identical replacement. It’ll take us a minimum of 5 hours to install and update the OS and re-install whatever apps are on it and re-create whatever shares are on it, because that’s how long it takes us to set up a new one out of the box.
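The fstab step in that checklist is the only part that really needs hand-editing. Here’s a minimal sketch of it, run against a scratch directory with invented device names so it’s safe to try anywhere; on a real migration, $NEWROOT would be the new server’s mounted root filesystem:

```shell
# Stand-in for the new server's mounted root filesystem:
NEWROOT=$(mktemp -d)
mkdir -p "$NEWROOT/etc"

# Suppose the old server's root lived on a SCSI disk (/dev/sda1)
# and the new box boots from an IDE disk (/dev/hda1):
printf '/dev/sda1 / ext2 defaults 0 1\n' > "$NEWROOT/etc/fstab"

# Rewrite the device name to match the new hardware:
sed 's|/dev/sda1|/dev/hda1|' "$NEWROOT/etc/fstab" > "$NEWROOT/etc/fstab.new"
mv "$NEWROOT/etc/fstab.new" "$NEWROOT/etc/fstab"

cat "$NEWROOT/etc/fstab"
# /dev/hda1 / ext2 defaults 0 1
```

After that it really is just the generic kernel, the NIC configuration, and a reboot.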

And maybe you’ve got picky clients like some of mine. One of them decided out of the blue that they didn’t like how their network shares were named. Never mind that everyone just calls it “the O drive.” Yes, they’re anal-retentive morons, but the client is always right. So one of my coworkers spent a thrilling Saturday un-sharing folders and re-sharing them with new names. On a Samba server, you can just load a text file, change some names, and restart the daemon. Done. The job that took 6 hours and was full of potential for human error is reduced to a few minutes. There’s still potential for human error, but it’s much less because the job isn’t as tedious and boring. And it’s much quicker to fix.
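For illustration, here’s what that rename looks like against a scratch copy of the config; the share name and path are invented, and on a real server you’d edit /etc/samba/smb.conf and restart the daemon:

```shell
# A scratch smb.conf with one share -- the one everyone maps as "the O drive":
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
[odrive]
   path = /srv/shares/office
   read only = no
EOF

# Renaming the share is just a change to the section header:
sed 's/^\[odrive\]/[officedocs]/' "$CONF" > "$CONF.new"
mv "$CONF.new" "$CONF"

grep '^\[' "$CONF"
# [officedocs]

# On the real server, restart Samba to pick up the change:
# /etc/init.d/samba restart
```

The path, permissions, and everything else about the share stay put; only the name clients see changes.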

And don’t even get me started on tracking server licenses and CALs. Many organizations, when faced with a Microsoft audit, find it cheaper to just re-buy all of them than to spend the time tracking down the documentation that proves they’re honest. With Linux and open source, there’s no danger of having to pay for something twice, not counting the upgrades. (Those are free too, if you want them.)

Well, that was unproductive

I didn’t do much yesterday but lie around and read ancient Dave Barry columns.
Well, I fixed a longstanding problem with Debby’s site (debugging PHP code is such a pain when the curly braces don’t all line up) and I returned a batch of car fuses I bought a couple of weeks ago that it turned out I didn’t need. Six bucks is six bucks, you know. That’s lunch for two days if I’m careful.

Other than that, I took a nap. Actually I might have dozed off twice. I don’t remember. It was a tough week. On Monday I got paged at 11:30 at night with a tape backup problem and ended up having to go in to work to fix it. Tuesday was quiet. Wednesday and Thursday I got paged late. How late, I don’t remember. But late enough that I’d gone to sleep. Wednesday’s problem I fixed remotely. Thursday’s problem might or might not have been fixable remotely, but the operator kept talking about blinking lights on the tape drive (it’s an internal drive, and I’m convinced he was referring to the blinking lights on the hard drives), and since multiple blinking lights on a tape drive usually indicate big trouble, I ended up going in. I stumbled through the problem and finally went home. It wasn’t a hardware problem.

On Friday one of my coworkers took a digital picture of a tape drive for me so I can ask pointed questions about the blinky lights when the problem comes up again. Looking at the picture, now I remember: blinky lights on the left is big trouble. Blinky lights on the right is highly unlikely, so I guess that’s even bigger trouble.

On my way home from Promise Keepers on Friday, I told my buddies I fully expected to get paged that night about a tape backup problem. They all thought that was pretty awful. I got home, plugged my work laptop in and booted it up, intending to be pre-emptive. I didn’t want to get paged at 1 in the morning and get told a 9:00 backup job failed–not when I needed to be at church at 7 in the morning with an 11-hour day (minimum) ahead of me. As I was firing up pcAnywhere, my phone rang. It was one of the operators, and a backup job had failed. I went in and fixed it.

But, seeing as I didn’t sleep more than six hours uninterrupted any night this week and ran two nights on four hours of sleep (I know, when parenthood comes I’m in trouble, but I really work best on 7 hours during the week and 8 on weekends), I slept in yesterday. I was late for church. The 10:45 service. Pathetic, I know. Then there was a post-service meeting I’d forgotten about. Oops. So I was there for two hours. They mercifully cut it off at two hours. I was out of fuel and was getting irate at the weird questions some people were starting to ask.

Then I came home, got irritated that my SWBell e-mail still isn’t working (six days and counting–makes me wonder if they’ve ditched their Sun equipment for Windows NT), tried to remember how to set up my own mail server again, decided it was too much thinking, and took a nap instead. That set the pace for the day.

I’m hoping this week isn’t a repeat performance of last week.

The pundits are wrong about Apple’s defection

Remember the days when knowing something about computers was a prerequisite for writing about them?
ZDNet’s David Coursey continues to astound me. Yesterday he wondered aloud what Apple could do to keep OS X from running on standard PCs if Apple were to ditch the PowerPC line for an x86-based CPU, or to keep Windows from running on Apple Macs if they became x86-based.

I’d link to the editorial but it’s really not worth the minimal effort it would take.

First, there’s the question of whether it’s even necessary for Apple to migrate. Charlie pointed out that Apple remains profitable. It has 5% of the market, but that’s beside the point. They’re making money. People use Apple Macs for a variety of reasons, and those reasons seem to vary, but speed rarely seems to be the clinching factor. A decade ago, the fastest Mac money could buy was an Amiga with Mac emulation hardware–an Amiga clocked at the same speed would run Mac OS and related software about 10% faster than the real thing. And in 1993, Intel pulled ahead of Motorola in the speed race. Intel had 486s running as fast as 66 MHz, while Motorola’s 68040 topped out at 40 MHz. Apple jumped to the PowerPC line, whose clock rate pretty much kept up with the Pentium line until the last couple of years. While the PowerPCs would occasionally beat an x86 at some benchmark or another, the speed was more a point of advocacy than anything else. When a Mac user quoted one benchmark only to be countered by another benchmark that made the PowerPC look bad, the Mac user just shrugged and moved on to some other advocacy point.

Now that the megahertz gap has become the gigahertz gap, the Mac doesn’t look especially good on paper next to an equivalently priced PC. Apple could close the gigahertz gap and shave a hundred bucks or two off the price of the Mac by leaving Motorola at the altar and shacking up with Intel or AMD. And that’s why every pundit seems to expect the change to happen.

But Steve Jobs won’t do anything unless he thinks it’ll get him something. And Apple offers a highly styled, high-priced, anti-establishment machine. Hippie computers, yuppie price. Well, that was especially true of the now-defunct Flower Power and Blue Dalmatian iMacs.

But if Apple puts Intel Inside, some of that anti-establishment lustre goes away. That’s not enough to make or break the deal.

But breaking compatibility with the few million G3- and G4-based Macs already out there might be. The software vendors aren’t going to appreciate the change. Now Apple’s been jerking the software vendors around for years, but a computer is worthless without software. Foisting an instruction set change on them isn’t something Apple can do lightly. And Steve Jobs knows that.

I’m not saying a change won’t happen. But it’s not the sure deal most pundits seem to think it is. More likely, Apple is just pulling a Dell. You know the Dell maneuver. Dell is the only PC vendor that uses Intel CPUs exclusively. But Dell holds routine talks with AMD and shows the guest book signatures to Intel occasionally. Being the last dance partner gives Dell leverage in negotiating with Intel.

I think Apple’s doing the same thing. Apple’s in a stronger negotiating position with Motorola if Steve Jobs can casually mention he’s been playing around with Pentium 4s and Athlon XPs in the labs and really likes what he sees.

But eventually Motorola might decide the CPU business isn’t profitable enough to be worth messing with, or it might decide that it’s a lot easier and more profitable to market the PowerPC as a set of brains for things like printers and routers. Or Apple might decide the gigahertz gap is getting too wide and defect. I’d put the odds of a divorce somewhere below 50 percent. I think I’ll see an AMD CPU in a Mac before I’ll see it in a Dell, but I don’t think either event will happen next year.

But what if it does? Will Apple have to go to AMD and have them design a custom, slightly incompatible CPU as David Coursey hypothesizes?

Worm sweat. Remember the early 1980s, when there were dozens of machines that had Intel CPUs and even ran MS-DOS, yet were, at best, only slightly IBM compatible? OK, David Coursey doesn’t, so I can’t hold it against you if you don’t. But trust me. They existed, and they infuriated a lot of people. There were subtle differences that kept IBM-compatible software from running unmodified. Sometimes the end user could work around those differences, but more often than not, they couldn’t.

All Apple has to do is continue designing their motherboards the way they always have. The Mac ROM bears very little resemblance to the standard PC BIOS. The Mac’s boot block and partition table are all different. If Mac OS X continues to look for those things, it’ll never boot on a standard PC, even if the CPU is the same.

The same differences that keep Mac OS X off of Dells will also keep Windows off Macs. Windows could be modified to compensate for those differences, and there’s a precedent for that–Windows NT 4.0 originally ran on Intel, MIPS, PowerPC, and Alpha CPUs. I used to know someone who swore he ran the PowerPC versions of Windows NT 3.51 and even Windows NT 4.0 natively on a PowerPC-based Mac. NT 3.51 would install on a Mac of comparable vintage, he said. And while NT 4.0 wouldn’t, he said you could upgrade from 3.51 to 4.0 and it would work.

I’m not sure I believe either claim, but you can search Usenet on Google and find plenty of people who ran the PowerPC version of NT on IBM and Motorola workstations. And guess what? Even though those workstations had PowerPC CPUs, they didn’t have a prayer of running Mac OS, for lack of a Mac ROM.

Windows 2000 and XP were exclusively x86-based (although there were beta versions of 2000 for the Alpha), but adjusting to accommodate an x86-based Mac would be much easier than adjusting to another CPU architecture. Would Microsoft go to the trouble just to get at the remaining 5% of the market? Probably. But it’s not guaranteed. And Apple could turn it into a game of leapfrog by modifying its ROM with every machine release. It already does that anyway.

The problem’s a whole lot easier than Coursey thinks.

Copyright terrorists can’t take what they dish out

Aw, poow widdle awe-aye-ay-ay! Poow widdle bay-bee!
The RIAA, if you recall, is endorsing legislation that would permit copyright terrorists–er, holders–to knock off or hack into computers they suspect are being used to violate copyright law. So I guess calling what they want “copyright terrorism” is apt.

Analysis of the Apple Xserve

Given my positive reaction to the Compaq ProLiant DL320, Svenson e-mailed and asked me what I thought of Apple’s Xserve.
In truest Slashdot fashion, I’m going to present strong opinions about something I’ve never seen. Well, maybe not strong compared to some of what you’re used to seeing from my direction. But still…

Short answer: I like the idea. The PPC is a fine chip, and I’ve got a couple of old Macs at work (a 7300 and a 7500) running Debian. One of them keeps an eye on the DHCP servers and mails out daily reports (DHCP on Windows NT is really awful; I didn’t think it was possible to mess it up but Microsoft found a way) and acts as a backup listserver (we make changes on it and see if it breaks before we break the production server). The other one is currently acting as an IMAP/Webmail server that served as an outstanding proof of concept for our next big project. I don’t know that the machines are really any faster than a comparable Pentium-class CPU would be, but they’re robust and solid machines. I wouldn’t hesitate to press them into mission-critical duty if the need arose. For example, if the door opened, I’d be falling all over myself to make those two machines handle DHCP, WINS, and caching DNS for our two remote sites.

So… Apples running Linux are a fine thing. A 1U rack-mount unit with a pair of fast PPC chips in it and capable of running Linux is certainly a fine thing. It’ll suck down less power than an equivalent Intel-based system would, which is an important consideration for densely-packed data centers. I wouldn’t run Mac OS X Server on it because I’d want all of its CPU power to go toward real work, rather than putting pretty pictures on a non-existent screen. Real servers are administered via telnet or dumb terminal.

What I don’t like about the Xserver is the price. As usual, you get more bang for the buck from an x86-based product. The entry-level Xserver has a single 1 GHz PowerPC, 256 megs of RAM, and a 60-gig IDE disk. It’ll set you back a cool 3 grand. We paid just over $1300 for a Proliant DL320 with a 1.13 GHz P3 CPU, 128 megs of RAM, and a 40-gig IDE disk. Adding 256 megs of RAM is a hundred bucks, and the price difference between a 40- and a 60-gig drive is trivial. Now, granted, Apple’s price includes a server license, and I’m assuming you’ll run Linux or FreeBSD or OpenBSD on the Intel-based system. But Linux and BSD are hardly unproven; you can easily expect them to give you the same reliability as OS X Server and possibly better performance.
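To put numbers on that comparison, here’s the back-of-the-envelope arithmetic, using the figures above (the $100 RAM upgrade is the estimate quoted in the text):

```shell
#!/bin/sh
# Rough price comparison from the figures in the text.
xserve=3000        # entry-level Xserver, as quoted
dl320=1300         # Proliant DL320 as purchased
ram_upgrade=100    # bring the DL320 to 256 MB, matching the Xserver
total=$((dl320 + ram_upgrade))
echo "Comparable DL320: \$$total; Xserver premium: \$$((xserve - total))"
```

Even ignoring the trivial 40-vs-60-gig disk difference, the Xserver carries a premium of over $1500 against a comparably equipped DL320.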

But the other thing that makes me uncomfortable is Apple’s experience making and selling and supporting servers, or rather its lack thereof. Compaq is used to making servers that sit in the datacenter and run 24/7. Big businesses have been running their businesses on Compaq servers for more than a decade. Compaq knows how to give businesses what they need. (So does HP, which is a good thing considering HP now owns Compaq.) If anything ever goes wrong with an Apple product, don’t bother calling Apple customer service. If you want to hear a more pleasant, helpful, and unsuspicious voice on the other end, call the IRS. You might even get better advice on how to fix your Mac from the IRS. (Apple will just tell you to remove the third-party memory in the machine. You’ll respond that you have no third-party memory, and they’ll repeat the demand. There. I just saved you a phone call. You don’t have to thank me.)

I know Apple makes good iron that’s capable of running a long time, assuming it has a quality OS on it. I’ve also been around long enough to know that hardware failures happen, regardless of how good the iron is, so you want someone to stand behind it. Compaq knows that IBM and Dell are constantly sitting on the fence like vultures, wanting to grab its business if it messes up, and it acts accordingly. That’s the beauty of competition.

So, what of the Xserver? It’ll be very interesting to see how much less electricity it uses than a comparable Intel-based system. It’ll be very interesting to see whether Apple’s experiment with IDE disks in the enterprise works out. It’ll be even more interesting to see how Apple adjusts to meeting the demands of the enterprise.

It sounds like a great job for Somebody Else.

I’ll be watching that guy’s experience closely.

First look: The Proliant DL320

I’ve had the opportunity the past two days to work with Compaq’s Proliant DL320, an impossibly thin 1U rack-mount server. All I can say is I’m impressed.
When I was in college, a couple of the nearby pizza joints sold oversized 20″ pizzas. The DL320 reminded me of the boxes these pizzas came in. The resemblance isn’t lost on IBM: In its early ads for a competing product, I remember IBM using an impossibly thin young female model holding a 1U server on a pizza-joint set.

HP announced last week that Compaq’s Proliant series will remain basically unchanged; it will simply be re-branded with the HP name. HP had no product comparable to the DL320.

I evaluated the entry-level model. It’s a P3 1.13 GHz with 128 MB RAM, dual Intel 100-megabit NICs, and a single 40-gigabyte 7200-rpm Maxtor/Quantum IDE drive. It’s not a heavy-duty server, but it’s not designed to be. It’s designed for businesses that need to get a lot of CPU power into the smallest possible amount of rack space. And in that regard, the DL320 delivers.

Popping the hood reveals a well-designed layout. The P3 is near the front, with three small fans blowing right over it. Two more fans in the rear of the unit pull air out, and two fans in the power supply keep it cool. The unit has four DIMM sockets (one occupied). There’s room for one additional 3.5″ hard drive, and a single 64-bit PCI slot. Obvious applications for that slot include a gigabit Ethernet adapter or a high-end SCSI host adapter. The machine uses a ServerWorks chipset, augmented by a CMD 649 for UDMA-133 support. Compaq utilizes laptop-style floppy and CD-ROM drives to cram all of this into a 1U space.

The fit and finish is very good. The machine looks and feels solid, not flimsy, which is a bit surprising for a server in this price range. Looks-wise, it brings back memories of the old DEC Prioris line.

The rear of the machine has a fairly spartan set of ports: PS/2 keyboard and mouse, two RJ-45 jacks, VGA, one serial port, and two USB ports. There’s no room for luxuries, and such things as a parallel port are questionable in this type of server anyway.

Upon initial powerup, the DL320 asks a number of questions, including what OS you want to run. Directly supported are Windows NT 4.0, Windows 2000, Novell NetWare, and Linux.

Linux installs quickly, and the 2.4.18 kernel directly supports the machine’s EtherExpress Pro/100 NICs, CMD 649 IDE, and ServerWorks chipset. A minimal installation of Debian 3.0 booted in 23 seconds, once the machine finished POST. After compiling and installing a kernel stripped of support for hardware the DL320 doesn’t have, that boot time dropped to 15 seconds. That’s less time than it takes for the machine to POST.

Incidentally, that custom kernel was a scant 681K in size, befitting a server with this kind of footprint.
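For anyone who hasn’t built a trimmed-down 2.4 kernel, the procedure runs roughly like this (a sketch, not the exact configuration used; which options to disable depends on your hardware and source tree):

```shell
cd /usr/src/linux
make menuconfig          # switch off everything the machine doesn't have:
                         # SCSI, USB, sound, unused network drivers, etc.
make dep                 # 2.4.x kernels still require the 'make dep' step
make bzImage             # build the compressed kernel image
ls -l arch/i386/boot/bzImage   # check the resulting size
```

Copy the bzImage into /boot, point your lilo.conf (or GRUB menu) at it, rerun lilo if applicable, and reboot.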

As configured, the DL320 is more than up to the tasks asked of low-end servers, such as user authentication, DNS and DHCP, and mail, file and print services for small workgroups. It would also make a nice applications server, since the applications only need to load once. It would also be outstanding for clustering. For Web server duty or heavier-duty mail, file and print serving, it would be a good idea to upgrade to one of the higher-end DL320s that includes SCSI.

It’s hard to find fault with the DL320. At $1300 for an IDE configuration, it’s a steal. A SCSI-equipped version will run closer to $1900.