Craig Mundie’s infamous speech

I haven’t said anything about Microsoft executive Craig Mundie’s speech yet. Everyone’s heard of it, of course, and the typical response has been something along the lines of “Now we know Microsoft’s stance on Open Source.”

No, we’ve always known Microsoft’s stance on that. They’re scared of it. Remember the stereotype of open-source programmers: college students and college dropouts writing software in their basements that a lot of people are using, with the goal of toppling an industry giant. Seem far-fetched? Friends, that’s the story of Microsoft itself. Microsoft became an underground sensation in the late 1970s with Microsoft Basic, a programming language for the Altair and other kit computers and later for CP/M. And while we’ll probably never know the entire story of how and why this happened, when IBM decided to outsource the operating system for the IBM PC, they went to Microsoft and got both an OS and the must-have Microsoft Basic. Ten years later, IBM was just another hardware maker–really big, but getting squeezed. Today, 20 years later, IBM’s still a huge force in the computing industry, but in the PC industry, aside from selling ThinkPads, IBM’s a nobody. There may be hardware enthusiasts out there who’d be surprised to hear IBM makes and sells more than just hard drives.

Ironically, Microsoft’s response to this new threat is to act more and more like the giant it toppled. Shared Source isn’t a new idea. IBM was doing that in the 1960s. If you were big enough, you could see the source code. DEC did it too. At work, we have the source code to most of the big VMS applications we depend on day-to-day. Most big operations insist on having that kind of access, so their programmers can add features and fix bugs quickly. If Windows 2000 is ever going to get beyond the small-server space, Microsoft really has no choice. But they do it with strings attached and without going far enough. An operation the size of the one I work for can’t get the source and fix bugs or optimize the code for a particular application. You’re only permitted to use the source code to help you develop drivers or applications. Meet the new Microsoft: same as the old Microsoft.

Some people have read this speech and concluded that Microsoft believes open-source software killed the dot-com boom. That’s ludicrous, and I don’t see that in the text. OSS was very good for the dot-com boom. OSS lowered the cost of entry: operating systems such as FreeBSD and Linux ran on cheap PCs, rather than proprietary hardware. The OSs themselves were free, and there was lots of great free software available, such as the Apache Web server and scripting languages like Python and Perl. You could do all this cool stuff, the same cool stuff you could do with a Sun or SGI server, for the price of a PC. And not only was it cheaper than everything else, it was also really reliable.

The way I read it, Microsoft didn’t blame OSS for the dot-com bust. Microsoft blamed the advertising model, valuing market share over revenue, and giving stuff away up front and then trying to get people to pay later.

I agree. The dot-com boom died because companies couldn’t find ways to make money. But I’m not convinced the dot-com boom was a big mistake. It put the Internet on the map. Before 1995, when the first banner ad ran, there wasn’t much to the Internet. I remember those early days. As a college student in 1993, I found the Internet a bonanza, even though I wasn’t using it to the extent a lot of my peers were. For me, the Internet was FTP and Gopher and e-mail. I mostly ignored Usenet and IRC. That was pretty much the extent of the Internet. You had to be really determined or really bored or really geeky to get much of anything out of it. The World Wide Web existed, but it was a great mystery to most of us. The SGI workstations on campus had Web browsers. We knew that Mosaic had been ported to Windows, but no one in the crowd I ran in knew how to get it working. When we finally got it running on some of our PCs in 1994, what we found was mostly personal homepages. “Hi, my name is Darren and this is my homepage. Here are some pictures of my cat. Here’s a listing of all the CDs I own. Here are links to all my friends who have homepages.” The running joke then was that there were only 12 pages on the Web, and the main attraction of the 12 was links to the other 11.

By 1995, we had the first signs of business. Banner ads appeared, and graduating students (or dropouts) started trying to build companies around their ideas. The big attraction of the Web was that there was all this information out there, and it was mostly free. Online newspapers and magazines sprang up. Then vendors sprang up, offering huge selections and low prices. You could go to Amazon.com and find any book in print, and you’d pay less for it than you would at Barnes & Noble. CDNow.com did the same thing for music. And their ads supported places that were giving information away. So people started buying computers so they could be part of the show. People flocked from closed services like CompuServe and Prodigy to the plain old Internet, which offered so much more and was cheaper.

Now the party’s ending as dot-coms close up shop, often with their content gone forever. To me, that’s a loss only slightly greater than the loss of the Great Library. There’s some comfort for me: Five years from now, most of that information would be obsolete anyway. But its historical value would remain. But setting sentiment aside, that bonanza of freebies was absolutely necessary. When I was selling computers in 1994, people frequently asked me what a computer was good for. In 1995, it was an easier sell. Some still asked that question, but a lot of people came in wanting “whatever I need to get to be able to get on the Internet.” Our best-selling software package, besides Myst, was Internet In A Box, which bundled dialup software, a Web browser, and access to some nationwide provider. I imagine sales were easier still in 1996 and beyond, but I was out of retail by then. Suddenly, you could buy this $2,000 computer and get all this stuff for free. A lot of companies made a lot of money off that business model. Microsoft made a killing. Dell and Gateway became behemoths. Compaq made enough to buy DEC. AOL made enough to buy Time Warner. Companies like Oracle and Cisco, who sold infrastructure, had licenses to print money. Now the party’s mostly over and these companies have massive hangovers, but what’s the answer to the Ronald Reagan question? Hangover or no hangover, yes, they’re a whole heck of a lot better off than they were four years ago.

I’m shocked that Microsoft thinks the dot-com phenomenon was a bad thing.

If, when the Web came into its own in 1995, every site had been subscription-based, this stuff wouldn’t have happened. It was hard enough to swallow $2,000 for a new PC, plus 20 bucks a month for Internet access. Now I have to pay $9.95 a month to read a magazine? I could just subscribe to the paper edition and save $2,500!

The new Internet would have been the same as the old Internet, only you’d have to be more than just bored, determined, and geeky to make it happen. You’d also have to have a pretty big pile of cash.

The dot-com boom put the Internet on the map, made it the hot ticket. The dot-com bust hurt. Now that sites are dropping out of the sky or at least scaling operations way back, more than half of the Web sites I read regularly are Weblogs–today’s new and improved personal home page. People just like me. The biggest difference between 1994 and 2001? The personal home pages are better. Yeah, the pictures of the cat are still there sometimes, but at least there’s wit and wisdom and insight added. When I click on those links to the left, I usually learn something.

But there is another difference. Now we know why it would make sense to pay for a magazine on the Internet instead of paper. Information that takes a month to make it into print goes online in minutes. It’s much easier and faster to type a word into a search engine than to leaf through a magazine. We can hear any baseball game we want, whether a local radio station carries our favorite team or not. The world’s a lot smaller and faster now, and we’ve found we like it.

The pump is primed. Now we have to figure out how to make this profitable. The free ride is pretty much over. But now that we’ve seen what’s possible, we’re willing to start thinking about whipping out the credit cards again and signing up, provided the cost isn’t outrageous.

The only thing in Mundie’s speech that I can see that Linus Torvalds and Alan Cox and Dan Gillmor should take offense to is Microsoft’s suspicion of anyone giving something away for free. Sure, Microsoft gives lots of stuff away, but always with ulterior motives. Internet Explorer is free because Microsoft was afraid of Netscape. Outlook 98 was free for a while to hurt Lotus Notes. Microsoft Money was free for a while so Microsoft could get some share from Quicken. It stopped being free when Microsoft signed a deal with Intuit to bundle Internet Explorer with Quicken instead of Netscape. And there are other examples.

Microsoft knows that you can give stuff away with strings attached and make money off the residuals. What Microsoft hasn’t learned is that you can give stuff away without the strings attached and still make money off the residuals. The dot-com bust only proves that you can’t necessarily make as much as you may have thought, and that you’d better spend what you do make very wisely.

The Internet needs to be remade, yes, and it needs to find some sustainable business models (one size doesn’t fit all). But if Mundie thinks the world is chomping at the bit to have Microsoft remake the Internet their way, he’s in for a rude awakening.


03/31/2001

I got the call late last night. My great aunt in Cleveland died yesterday.

It’s kind of become tradition, on my mom’s side of the family at least, for me to write the tribute when a relative dies. Somehow I’m good at expressing those sentiments, and, well, I am a writer. But I can’t write Aunt Lilian’s tribute, and it has nothing to do with Aunt Lilian being on my dad’s side of the family.

I hardly knew her.

When my dad moved to Kansas City in the early 1970s, he never really looked back. He adopted Kansas City as his hometown, and after he and my mom married, he adopted them as his family. His father probably saw me fewer than a dozen times. His mother only saw me six or seven times more than that. I’ve seen one of Dad’s cousins twice, and his other cousin once. I met his aunt and uncle once, at their 40th or 50th wedding anniversary, in 1989. I only remember it being a big number in a day and age when few people make it to their tenth.

Once I got out of college and on my own, I always said I’d make it back to Cleveland. Some Thanksgiving, or sometime when I had some vacation time due, I’d fly out or take a road trip. I never did. It was always easier to just go to Kansas City. It’s closer, and economies of scale were on my side. One year I even had an airline ticket. I ended up not using it.

Then Uncle Bob died. I didn’t even make it to the funeral. I knew Aunt Lilian wouldn’t have a whole lot of time left. When you’ve been married that long, once your partner dies, you generally follow pretty soon.

Last Thanksgiving, I thought about going. I didn’t. I was thinking maybe this year would be the year. But I know good and well I probably wouldn’t have.

This has been all about me. That’s terrible. So what do I know about Aunt Lilian?

She was my dad’s favorite aunt. I think she may have been his only aunt, but that’s OK. She was worthy of the title. Her brother was my dad’s father and my grandfather. You can say a lot of things about Dr. Ralph–he was a brilliant man, a wise man, a great doctor, a small-time tycoon. But he wasn’t a nice man. Aunt Lilian was much more pleasant than her brother.

Dad didn’t talk about his family much. But he’d talk about Uncle Bob and Aunt Lilian and their sons, Bobby and Sterling. I think that says a lot.

Aunt Lilian was known for her chocolate chip cookies. That was the first thing Dad said about her. She didn’t know what the fuss was about. The recipe was right there on the back of the package of the brand of chocolate chips she bought and had been buying for most of the century. So anyone else could follow the same recipe, but somehow it wasn’t ever the same. I remember Dad and his cousins, Bob and Sterling, discussing why at one point. It was funny hearing a doctor, a physical therapist, and an electrical engineer talking about why a cookie recipe couldn’t be duplicated. These three great minds couldn’t figure it out. Aunt Lilian did her best to ignore them, and that was probably for the best.

And sadly, that’s the only story I can tell about her. She must have been pushing 100 when she died; I know Uncle Bob was 95 or 96 when he passed on. Her brother died more than 21 years ago, and he was in his 70s. All that time. I’m 26. In my 26 years, I managed to spend one weekend with her.

Don’t make the same mistake I made. A life is a terrible thing to waste.

A free memory tester and a Linux tip

I lost my notes for today somehow, and I’ve been home a grand total of 14 hours the past 48 hours (I think), so you’ll have to excuse this quickie.

Free memory tester. I found this over the weekend:

www.memtest86.com

It’s a memory test disk. Self-booting, about 74K in memory, builds from DOS, Windows, or Linux (and possibly others too). I use and recommend RAM Stress Test, by Ultra-X Inc. (www.uxd.com), but this seems nearly as good and it’s free. If you’ve got frequent bluescreens, download this and try it on your PC. A lot of problems are caused by bad memory, and the power-on memory test usually won’t find it. Neither will most DOS-based memory utilities.
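
Roughly, making the boot floppy from Linux looks something like this. Treat it as a sketch: the archive and image filenames vary by version (x.x stands in for whatever version you grab, and the image may not be called memtest.bin in your copy), so go by the README that comes with the download.

    tar xzvf memtest86-x.x.tar.gz    # unpack the download (use your real version number)
    cd memtest86-x.x
    dd if=memtest.bin of=/dev/fd0    # write the raw boot image straight to a blank floppy

From DOS or Windows, a raw-image writer such as rawrite does the same job. Boot the suspect PC from that floppy and let it run overnight; any errors it reports almost always mean bad memory.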

MemTest is still no substitute for buying brand-name memory. That said, I’d never let commodity memory sit on the same table as my hardware without testing it first. About 1 in 1,000 brand-name sticks are bad, as opposed to about 1 in 12 commodity sticks, in my extensive experience. One of the first things I do when faced with an unstable system is test the memory overnight, just in case.

Linux (and Unix) tip of the day. If you vaguely remember a command but can’t completely recall it, type the part you remember, then hit Tab (hit it twice in bash and most other modern shells). A list of possibilities will appear. Hopefully the command you’re looking for is among them.

And if any of the possibilities sound interesting, type man followed by the command’s name. The online documentation will come up and explain usage.
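
For example, say you half-remember that the command you want starts with mk. In bash it looks something like this (the keystrokes are similar in other shells, and the exact list you get back depends on what’s installed on your system):

    $ mk<Tab><Tab>
    mkdir    mkfifo   mkfs     mknod    mkswap   mktemp
    $ man mkdir

Press q to get back out of the man page when you’re done reading.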

Don’t let anyone fool you. You never master this OS. You just learn how to find what you need to get a job done quickly. And hopefully you develop a long memory.

Outta here. And if you’ve mailed me over the last couple of days, my apologies. I’ll get back to you tonight after work.

02/11/2001

Mailbag:

Innovation

Steve DeLassus asked me for some ideas of where I see innovation, since I said Microsoft isn’t it. That’s a tough question. On the end-user side, it’s definitely not Microsoft. They’ve refined some old ideas, but most of their idea of innovation is taking utilities that were once separate products from companies Microsoft wants to drive out of business, then grafting them onto the OS in such a way as to make them appear integrated. What purpose does making the Explorer interface look like a Web browser serve? Doesn’t everyone who’s used a real file manager (e.g. Norton Commander or Directory Opus) agree that the consumer would have been better served by replicating something along those lines? Not that that’s particularly innovative either, but at least it would be an improvement. Outside the software development arena (where Microsoft does innovate; that makes sense, since Microsoft is first and foremost a languages company and always has been), the only innovation Microsoft seems to do is finding ways to drive other companies out of business or to extract more money out of their customers.

Richard Stallman’s GNU movement has very rarely been innovative; all along, it’s been about cloning software they like and making their versions free. It’s probably fair to call Emacs innovative; it was a text editor with a built-in programming environment long before MS Word had that capability. But I don’t see a whole lot of innovation coming out of the Open Source arena; they’re just trying to do the same thing cheaper, and in some cases smaller and faster, than everyone else.

So, where is there innovation? I was thinking there was more innovation on the hardware side of things, but then I realized that a lot of those “innovations” are just refinements that most people think should have been there in the first place–drives capable of writing to both DVD-R and CD-R media, for instance. Hardware acceleration of sound and network cards is another. Amiga had hardware acceleration of its sound in 1985, so it’s hard to call that innovation. It’s an obvious idea.

A lot of people think Apple and Microsoft are being really innovative with their optical mice, but optical mice were around for years and years before either of those companies “invented” them. The optical mice of 2000 are much better than the optical mice of 1991–no longer requiring a gridded mouse pad and providing smoother movement–but remember, in 1991, the mainstream CPUs were the Intel 80286 and the 80386sx. That’s a far, far cry from the Thunderbird-core AMD Athlon. You would expect a certain degree of improvement.

I’d say the PalmPilot is innovative, but all they really did was take a failed product, Apple’s Newton, and figure out what went wrong and make it better. So I guess you could say Apple innovated there, but that was a long time ago.

So I guess the only big innovation I’ve seen recently from the end-user side of things has been in the software arena after all. I’m still not sold on Ray Ozzie’s Groove, but I have to admit it’s much more forward-thinking than most of the things I’ve seen. Sure, it looks like he’s aping Napster, but he started working on Groove in 1997, long before Napster. And Napster’s just file sharing, which has been going on since the 1960s at least, done in a new way. There again, I’m not sure that it’s quite right to call it true innovation, but I think it’s more innovative than most of the things I’ve seen come out of Microsoft and Apple, who are mostly content to just copy each other and SGI and Amiga and Xerox. If they’re going to steal, they should at least steal the best ideas SGI and Amiga had. Amiga hid its menu bars to save screen space. Maybe that shouldn’t be the default behavior, but it would be nice to make that an option. SGI went one further, making the pull-down menus accessible anywhere onscreen by right-clicking. This isn’t the same as the context menu: the program’s main menu came up this way. This saved real estate and mouse movements.

I’m sure I could think of some others but I’m out of time this morning. I’d like to hear what some other people think is innovative. And yes, I’m going to try to catch up on e-mail, either this afternoon or this evening. I’ve got a pretty big backlog now.


Happy Thanksgiving, 2000 edition

Happy Thanksgiving. Mail and everything will wait. I just got home from Thanksgiving Eve services and I’m getting ready to hit the road. I’m not taking a computer with me. Not sure yet what day I’m coming back. Bad for readership, I know, but it’s not like there’ll be much these next few days to write about anyway.

Expect me back in full force Sunday. I may do a short shrift Saturday. I’ll part with a paraphrase from pastor: If you’re grateful, you’re rich, no matter what you have or don’t have.
So, on that note, may God grant us grateful hearts for this great day. Enjoy it, and I’ll be back with you soon.

AMD roadmap and analysis. Finally! Something to analyze. AnandTech posted AMD’s roadmap for the next couple of years sometime yesterday (I missed it). With AMD saying again and again that they aren’t a chipset company, I’m starting to wonder… Why not bring in the necessary engineers to make good, really good chipsets, then outsource manufacturing to NatSemi and/or IBM? That saves AMD’s precious fab capacity–capacity that’s sorely needed to produce profitable CPUs and flash memory–and reduces their dependence on VIA, SiS and ALi, of which only VIA has any kind of a decent track record of late.
Of course, this arrangement would make chipsets even less profitable, which is one of the reasons AMD tries to stay out of that business, but the better your chipsets, the more CPUs you’ll sell. AMD’s currently selling every CPU they can make, but that won’t necessarily last forever. They need to build another fab, and they need to start taking measures now to ensure that there’ll be sufficient demand once that fab is up and going to sell every chip the new fab is capable of producing.

Nothing happened yesterday for me to analyze? That sure doesn’t happen often. I still remember a personality test I took a couple of years ago (I have a friend to thank for reminding me of that test). At my worst, I can be described in two words: overdominant overanalysis.

But yesterday provided next to nothing in the way of analysis fodder. Slow news day. At work, I planned upcoming projects with some higher-ups. And I built shelves. Hey, mindless work is good for you occasionally. One of the women in the office walked by as we were assembling the shelves and said, “I’ll bet you guys love Christmas.” Heh. Except for one thing. I’m single, and the guy I was working with is married but hasn’t started his family yet. So Christmas doesn’t consist of building all that much stuff. But it made me wonder. Do guys start families for the sake of starting families, or to get an excuse to put together and play with toys?

There’s mail. But I’m out of time this morning. As some of you know, I hand mail off to my sister to post, in order to cut down on my keystrokes and mouse movements and give my wrists a break. I suspect Di’s left town for the holiday already, so I’ll see about handling the mail myself tonight, before I skip town for a couple of days. I’ll try to post something while I’m gone, but I won’t have any way of reading my mail on the road.

Mac mice, PC data recovery

A two-button Mac mouse!? Frank McPherson asked what I would think of the multibutton/scroll wheel support in Mac OS X. Third-party multibutton mice have been supported via extensions for several years, but not officially from Ye Olde Apple. So what do I think? About stinkin’ time!

I use 3-button mice on my Windows boxes. The middle button double-clicks. Cuts down on clicks. I like it. On Unix, where the middle button brings up menus, I’d prefer a fourth button for double-clicking. Scroll wheels I don’t care about. The page up/down keys have performed that function just fine for 20 years. But some people like them; no harm done.

Data recovery. One of my users had a disk yesterday that wouldn’t read. Scandisk wouldn’t fix it. Norton Utilities 2000 wouldn’t fix it. I called in Norton Utilities 8. Its disktool.exe includes an option to revive a disk, essentially by doing a low-level format in place (presumably it reads the data, formats the cylinder, then writes the data back). That did the trick wonderfully. Run Disktool, then run NDD, then copy the contents to a fresh disk immediately.

So, if you ever run across an old DOS version of the Norton Utilities (version 7 or 8 certainly; earlier versions may be useful too), keep them! They’re something you’ll maybe need once a year. But when you need them, you need them badly. (Or someone you support does, since those in the know never rely on floppies for long-term data storage.) Recent versions of Norton Utilities for Win32 don’t include all of the old command-line utilities.

Hey, who was the genius who decided it was a good idea to cut, copy and paste files from the desktop? One of the nicest people in the world slipped up today copying a file. She hit cut instead of copy, then when she went to paste the file to the destination, she got an error message. Bye-bye file. Cut/copy-paste works fine for small files, but this was a 30-meg PowerPoint presentation. My colleague who supports her department couldn’t get the file back. I ride in on my white horse, Norton Utilities 4.0 for Windows in hand, and run Unerase off the CD. I get the file back, or so it appears. The undeleted copy won’t open. On a hunch, I hit paste. Another copy comes up. PowerPoint chokes on it too.

I tried everything. I ran PC Magazine’s Unfrag on it, which sometimes fixes problematic Office documents. No dice. I downloaded a PowerPoint recovery program. The document crashed the program. Thanks guys. Robyn never did you any harm. Now she’s out a presentation. Not that Microsoft cares, seeing as they already have the money.

I walked away wondering what would have happened if Amiga had won…

And there’s more to life than computers. There’s songwriting. After services tonight, the music director, John Scheusner, walks up and points at me. “Don’t go anywhere.” His girlfriend, Jennifer, in earshot, asks what we’re plotting. “I’m gonna play Dave the song that he wrote. You’re more than welcome to join us.”

Actually, it’s the song John and I wrote. I wrote some lyrics. John rearranged them a little (the way I wrote it, the song was too fast–imagine that, something too fast from someone used to writing punk rock) and wrote music.

I wrote the song hearing it sung like The Cars (along the lines of “Magic,” if you’re familiar with their work), but what John wrote and played sounded more like Joe Jackson. Jazzy. I thought it was great. Jennifer thought it was really great.

Then John tells me they’re playing it Sunday. They’re what!? That will be WEIRD. And after the service will be weird too, seeing as everybody knows me and nobody’s ever seen me take a lick of interest in worship music before.

I like it now, but the lyrics are nothing special, so I don’t know if I’ll like it in six months. We’ll see. Some people will think it’s the greatest thing there ever was, just because two people they know wrote it. Others will call it a crappy worship song, but hopefully they’ll give us a little credit: At least we’re producing our own crappy worship songs instead of playing someone else’s.

Then John turns to me on the way out. “Hey, you’re a writer. How do we go about copyrighting this thing?” Besides writing “Copyright 2000 by John Scheusner and Dave Farquhar” on every copy, there’s this. That’s what the Web is for, friends.

~~~~~~~~~~

Note: I post this letter without comment, since it’s a response to a letter I wrote. My stuff is in italics. I’m not sure I totally agree with all of it, but it certainly made me think a lot and I can’t fault the logic.

From: John Klos
Subject: Re: Your letter on Jerry Pournelle’s site

Hello, Dave,

I found both your writeup and this letter interesting. Especially interesting are both your reaction and Jerry’s reaction to my initial letter, which had little to do with my server. To restate my feelings, I was disturbed about Jerry’s column because it sounded so damned unscientific, and I felt that he had a responsibility to do better. His conclusion sounded like something a salesperson would say, and in fact did sound like things I have heard from salespeople and self-promoted wannabe geeks. I’ve heard all sorts of tales from people like this, such as the claim that computers get slower with age because the RAM wears out…

Mentioning my Amiga was simply meant to point out that not only was I talking about something that bothered me, but I am running systems that “conventional wisdom” would say are underpowered. However, based upon what both you and Jerry have replied, I suppose I should’ve explained more about my Amiga.

I have about 50 users on erika (named after a dear friend). At any one moment, there are anywhere from half a dozen to a dozen people logged on. Now, I don’t claim to know what a Microsoft Terminal Server is, nor what it does, but it sounds something like an ’80s way of Microsoft subverting telnet.

My users actually telnet (technically, they all use ssh; telnet is off), they actually do tons of work in a shell, actually use pine for email and links (a lynx successor) for browsing. I have a number of developers who do most of their development work in any of a number of languages on erika (Perl, C, C++, PHP, Python, even Fortran!).

Most of my users can be separated into two groups: geeks and novices. Novices usually want simple email or want to host their domain with a minimum of fuss; most of them actually welcome the simplicity, speed, and consistency of pine as compared to slow and buggy webmail. Who has used webmail and never typed a long letter only to have an error destroy the entire thing?

The geeks are why sixgirls.org got started. We all had a need for a place to call home, as we all have experienced the nomadic life of being a geek on the Internet with no server of our own. We drifted from ISP to ISP looking for a place where our Unix was nice, where our sysadmins listened, and where corporate interests weren’t going to yank stuff out from underneath us at any moment. Over the years, many ISPs have stopped offering shell access and generally have gotten too big for the comfort of geeks.

If Jerry were replying to this now, I could see him saying that shells are old school and that erika is perhaps not much more than a home for orphans and die-hard Unix fans. I used to think so, too, but the more novice users I add, the more convinced I am that people who have had no shell experience at all prefer the ease, speed, and consistency of the shell over a web browser type interface. They’re amazed at the speed. They’re surprised over the ability to instantly interact with others using talk and ytalk.

The point is that this is neither a stopgap nor a dead end; this IS the future.

I read your message to Jerry and it got me thinking a lot. An awful lot. First on the wisdom of using something other than what Intel calls a server, then on the wisdom of using something other than a Wintel box as a server. I probably wouldn’t shout it from the mountaintops if I were doing it, but I’ve done it myself. As an Amiga veteran (I once published an article in Amazing Computing), I smiled when I saw what you were doing with your A4000. And some people no doubt are very interested in that. I wrote some about that on my Weblogs site (address below if you’re interested).

I am a Unix Systems Administrator, and I’ve set up lots of servers. I made my decision to run everything on my Amiga based upon several criteria:

One, x86 hardware is low quality. I stress test all of the servers I build, and most x86 hardware is flawed in one way or another. Even if those flaws are so insignificant that they never affect the running of a server, I cannot help but wonder why my stress testing code will run just fine on one computer for months and will run fine on another computer for a week, but then dump a core or stop with an error. But this is quite commonplace with x86 hardware.

For example, my girlfriend’s IBM brand FreeBSD computer can run the stress testing software indefinitely while she is running the GIMP, Netscape, and all sorts of other things. This is one of the few PCs that never has any problems with this stress testing software. But most of the other servers I set up, from PIIIs, dual processor PIIIs and dual Celerons, to Cyrix 6×86 and MII, end up having a problem with my software after anywhere from a few days to a few weeks. But they all have remarkable uptimes, and none crash for any reason other than human error (like kicking the cord).

However, my Amigas and my PowerMacs can run this software indefinitely.

So although I work with x86 extensively, it’s not my ideal choice. So what else is there? There’s SPARC, MIPS, m68k, PowerPC, Alpha, StrongARM… plenty of choices.

I have a few PowerMacs and a dual processor Amiga (68060 and 200 mhz PPC 604e); however, NetBSD for PowerMacs is not yet as mature as I need it to be. For one, there is no port of MIT pthreads, which is required for MySQL. Several of my users depend on MySQL, so until that is fixed, I can’t consider using my PowerMac. Also, because of the need to boot using Open Firmware, I cannot set up my PowerMac to boot unattended. Since my machine is colocated, I would have to be able to run down to the colocation facility if anything ever happened to it. That’s fine if I’m in the city, but what happens when I’m travelling in Europe?

SPARC is nice, but expensive. If I could afford a nice UltraSPARC, I would. However, this project started as a way to have a home for geeks; coming up with a minimum of $3000 for something I didn’t even plan to charge for wasn’t an option.

Alpha seems too much like PC hardware, but I’d certainly be willing to give it a try should someone send me an old Alpha box.

With MIPS, again, the issue is price. I’ve always respected the quality of SGI hardware, so I’d definitely set one up if one were donated.

StrongARM is decent. I even researched this a bit; I can get an ATX motherboard from the UK with a 233 mhz StrongARM for about 310 quid. Not too bad.

But short of all of that, I had a nice Amiga 4000 with a 66 mhz 68060, 64-bit RAM, and wide ultra SCSI on board. Now what impresses me about this hardware is that I’ve run it constantly. When I went to New Orleans last year during the summer, I left it in the apartment, running, while the temperatures were up around 100 degrees. When I came back, it was fine. Not a complaint.

That’s the way it’s always been with all of my Amigas. I plug them in, they run; when I’m done, I turn off the monitor. So when I was considering what computer to use as a server when I’d be paying for a burstable 10 Mbps colocation, I wanted something that would be stable and consistent.

 Hence Amiga.

One of my users, after reading your letter (and, I guess, Jerry’s), thought that I should mention the load average of the server; I assume this is because of the indirectly stated assumption that a 66 mhz 68060 is just squeaking by. To clarify that, a 66 mhz 68060 is faster per mhz than any Pentium by a measurable margin when using either optimised code (such as a distributed.net client) or straight compiled code (such as LAME). We get about 25,000 hits a day, for a total of about 200 megs a day, which accounts for one eighth of one percent of the CPU time. We run as a Stratum 2 time server for several hundred computers, we run POP and IMAP services, sendmail, and we’re the primary nameserver for perhaps a hundred machines. With a distributed.net client running, our load average hovers around 1.18, which means that without the dnet client, we’d be idle most of the time.

If that weren’t good enough, NetBSD 1.5 (we’re running 1.4.2) has a much improved virtual memory system (UVM), improvements and speedups in the TCP stack (and complete IPv6 support), scheduler enhancements, good softdep support in the filesystem (as if two 10k rpm 18 gig IBM wide ultra drives aren’t fast enough), and more.

In other words, things are only going to get better.

The other question you raise (sort of) is why Linux gets so much more attention than the BSD flavors. I’m still trying to figure that one out. Part of it is probably due to the existence of Red Hat and Caldera and others. FreeBSD gets some promotion from Walnut Creek/BSDi, but one only has to look at the success of Slackware to see how that compares.

It’s all hype; people love buzz words, and so a cycle begins: people talk about Linux, companies spring up to provide Linux stuff, and people hear more and talk more about Linux.

It’s not a bad thing; anything that moves the mainstream away from Microsoft is good. However, the current trend in Linux is not good. Red Hat (the company), arguably the biggest force in popularising Linux in the US, is becoming less and less like Linux and more and more like a software company. They’re releasing unstable release after unstable release with no apologies. Something I said a little while ago, and someone has been using as his quote in his email:

In the Linux world, all of the major distributions have become companies. How much revenue would Red Hat generate if their product was flawless? How much support would they sell?

I summarise this by saying that it is no longer in their best interest to have the best product. It appears to be sufficient to have a working product they can use to “ride the wave” of popularity of Linux.

I used Linux for a long time, but ultimately I was always frustrated with the (sometimes significant) differences between the distributions, and sometimes the differences between versions of the same distribution. Why was it that an Amiga running AmigaDOS was more consistent with Apache and Samba docs than any particular Linux? Where was Linux sticking all of these config files, and why wasn’t there documentation saying where the stuff was and why?

When I first started using BSD, I fell in love with its consistency, its no-bull attitude towards ports and packages, and its professional and clean feel. Needless to say, I don’t do much Linux anymore.

It may well be due to the people involved. Linus Torvalds is a likeable guy, a smart guy, easily identifiable by a largely computer-illiterate press as an anti-Gates. And he looks the part. Bob Young is loud and flamboyant. Caldera’s the company that sued Microsoft and probably would have won if it hadn’t settled out of court. Richard Stallman torques a lot of people off, but he’s very good at getting himself heard, and the GPL seems designed at least in part to attract attention. The BSD license is more free than the GPL, but while freedom is one of Stallman’s goals, clearly getting attention for his movement is another, and in that regard Stallman succeeds much more than the BSD camp. The BSD license may be too free for its own good.

Yes, there aren’t many “figureheads” for BSD; most of the ones I know of don’t complain about Linux, whereas Linux people often do complain about the BSD folks (the major complaint being the license).

I know Jerry pays more attention to Linux than the BSDs partly because Linux has a bigger audience, but he certainly knows more about Linux than about any other Unix. Very soon after he launched his website, a couple of Linux gurus (most notably Moshe Bar, himself now a Byte columnist) started corresponding with him regularly, and they’ve made Linux a reasonably comfortable place for him, answering his questions and getting him up and going.

So then it should be their responsibility, as Linux advocates, to give Jerry a slightly more complete story, in my opinion.

As for the rest of the press, most of them pay attention to Linux only because of the aforementioned talking heads. I have a degree in journalism from supposedly the best journalism school in the free world, which gives me some insight into how the press works (or doesn’t, as is usually the case). There are computer journalists who get it, but a good deal of them are writing about computers for no reason in particular, and their previous job and their next job are likely to be writing about something else. In journalism, if three sources corroborate something, you can treat it as fact. Microsoft-sympathetic sources are rampant, wherever you are. The journalist probably has a Mac sympathy since there’s a decent chance that’s what he uses. If he uses a Windows PC, he may or may not realize it. He’s probably heard of Unix, but his chances of having three local Unix-sympathetic sources to use consistently are fairly slim. His chances of having three Unix-sympathetic sources who agree enough for him to treat what they say as fact (especially if one of his Microsofties contradicts it) are probably even more slim.

Which furthers my previous point: Jerry’s Linux friends should be more complete in their advocacy.

Media often seems to desire to cater to the lowest common denominator, but it is refreshing to see what happens when it doesn’t; I can’t stand US news on TV, but I’ll willingly watch BBC news, and will often learn more about US news than if I had watched a US news program.

But I think that part of the problem, which is compounded by the above, is that there are too many journalists who are writing about computers, rather than computer people writing about computers.

After all, which is more presumptuous: a journalist who thinks that he/she can enter the technical world of computing and write authoritatively about it, or a computer person who attempts to be a part-time journalist? I’d prefer the latter, even if it doesn’t include all of the accoutrements that come from the writings of a real journalist.

And looking at the movement as a whole, keep in mind that journalists look for stories. Let’s face it: A college student from Finland writing an operating system and giving it away and millions of people thinking it’s better than Windows is a big story. And let’s face it, RMS running around looking like John the Baptist extolling the virtues of something called Free Software is another really good story, though he’d get a lot more press if he’d talk more candidly about the rest of his life, since that might be the hook that gets the story. Can’t you see this one now?

Yes. Both of those stories would seem much more interesting than, “It’s been over three years and counting since a remote hole was found in OpenBSD”, because it’s not sensationalistic, nor is it interesting, nor can someone explain how you might end up running OpenBSD on your appliances (well, you might, but the fact that it’s secure means that it’d be as boring as telling you why your bathtub hasn’t collapsed yet).

Richard Stallman used to keep a bed in his office at the MIT Artificial Intelligence Lab.

He slept there. He used the shower down the hall. He didn’t have a home outside the office. It would have distracted him from his cause: Giving away software.

Stallman founded the Free Software movement in 1983. Regarded by many as the prophet of his movement (and looking the part, thanks to his long, unkempt hair and beard), Stallman is both one of its most highly regarded programmers and perhaps its most outspoken activist, speaking at various functions around the world.

Linux was newsworthy, thanks to the people behind it, way back in 1993 when hardly anyone was using it. Back then, they were the story. Now, they can still be the story, depending on the writer’s approach.

If there are similar stories in the BSD camp, I’m not aware of them. (I can tell you the philosophical differences between OpenBSD,  NetBSD and FreeBSD and I know a little about the BSD directory structure, but that’s where my knowledge runs up against its limits. I’d say I’m more familiar with BSD than the average computer user but that’s not saying much.) But I can tell you my editor would have absolutely eaten this up. After he or she confirmed it wasn’t fiction.

The history is a little dry; the only “juicy” part is where Berkeley had to deal with a lawsuit from AT&T (or Bell Labs; I’m not doing my research here) before they could make their source free.

Nowadays, people are interested because a major layer of Mac OS X is BSD, and is taken from the FreeBSD and NetBSD source trees. Therefore, millions of people who otherwise know nothing about BSD or its history will end up running it when Mac OS X Final comes out in January; lots of people already are running Mac OS X Beta, but chances are good that the people who bought the Beta know about the fact that it’s running on BSD.

And it’s certainly arguable that BSD is much more powerful and robust than Windows 2000. So there’s a story for you. Does that answer any of your question?

Yes; I hope I’ve clarified my issues, too.

Neat site! I’ll have to keep up on it.

Thanks,
John Klos