Tag Archives: hertz

E.R. Johnston, the train dealer, the myth, the legend

Something today made me think of Johnston Electric, a legendary, long-gone train store in St. Louis’ Dutchtown neighborhood that sold Lionel, American Flyer, and HO scale trains.

I was in the old Marty’s Model Railroads store in Affton one afternoon several years ago while Marty was going through a box of trains he had bought earlier in the day. He found some manuals, catalogs, and other paperwork, which he set aside. Then he pulled out an old newspaper page. “I wonder why he saved that?” he asked. He set the paper down, then something caught his eye. “Oh, that’s why,” he said, and pointed at an ad on the page.

“Johnston’s,” it read at the bottom. “3118 Chippewa Street.”

“I spent many, many hours at that place when I was younger,” Marty said.

The "good enough" PC

PC World has a treatise on “good enough” computing. The idea isn’t actually new, but the bar for “good enough” has never stood still for as long as it has now.

Jerry Pournelle used to describe cheap CPUs from Cyrix and IDT in the late 1990s as “good enough.” Running at 166 and 200 MHz, they ran Windows 95, NT4, and Office 97 just fine. They weren’t good gaming CPUs, but for everything else they were great, and you could build a computer around one of them and save $100 or more over using a comparable Intel CPU.

Trouble was, the mainstream moved. Intel knocked off all the upstarts by starting a megahertz war, and AMD came back from a near-death experience to compete. The requirements to run Windows increased nearly as rapidly, and it wasn’t all that long before 900 MHz was pretty much the bare minimum to run Windows comfortably.

But chips kept getting cheaper, and today you can buy a 2 GHz CPU for pretty close to what a Cyrix or WinChip CPU cost. But you get more than 10 times the power for that money. And Windows XP runs perfectly comfortably on a 2 GHz CPU, whether it’s a new Intel Atom or Celeron or a 5-year-old corporate discard. So does Office 2003, which is the very last version of Office that any sane person would want to use.*

*Besides being the evil spawn of Windows Vista and Microsoft Bob, Office 2007 also crashes more often than Windows 3.0 did. The only way I can go a week without losing work from Office 2007 crashing is to go on vacation.

The PC World author claims that Linux and Open Office running on Intel Atom CPUs will be the undoing of Microsoft. I think that’s a bit of a stretch. Netbooks running Linux got returned to the vendor a lot. I suspect the biggest reason is that buyers couldn’t figure out how to get their USB mobile broadband cards–the stuff cellphone vendors offer for 50 bucks a month–working in Linux. That, and they probably couldn’t get Flash working, so they couldn’t see Facebook and other popular sites the way they could on their regular PCs.

Frankly, the two things that keep me from buying a $200 Dell Vostro netbook this weekend are the price of mobile broadband ($50 a month), and my concerns about the reliability of anything sold by Dell in the last 5-6 years. I work with a lot of Dell equipment, and once the warranty goes, their machines do not age gracefully at all. But I think Dell will sell a lot of these units, because the price is absurdly low, they weigh two pounds, and they run anything but 3D games and intensive graphics apps nice and fast. Sure, a dual-core system with its memory maxed out and a solid state disk will outrun it, sometimes even running circles around it, but that system will also cost 10 times as much.

I do think Office 2007 is the best thing that ever happened to Open Office. Open Office’s interface is a lot more familiar and doesn’t hide anything, and while it may not be as fast as Office 2003, it’s certainly faster at most things than Office 2007 is.

Linux has been usable for basic computing for a very long time, but getting it installed and configured remains a challenge at times. A netbook that connects painlessly to the wireless networks in restaurants and to cellphone makers’ mobile broadband cards while running Linux probably stands a chance. Giving some automated, easy means to synchronize application data and web bookmarks between the netbook and a desktop PC would probably help a lot too–something that does the same thing that Activesync does for moving data between Windows PCs and Windows Mobile PDAs. Will these things happen?

But I do think an era of “good enough” is upon us. There was a time when the top-of-the-line PC would be entry level within a year or two, and that’s not really true anymore. The entry-level PC of today is comparable to the mid-range PC of five years ago. For most of my lifetime, basic computing on a five-year-old PC was always painful, no matter how good that PC was when it was new. That’s not the case today.

Graphic designers, video producers, and scientists will always need ever-more powerful systems for their work, so they’ll continue to drive the cutting edge. But everyday computing is stabilizing. I don’t think Intel wants the future of everyday computing to be the cheap Atom CPU, but at this point it may be impossible to avoid it. If Intel decides to quit playing in this space, AMD can design something comparable to replace it in the marketplace. The Geode won’t cut it, but something based on the Athlon XP architecture and built using a modern process certainly would.

And frankly I’m glad about this development. It’s been nice not having to buy a new computer every three years or so.

The Christmas train

I saw a story on one of my train boards today that illustrates just how much the world has changed since 1923.

This story came from the 1950 book Messrs. Ives of Bridgeport, by Louis H. Hertz.

In 1923, the biggest name in toys wasn’t Mattel or Hasbro, but Ives, a company based in Bridgeport, Connecticut. The company was so huge that letters addressed simply to “Mr. Ives, USA” were properly delivered.

In September, the owner of the company, Harry Ives, put his oldest daughter in charge of handling the volumes of mail the company received every day. He wanted her to find the special requests.

One day, she noticed a grimy, heavier-than-usual envelope. Rather than the usual handwritten request for a catalog with a dime inside, she found a request accompanied by ten dull pennies. It came from a newsboy, who explained that his father was dead and his mother was struggling. He’d saved up to buy an Ives catalog for his younger brother for Christmas. He would have liked to buy a train set but couldn’t afford one; at least if he gave his brother a catalog, the boy could look at pictures of nice trains.

When Harry Ives saw the letter, he sent the boy a train set, along with the catalog he requested.

It seems like acts of kindness like this used to be more common. I told my wife this story, and she said that people aren’t as honest anymore. She reminded me of the time earlier this year that I sent an inexpensive Marx locomotive to someone who claimed to be a disabled Gulf War II veteran who was having trouble getting his trains running. The truth was he was running a scam, getting lots of people to feel sorry for him and send him trains, and he ended up selling all of them on Craigslist and eBay.

It’s easier to be kind when you believe people are honest.

Replace your Antivirus software with this freebie and regain your performance

Antivirus software is the worst culprit in PC slowdowns, and I am not alone in this belief. I don’t suggest going entirely without protection, but it’s certainly possible to save lots of money, eliminate subscriptions, eliminate most of the overhead, and still practice (relatively) safe computing while running Windows.

Use Clamwin, the Windows version of ClamAV, and don’t engage in risky behavior (more on that later).

Clamwin is free, GPL software, meaning you never have to pay for or renew it. It lacks a realtime scanner, which is the main resource hog for PCs. This may leave you vulnerable to infections, but think about where the majority of infections come from: e-mail, downloads, and drive-by installations. Clamwin comes with hooks into Outlook to scan e-mail attachments for you, and Clamglue is a plugin for Firefox that automatically scans all downloaded files. Of course you’re using Firefox, right? Using a non-Internet Explorer browser is the most effective way to prevent drive-by installations. I don’t use IE on my personal PCs for anything other than running Windows Update.

Realtime protection made lots of sense when the main distribution point for viruses was infected floppies, but those days are long gone. This approach protects you against modern viruses without making your multi-gigahertz computer run like a Pentium-75.

I do suggest periodically scanning your system, something that even antivirus packages with realtime protection do. It makes you wonder how much confidence they have in that resource-hogging realtime protection, doesn’t it? Weekly scans are usually adequate; daily scans are better if you suspect some users of your computer engage in risky behavior.
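If you want to automate those scans, here’s a minimal sketch of the idea. The install path, scan target, and log location are assumptions–adjust them for your machine–and it relies on clamscan’s documented behavior of exiting 0 when nothing is found and 1 when something is:

```python
# Minimal weekly-scan sketch for ClamWin. Point Windows Task Scheduler
# at this script. Paths below are assumptions; adjust for your install.
import subprocess
from datetime import datetime

CLAMSCAN = r"C:\Program Files\ClamWin\bin\clamscan.exe"  # assumed default path
TARGET = r"C:\Documents and Settings"                    # folder to scan
LOG = r"C:\clamwin-weekly.log"                           # where results accumulate

def weekly_scan():
    # -r scans recursively, -i reports only infected files,
    # --log appends the results to a file for later review.
    result = subprocess.run(
        [CLAMSCAN, "-r", "-i", "--log=" + LOG, TARGET],
        capture_output=True,
        text=True,
    )
    # clamscan exits 0 when clean, 1 when it found something
    status = "clean" if result.returncode == 0 else "CHECK THE LOG"
    print(datetime.now(), status)

if __name__ == "__main__":
    weekly_scan()
```

Schedule it for the middle of the night once a week, and bump it to daily if the computer’s users worry you.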

Risky computer behavior

The last virus that ever hit any computer I was using was LoveLetter, way back in May 2000. The only reason I got that one was that a client got infected and she happened to have me in her address book. I don’t remember the last time I got a virus before that.

It’s not because I’m lucky, it’s because I’m careful. There are lots of things I don’t do with my computers.

I stay off filesharing networks. Not everything on your favorite MP3-sharing site is what it claims to be, and there are people who believe that if you’re downloading music without paying them for it, they are entirely justified in doing anything they want to you, such as infecting you with a computer virus.

I don’t open e-mail attachments from strangers, or unexpected e-mail attachments from people I know. For that matter, if I don’t recognize the sender of an e-mail message, I probably won’t open it at all, attachment or no attachment.

I don’t run Internet Explorer if I can possibly avoid it. Internet Explorer’s tight integration into the operating system makes it far too easy for people to run software on your computer if you so much as visit a web page. Google tries to identify web pages that might be trying to do this, but a safer option is to use a different web browser that doesn’t understand ActiveX and doesn’t have ties into your underlying operating system.

I don’t install a lot of software downloaded from the Internet. A good rule is not to install any “free” software whatsoever unless it’s licensed under the GNU GPL or another similar open-source license. If you don’t know what that means, learn. Open source means the computer code behind the program is freely available and outside programmers can examine it. If a program distributed that way does anything malicious, someone’s going to figure it out really fast. If I’m going to download and install something that isn’t open source, I only do so when somebody I trust (be it a trusted colleague, a magazine columnist, etc.) recommends it.

I don’t rely on software firewalls. I have a separate cable/DSL router that acts as a firewall and sits between my computers and the Internet. So when the random virus comes around looking for a computer to infect, my firewall doesn’t even speak their language (it doesn’t run Windows and doesn’t have an Intel or AMD processor inside), so the potential infection just bounces right off.

Use a web-based e-mail service instead of a program like Outlook or Outlook Express if you can. If you use something like Yahoo Mail or Hotmail, that company’s servers scan your incoming and outgoing e-mail for viruses, so if someone sends a virus to your Yahoo account, you won’t get it. Does your ISP scan your e-mail for you? If you don’t know, you probably should consider getting your e-mail from someone else. Your antivirus should catch it, of course, but it never hurts to have someone else looking out for you too.

If you avoid these risky behaviors, you can join me in throwing out your commercial, for-pay antivirus software and reclaim a lot of computer performance too.

I still leave my PCs powered on

There’s advice flying around the ‘net today about how much energy we save by shutting PCs off when they’re not in use.

Having widely dispensed the advice to leave PCs on all the time (though I’ve been saying for 15 years to turn monitors off), let me be the contrarian and talk about the counterpoint.

The issue is the amount of energy an idle PC wastes doing nothing. And that’s the main reason I’ve always recommended turning monitors off–monitors use a lot of energy and give off a lot of heat, and there’s no particular advantage to leaving them on. Picture tubes degrade rapidly if they’re left on all the time–this is why every monitor in a used computer store requires you to turn the brightness and contrast all the way up for the display to be readable. Turning the monitor off and on repeatedly isn’t good for it either, but it saves the picture tube.

Now, on to the PC. My PC on its own consumes less electricity than the light bulbs in the room it sits in. Energy costs are going up, but that’s still only a few dollars a year. In the 15 years I’ve been leaving computers on all the time, I’ve had a very small number of hardware failures–I’ve lost maybe four hard drives and one or two power supplies. And I own a lot more computers than the average person. That averages out to one repair every two years or so on a house full of computers. If I had to replace a hard drive every year, I’d be spending more money.
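The arithmetic is easy to run for your own setup. Here’s a sketch with assumed numbers–the idle wattage and electric rate are placeholders, so substitute what your own PC draws and what your utility actually charges:

```python
# Back-of-the-envelope cost of leaving a PC on around the clock.
# IDLE_WATTS and RATE_PER_KWH are assumptions; plug in your own.
IDLE_WATTS = 60          # a modest PC's idle draw, monitor off
RATE_PER_KWH = 0.08      # dollars per kilowatt-hour
HOURS_PER_YEAR = 24 * 365

kwh_per_year = IDLE_WATTS / 1000 * HOURS_PER_YEAR
cost_per_year = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.0f}/year")
# Weigh that number against the price of the drives and power
# supplies you'd replace more often by power-cycling every day.
```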

Aside from the money, how much energy am I saving by not having to replace lots of parts every year? Isn’t the increased lifespan of my computers worth something?

That’s not the only issue, of course. A computer generates some heat, and in the summer you have to get rid of that heat. But rather than turning PCs off all the time, I prefer to minimize their power consumption. If I don’t need the hottest new 3D video card (which I don’t), then I don’t use it. And the majority of my CPUs are in the gigahertz neighborhood. They do everything I need. So I save energy that way. I get energy savings elsewhere (by using compact fluorescent bulbs, for example), so to me, leaving the PCs on makes sense.

In the winter, of course, the heat given off by PCs is a nice benefit. The more heat my computer gives off, the less work my furnace has to do.

I think some common sense is in order. I turn my monitors off when they’re not in use (though LCDs use little power, I recommend shutting them off too in order to conserve the backlight). The PCs I use every day stay powered on. PCs that I only use occasionally–say, once or twice a week–get powered down when I am finished with them for the day. Admittedly I’m more likely to leave a little-used PC powered on in the winter or summer.

Intel inside the Mac–no more question mark

OK, it’s official. Intel has conquered one of the last holdouts: Soon you’ll be able to buy a Pentium-powered Mac.

Of course there are lots of questions now.

First of all, Apple having problems with its CPU suppliers is nothing new. Apple’s first CPU supplier was a small firm called MOS Technology. You’ve probably never heard of it, but MOS was a subsidiary of a company you may have heard of: Commodore. Commodore, of course, was one of two other companies to release a ready-built home computer at about the same time Apple did. The problem was that the Commodore and Apple computers had the same CPU, so Commodore could undercut Apple’s price. And it did. Commodore president Jack Tramiel was an Auschwitz survivor, and Tramiel pretty much assumed his competitors were going to treat him the same way the Nazis did, so he never cut them any breaks either. At least not intentionally.

When other companies released licensed versions of MOS’ 6502 processor, Apple was the biggest customer. Rumor had it that Commodore was hoarding 6502s.

When Motorola released its legendary 68000 CPU, Apple was one of the first companies to sign up, and the first two commercially successful computers to use the m68K were made by Apple. And life was good. Apple wasn’t Motorola’s only customer, but it was one of the biggest. Life was good for the better part of a decade, until Intel finally managed to out-muscle the performance of the Motorola 68040. So Apple conspired with Motorola and IBM to come up with something better, and the result was the PowerPC. And life was good again. The PowerPC wasn’t the best chip on the market, but of the two architectures you could buy at every strip mall on the continent, it was clearly the better.

Over time Apple’s relationship with Motorola cooled, and the relationship with IBM was off again and on again. Intel meanwhile kept trotting out bigger and bigger sledgehammers, and by brute force alone was able to out-muscle the PowerPC. Steve Jobs got creative, but eventually he just ran out of tricks. Switching to Intel in 2006 may or may not be the best option, but it’s just as easy to do now as it’s ever going to be.

So, now there’s the question of whether this will hurt Microsoft or Linux or both. The answer is yes. The real question isn’t whether it will hurt, but how much. As soon as Microsoft loses one sale, it’s hurt. The same goes for Red Hat.

To me, the question hinges on how attached Apple is to its hardware business. Steve Jobs has only said that OS X has been running on Intel in the labs for years. I have never heard him mention whether the hardware was a standard PC clone motherboard, or something of Apple’s design. I suspect he’s avoiding the question.

It would be possible to make OS X run on Apple hardware and only Apple hardware, even if the CPU is a standard Pentium 4 just like Dell uses. And at least at the outset, I expect Apple will do that. Apple may only have 3-5 percent of the market, but it’s 3-5 percent of a really big pie. The company is profitable.

It would also be possible to let Windows run on this hardware. That may be a good idea. Apple still has something to offer that nobody else does: The slick, easy to use and stable OS X, but on top of that, you can boot into Windows to play games or whatever. It makes Apple hardware worth paying a premium to get.

If Apple chooses to let OS X run on anything and everything, it hurts Linux and Windows more, but it probably hurts Apple too. There’s a lot of hardware out there, and a lot of it isn’t any good. Apple probably doesn’t want that support nightmare.

I think this will narrow the gigahertz gap and, consequently, the speed gap. I think it will help Apple’s marketshare, especially if they allow Windows to run on the hardware. I don’t see it having a devastating effect on any other operating system, though. It will hurt marginal PC manufacturers before it hurts software companies.

Vindicated?

This article on Windows installation at Firing Squad preaches all the same things I was preaching nearly six years ago in my Windows 9x book.

Where to find the stuff has almost all changed, and most of the old utilities don’t work anymore, but these are exactly the same concepts I yammered on and on about. Funny, I’ve been told system optimization is a waste of time…

Incidentally, this is the second article on optimization that I’ve seen in less than a month. The other one read an awful lot like a Windows XP translation of an article I published in Computer Shopper UK back in 2000, which in turn was a shortened version of one of the chapters in the same book.

So I guess people don’t just throw their 2-gigahertz computers away and buy new ones when they start to seem slow?

It really makes me wonder what would have happened if, after the book received a gushing review in Canada and was perpetually sold out in stores up north, those 3,000 copies of the book that languished in a warehouse in Tennessee had made their way into those stores.

That’s OK. That was five years ago, nothing can change it, and I really don’t have any desire to be a computer author anymore. I find the only way to really know a lot about computers is to work with them for 40-60 hours a week in a production environment. Labs don’t cut it–never underestimate the effect of 1,000+ users hammering on what you built. Never. And if you spend those hours working, that doesn’t leave enough time to write books and release them in a timely fashion.

So rather than write mediocre computer books or send myself to an early grave by working full time in addition to writing for 30-45 hours a week, I’d rather have a life, make a decent living, and not write computer books.

Things to look for in a flatbed scanner

David Huff asked today about scanners, and I started to reply as a comment but decided it was too long-winded and ought to be a separate discussion.

So, how does one cut through the hype and get a really good scanner for not a lot of money?

The short answer to David’s question is that I like the Canon Canoscan LIDE series. Both my mom and my girlfriend have the LIDE 80 and have been happy with it.

For the long answer to the question, let’s step through several things that I look for when choosing a scanner.

Manufacturer. There are lots of makers of cheap and cheerful scanners out there. Chances are there are some cheap and nasty ones too. Today’s cheap and nasty scanners will be a lot better than 1995’s crop of cheap and nasties, if only because the PC parallel port that older scanners connected through was a huge source of incompatibilities. But I want a scanner from a company with some experience making scanners and with good chances of still being around in five years.

Driver support. Much is made of this issue, but past track record isn’t much of an indicator of future results. HP and Umax infamously began charging for updated drivers, for example. But at least I could get a driver from HP or Umax, even if it cost money. My Acer scanner is forever tethered to a Windows 98 box because I can’t get a working driver for Windows 2000 or XP for it.

Umax used to have a stellar track record for providing scanner drivers, which was why I started buying and recommending them several years ago. I don’t know what their current policy is but I know some people have sworn them off because they have charged for drivers, at least for some scanners, in the recent past. But you can get newer drivers, in many cases, from Umax UK.

But that’s why I like to stick with someone like Canon, HP, Umax, or Epson, who’ve been making scanners for several years and are likely to continue doing so. Even if I have to pay for a driver, I’d rather pay for one than not be able to get one. Keep in mind that you’ll be running Windows XP until at least 2006 anyway.

Optical resolution. Resolution is overrated, like megahertz. It’s what everyone plays up. It’s also a source of confusion. Sometimes manufacturers play up interpolated resolution or somesuch nonsense. This is where the scanner fakes it. It’s nice to have, but there are better ways to artificially increase resolution if that’s what you’re seeking.

Look for hardware or optical resolution. Ignore interpolated resolution.

Back to that overrated comment… Few of us need more than 1200dpi optical resolution. For one thing, not so long ago, nobody had enough memory to hold a decent-sized 4800dpi image for editing. If you’re scanning images to put them on the Web, remember that computer screen resolution ranges from 75 to 96dpi, generally speaking. Anything more than that just slows download speed. For printing, higher resolution is useful, but there’s little to no point in your scanner having a higher resolution than your printer.

I just did a search, and while I was able to find inkjet printers with a horizontal resolution of up to 5760dpi, I found exactly one printer with a vertical resolution of 2400dpi. The overwhelming majority topped out at 1200dpi vertically.

Your inkjet printer and your glossy magazines use different measurements for printing, but a true 1200dpi is going to be comparable to National Geographic quality. If your photography isn’t up to National Geographic standards, megaresolution isn’t going to help it.

Bit depth. If resolution is the most overrated factor, bit depth is the most underrated. Generally speaking, the better the bit depth, the more accurate the color recognition. While even 24 bits gives more colors than the human eye can distinguish, there is a noticeable difference in accuracy between scans done on a 24-bit scanner and scans from a 36-bit scanner.

If you have to choose between resolution and bit depth, go for bit depth every time. Even if you intend to print magazines out of your spare bedroom or basement. After all, if the color on the photograph is off, nobody is going to pay any attention to how clear it is.
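To put rough numbers on both points–resolution and bit depth–here’s a quick sketch of how big an uncompressed letter-size scan gets at various settings. The sizes are straight arithmetic, not measurements from any particular scanner:

```python
# Raw (uncompressed) size of a scan: pixels times bytes per pixel.
# An 8.5 x 11 inch page is the example original.
def raw_scan_mb(width_in, height_in, dpi, bits_per_pixel):
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels * bits_per_pixel / 8 / (1024 ** 2)

for dpi in (300, 1200, 4800):
    for bits in (24, 48):  # 24-bit vs. 48-bit color
        mb = raw_scan_mb(8.5, 11, dpi, bits)
        print(f"{dpi:>4} dpi, {bits}-bit: {mb:,.0f} MB")
```

At 300dpi and 24 bits, a page runs about 24 MB; at 4800dpi and 48 bits it balloons past 12,000 MB. That’s why the huge interpolated-resolution numbers are marketing more than anything you’d actually scan at.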

Size and weight. Some flatbed scanners are smaller and lighter than a laptop. If they can draw their power from the USB port, so much the better. You might not plan to take one with you, but it’s funny how unplanned things seem to happen.

Intel scraps its 4 GHz P4!

Intel has announced it’s scrapping its 4 GHz P4. That’s a big turnaround.

Intel got where it is today by cranking the megahertz, and then the gigahertz, just as high as it could and as quickly as it could, hoping competitors wouldn’t be able to keep up, and trumpeting clock speed as the only thing that really mattered.

When it designed the P4, it extended its pipeline to ridiculously long lengths, allowing it to pump up the clock rate, but the efficiency was so low that Intel had to be ashamed of it. The last of the P3s cleaned the P4’s clock. As did a number of AMD’s chips.

Now Intel is having difficulty reaching 4 GHz. AMD still has room to ramp up its clock speeds–it hasn’t even reached 3 GHz yet–but it has been taking other approaches to increasing performance.

Now Intel’s taking yet another page from AMD’s book. First, Intel clones AMD’s 64-bit instruction set, next, Intel replaces clock speed with model numbers, and now it throws in the towel on the gigahertz race.

It’ll be interesting to see how Intel’s marketing adjusts. And while I don’t expect AMD to topple them any time soon, if ever, it’ll be interesting to see if AMD manages to turn this into another opportunity.

VMWare is in Microsoft’s sights

Microsoft has released its Virtual Server product, aimed at VMWare. Price is an aggressive $499.

I have mixed feelings about it.

VMWare is expensive, with a list price about 8 times as much. But I’m still not terribly impressed with Microsoft’s offering.

For one, with VMWare ESX Server, you get everything you need, including a host OS. With Microsoft Virtual Server, you have to provide Windows Server 2003. By the time you do that, Virtual Server is about half the price of VMWare.

I think you can make up the rest of that difference very quickly on TCO. VMWare’s professional server products run on a Linux base that requires about 256 MB of overhead. Ever seen Windows Server 2003 on 256 megs of RAM? The CPU overhead of the VMWare host is also very low. When you size a VMWare server, you can pretty much go on a 1:1 basis: add up the CPU speed and memory of the servers you’re consolidating, buy a server that size, put VMWare on it, and then move your servers to it. They’ll perform as well, if not a little better, since at peak times a busy server can steal some resources from an idle one.

Knowing Microsoft, I’d want to give myself at least half a gig of RAM and at least half a gigahertz of CPU time for system overhead, minimum. Twice that is probably more realistic.
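Here’s that sizing rule as a quick sketch. The server inventory is hypothetical, the VMWare overhead is the 256 MB figure above, and the Virtual Server overhead numbers are my own guesses from the previous paragraph:

```python
# Sizing a consolidation host: sum the CPU and memory of the servers
# being consolidated, then add the host's own overhead.
# The inventory below is made up for illustration.
servers = [
    # (name, MHz, MB of RAM)
    ("nt4-file",  400, 256),
    ("nt4-print", 300, 128),
    ("intranet",  800, 512),
    ("dns-dhcp",  500, 256),
]

def size_host(servers, overhead_mhz, overhead_mb):
    total_mhz = sum(mhz for _, mhz, _ in servers) + overhead_mhz
    total_mb = sum(mb for _, _, mb in servers) + overhead_mb
    return total_mhz, total_mb

# VMWare's Linux base: ~256 MB of overhead, negligible CPU.
# Virtual Server on Windows 2003: budget ~500 MHz and ~512 MB, minimum.
print("VMWare host:         %4d MHz, %4d MB" % size_host(servers, 0, 256))
print("Virtual Server host: %4d MHz, %4d MB" % size_host(servers, 500, 512))
```

Even with the more generous overhead, the consolidated host is a fraction of the hardware those aging boxes occupy.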

Like it or not, Linux is a reality these days. Linux is an outstanding choice for a lot of infrastructure-type servers like DHCP, DNS, Web services, mail services, spam filtering, and others, even if you want to maintain a mixed Linux/Windows environment. While Linux will run on MS Virtual Server’s virtual hardware and it’s only a matter of time before adjustments are made to Linux to make it run even better, there’s no official support for it. So PHBs will be more comfortable running their Linux-based VMs under VMWare than under Virtual Server 2003. (There’s always User-Mode Linux for Linux virtual hosts, but that will certainly be an under-the-radar installation in a lot of shops.)

While there have been a number of vulnerabilities in VMWare’s Linux host this year, the number is still lower than for Windows Server 2003. I’d rather take my virtual host server down once a quarter for patching than once a month.

I wouldn’t put either host OS on a public Internet address though. Either one needs to be protected behind a firewall, with its host IP address on a private network, to protect the host as much as possible. Remember, if the host is compromised, you stand to lose all of the servers on it.

The biggest place where Microsoft gives a price advantage is on the migration of existing servers. Microsoft’s migration tool is still in beta, but it’s free–at least for now. VMWare’s P2V Assistant costs a fortune. I was quoted $2,000 for the software and $8,000 for mandatory training, and that was to migrate 25 servers.

If your goal is to get those NT4 servers whose hardware is rapidly approaching the teenage years onto newer hardware with minimal disruption–every organization has those–then Virtual Server is a no-brainer. Buy a copy of Virtual Server and new, reliable server hardware, migrate those aging machines, and save a fortune on your maintenance contract.

I’m glad to see VMWare get some competition. I’ve found it to be a stable product once it’s set up, but the user interface leaves something to be desired. When I build or change a virtual server, I find myself scratching my head over whether certain options are under “Hardware” or under “Memory and Processors”. So it probably takes me twice as long to set up a virtual server as it ought to, but that’s still less time than it takes to spec and order a server, or, for that matter, to unbox a new physical server when it arrives.

On the other hand, I’ve seen what happens to Microsoft products once they feel like they have no real competition. Notice how quickly new, improved versions of Internet Explorer come out? And while Windows XP mostly works, when it fails, it usually fails spectacularly. And don’t even get me started on Office.

The pricing won’t stay the same either. While the price of hardware has come down, the price of Microsoft software hasn’t come down nearly as quickly, and in some cases has increased. That’s not because Microsoft is inherently ruthless or even evil (that’s another discussion); it’s because that’s what monopolies have to do to keep earnings at the level necessary to keep stockholders and the SEC happy. When you can’t grow your revenues by increasing your market share, you have to grow your revenues by raising prices. Watch Wal-Mart: their behavior over the next couple of decades will closely mirror Microsoft’s. Since they’re in a bigger industry, they move more slowly. But that’s another discussion too.

The industry can’t afford to hand Microsoft another monopoly.

Some people will buy this product just because it’s from Microsoft. Others will buy it just because it’s cheaper. Since VMWare’s been around a good long while and is mature and stable and established as an industry standard, I hope that means it’ll stick around a while too, and come down in price.

But if you had told me 10 years ago that Novell Netware would have single-digit marketshare now, I wouldn’t have believed you. Then again, the market’s different in 2004 than it was in 1994.

I hope it’s different enough.