The Aero Monorail Company of St. Louis

The Aero Monorail was a futuristic monorail train that first hit the market in 1932. Manufactured in St. Louis by the eponymous Aero Monorail Company, it was designed to run suspended above Lionel Standard gauge track, faster than the Standard gauge trains below.

The stands came in two varieties: a pair of free-standing towers, and a series of towers that slipped under Standard gauge track and used the same 42-inch curve diameter. The motor looked like an Erector motor and ran on 6-8 volts, either DC or AC.

E.R. Johnston, the train dealer, the myth, the legend

Something today made me think of Johnston Electric, a legendary, long-gone train store in St. Louis’ Dutchtown neighborhood that sold Lionel, American Flyer, and HO scale trains.

I was in the old Marty’s Model Railroads store in Affton one afternoon several years ago while Marty was going through a box of trains he had bought earlier in the day. He found some manuals, catalogs, and other paperwork, which he set aside. Then he pulled out an old newspaper page. “I wonder why he saved that?” he asked. He set the paper down, then something caught his eye. “Oh, that’s why,” he said, and pointed at an ad on the page.

An ad for E. R. Johnston from 1948

“Johnston’s,” it read at the bottom. “3118 Chippewa Street.”

“I spent many, many hours at that place when I was younger,” Marty said.

The "good enough" PC

PC World has a treatise on “good enough” computing. This isn’t actually a new trend, but it’s never stuck around as long as it has this time. Jerry Pournelle used to describe cheap CPUs from Cyrix and IDT in the late 1990s as “good enough.” Running at 166 and 200 MHz, they ran Windows 95, NT4, and Office 97 just fine. They weren’t good gaming CPUs, but for everything else they were great, and you could build a computer with one of them and save $100 or more over using a comparable Intel CPU.

Trouble was, the mainstream moved. Intel knocked off all the upstarts by starting a megahertz war, and AMD came back from a near-death experience to compete. The requirements to run Windows increased nearly as rapidly, and it wasn’t all that long before 900 MHz was pretty much the bare minimum to run Windows comfortably.

But chips kept getting cheaper, and today you can buy a 2 GHz CPU for pretty close to what a Cyrix or WinChip CPU cost. But you get more than 10 times the power for that money. And Windows XP runs perfectly comfortably on a 2 GHz CPU, whether it’s a new Intel Atom or Celeron or a 5-year-old corporate discard. So does Office 2003, which is the very last version of Office that any sane person would want to use.*

*Besides being the evil spawn of Windows Vista and Microsoft Bob, Office 2007 also crashes more often than Windows 3.0 did. The only way I can go a week without losing work from Office 2007 crashing is to go on vacation.

The PC World author claims that Linux and Open Office running on Intel Atom CPUs will be the undoing of Microsoft. I think that’s a bit of a stretch. Netbooks running Linux got returned to the vendor a lot. I suspect the biggest reason is that buyers couldn’t figure out how to get their USB mobile broadband cards–I’m talking the stuff that cellphone vendors offer for 50 bucks a month–working in Linux. That, and they probably couldn’t get Flash working, so they couldn’t see Facebook and other popular sites the way they could on their regular PCs.

Frankly, the two things that keep me from buying a $200 Dell Vostro netbook this weekend are the price of mobile broadband ($50 a month), and my concerns about the reliability of anything sold by Dell in the last 5-6 years. I work with a lot of Dell equipment, and once the warranty goes, their machines do not age gracefully at all. But I think Dell will sell a lot of these units, because the price is absurdly low, they weigh two pounds, and they run anything but 3D games and intensive graphics apps nice and fast. Sure, a dual-core system with its memory maxed out and a solid state disk will outrun it, sometimes even running circles around it, but that system will also cost 10 times as much.

I do think Office 2007 is the best thing that ever happened to Open Office. Open Office’s interface is a lot more familiar and doesn’t hide anything, and while it may not be as fast as Office 2003, it’s certainly faster at most things than Office 2007 is.

Linux has been usable for basic computing for a very long time, but getting it installed and configured remains a challenge at times. A netbook that connects painlessly to the wireless networks in restaurants and to cellphone makers’ mobile broadband cards while running Linux probably stands a chance. Giving some automated, easy means to synchronize application data and web bookmarks between the netbook and a desktop PC would probably help a lot too–something that does the same thing that Activesync does for moving data between Windows PCs and Windows Mobile PDAs. Will these things happen?

But I do think an era of “good enough” is upon us. There was a time when the top-of-the-line PC would be entry level within a year or two, and that’s not really true anymore. The entry-level PC of today is comparable to the mid-range PC of five years ago. For most of my lifetime, basic computing on a five-year-old PC was always painful, no matter how good that PC was when it was new. That’s not the case today.

Graphic designers, video producers, and scientists will always need ever-more powerful systems for their work, so they’ll continue to drive the cutting edge. But everyday computing is stabilizing. I don’t think Intel wants the future of everyday computing to be the cheap Atom CPU, but at this point it may be impossible to avoid it. If Intel decides to quit playing in this space, AMD can design something comparable to replace it in the marketplace. The Geode won’t cut it, but something based on the Athlon XP architecture and built using a modern process certainly would.

And frankly I’m glad about this development. It’s been nice not having to buy a new computer every three years or so.

Replace your Antivirus software with this freebie and regain your performance

Antivirus software is the worst culprit in PC slowdowns. I am not alone in this belief. I don’t suggest going without (not completely) but it’s certainly possible to save lots of money, eliminate subscriptions, eliminate most of the overhead, and still practice (relatively) safe computing while running Windows.

Use Clamwin, the Windows version of ClamAV, and don’t engage in risky behavior (more on that later). Clamwin is free, GPL software, meaning you never have to pay for or renew it. It lacks a realtime scanner, which is the main resource hog for PCs. This may leave you vulnerable to infections, but think about where the majority of infections come from: e-mail, downloads, and drive-by installations. Clamwin comes with hooks into Outlook to scan e-mail attachments for you, and Clamglue is a plugin for Firefox that automatically scans all downloaded files. Of course you’re using Firefox, right? Using a non-Internet Explorer browser is the most effective way to prevent drive-by installations. I don’t use IE on my personal PCs for anything other than running Windows Update.

Realtime protection made lots of sense when the main distribution point for viruses was infected floppies, but those days are long gone. This approach protects you against modern viruses without making your multi-gigahertz computer run like a Pentium-75.

I do suggest periodically scanning your system, something that even antivirus packages with realtime protection do. It makes you wonder how much confidence they have in that resource-hogging realtime protection, doesn’t it? Weekly scans are usually adequate; daily scans are better if you suspect some users of your computer engage in risky behavior.

Risky computer behavior

The last virus that ever hit any computer I was using was LoveLetter, which was way back in May 2000. The only reason I got that one was because I had a client who got infected and she just happened to have me in her address book. I don’t know the last time I got a virus before that.

It’s not because I’m lucky, it’s because I’m careful. There are lots of things I don’t do with my computers.

I stay off filesharing networks. Not everything on your favorite MP3-sharing site is what it claims to be, and there are people who believe that if you’re downloading music without paying them for it, they are entirely justified in doing anything they want to you, such as infecting you with a computer virus.

I don’t open e-mail attachments from strangers, or unexpected e-mail attachments from people I know. For that matter, if I don’t recognize the sender of an e-mail message, I probably won’t open it at all, attachment or no attachment.

I don’t run Internet Explorer if I can possibly avoid it. Internet Explorer’s tight integration into the operating system makes it far too easy for people to run software on your computer if you so much as visit a web page. Google tries to identify web pages that might be trying to do this, but a safer option is to use a different web browser that doesn’t understand ActiveX and doesn’t have ties into your underlying operating system.

I don’t install a lot of software downloaded from the Internet. A good rule is not to install any “free” software whatsoever unless it’s licensed under the GNU GPL or another similar open-source license. If you don’t know what that means, learn. Open source means the computer code behind the program is freely available and outside programmers can examine it. If a program distributed that way does anything malicious, someone’s going to figure it out really fast. If I’m going to download and install something that isn’t open source, I only do so when somebody I trust (be it a trusted colleague, a magazine columnist, etc.) recommends it.

I don’t rely on software firewalls. I have a separate cable/DSL router that acts as a firewall and sits between my computers and the Internet. So when the random virus comes around looking for a computer to infect, my firewall doesn’t even speak their language (it doesn’t run Windows and doesn’t have an Intel or AMD processor inside), so the potential infection just bounces right off.

Use a web-based e-mail service instead of a program like Outlook or Outlook Express if you can. If you use something like Yahoo Mail or Hotmail, that company’s servers scan your incoming and outgoing e-mail for viruses, so if someone sends a virus to your Yahoo account, you won’t get it. Does your ISP scan your e-mail for you? If you don’t know, you probably should consider getting your e-mail from someone else. Your antivirus should catch it, of course, but it never hurts to have someone else looking out for you too.

If you avoid these practices, you can join me in throwing out your commercial, for-pay antivirus software and reclaim a lot of computer performance too.

I still leave my PCs powered on

There’s advice flying around the ‘net today about how much energy we save by shutting PCs off when they’re not in use.

Having widely dispensed the advice to leave PCs on all the time (though I’ve been saying for 15 years to turn monitors off), let me be the contrarian and talk about the counterpoint.

The issue is the amount of energy an idle PC wastes doing nothing. And that’s the main reason I’ve always recommended turning monitors off–monitors use a lot of energy and give off a lot of heat, and there’s no particular advantage to leaving them on either. Picture tubes degrade rapidly if they’re left on all the time–this is why every monitor in a used computer store requires you to turn the brightness and contrast all the way up for the display to be readable. Turning the monitor off and on repeatedly isn’t good for it either, but it saves the picture tube.

Now, on to the PC. My PC on its own consumes less electricity than the light bulbs in the room it sits in. Energy costs are going up, but that’s still only a few dollars a year it’s burning. In the 15 years I’ve been leaving computers on all the time, I’ve had a very small number of hardware failures–I’ve lost maybe four hard drives, and one or two power supplies. And I own a lot more computers than the average person. That averages out to one repair every two years on a house full of computers. If I were having to replace a hard drive every year, I’d be spending more money.
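If you want to check that math against your own setup, the arithmetic is simple enough to sketch. The wattage and electric rate below are placeholders, not measurements from my machines, so plug in your own numbers:

```python
def annual_cost_dollars(idle_watts, dollars_per_kwh, hours_per_year=24 * 365):
    """Rough electricity cost of leaving a PC powered on all year."""
    kilowatt_hours = idle_watts * hours_per_year / 1000  # watt-hours -> kWh
    return kilowatt_hours * dollars_per_kwh

# Placeholder figures: a PC idling at 100 watts, power at 8 cents/kWh.
cost = annual_cost_dollars(100, 0.08)
```

The result scales linearly, so halving the idle draw halves the bill–which is exactly why skipping the hot 3D video card pays off.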

Aside from the money, how much energy am I saving by not having to replace lots of parts every year? Isn’t the increased lifespan of my computers worth something?

That’s not the only issue, of course. A computer generates some heat, and in the summer you have to get rid of that heat. But rather than turning PCs off all the time, I prefer to minimize their power consumption. If I don’t need the hottest new 3D video card (which I don’t), then I don’t use it. And the majority of my CPUs are in the gigahertz neighborhood. They do everything I need. So I save energy that way. I get energy savings elsewhere (by using compact fluorescent bulbs, for example), so to me, leaving the PCs on makes sense.

In the winter, of course, the heat given off by PCs is a nice benefit. The more heat my computer gives off, the less work my furnace has to do.

I think some common sense is in order. I turn my monitors off when they’re not in use (though LCDs use little power, I recommend shutting them off too in order to conserve the backlight). The PCs I use every day stay powered on. PCs that I only use occasionally–say, once or twice a week–get powered down when I am finished with them for the day. Admittedly I’m more likely to leave a little-used PC powered on in the winter or summer.

Intel inside the Mac–no more question mark

OK, it’s official. Intel has conquered one of the last holdouts: Soon you’ll be able to buy a Pentium-powered Mac.

Of course there are lots of questions now.

First of all, Apple having problems with its CPU suppliers is nothing new. Apple’s first CPU supplier was a small firm called MOS Technology. You’ve probably never heard of it, but MOS was a subsidiary of a company you may have heard of: Commodore. Commodore, of course, was one of two other companies to release a ready-built home computer at about the same time Apple did. The problem was that the Commodore and Apple computers had the same CPU, so Commodore could undercut Apple’s price. And it did. Commodore president Jack Tramiel was an Auschwitz survivor, and he pretty much assumed his competitors were going to treat him the same way the Nazis did, so he never cut them any breaks either. At least not intentionally.

When other companies released licensed versions of MOS’ 6502 processor, Apple was the biggest customer. Rumor had it that Commodore was hoarding 6502s.

When Motorola released its legendary 68000 CPU, Apple was one of the first companies to sign up, and the first two commercially successful computers to use the m68K were made by Apple. And life was good. Apple wasn’t Motorola’s only customer, but it was one of the biggest. Life was good for the better part of a decade, until Intel finally managed to out-muscle the performance of the Motorola 68040. So Apple conspired with Motorola and IBM to come up with something better, and the result was the PowerPC. And life was good again. The PowerPC wasn’t the best chip on the market, but of the two architectures you could buy at every strip mall on the continent, it was clearly the better one.

Over time Apple’s relationship with Motorola cooled, and the relationship with IBM was off again and on again. Intel meanwhile kept trotting out bigger and bigger sledgehammers, and by brute force alone was able to out-muscle the PowerPC. Steve Jobs got creative, but eventually he just ran out of tricks. Switching to Intel in 2006 may or may not be the best option, but it’s just as easy to do now as it’s ever going to be.

So, now there’s the question of whether this will hurt Microsoft or Linux or both. The answer is yes. The real question isn’t whether it will hurt, but how much. As soon as Microsoft loses one sale, it’s hurt. The same goes for Red Hat.

To me, the question hinges on how attached Apple is to its hardware business. Steve Jobs has only said that OS X has been running on Intel in the labs for years. I have never heard him mention whether the hardware was a standard PC clone motherboard, or something of Apple’s design. I suspect he’s avoiding the question.

It would be possible to make OS X run on Apple hardware and only Apple hardware, even if the CPU is a standard Pentium 4 just like Dell uses. And at least at the outset, I expect Apple will do that. Apple may only have 3-5 percent of the market, but it’s 3-5 percent of a really big pie. The company is profitable.

It would also be possible to let Windows run on this hardware. That may be a good idea. Apple still has something to offer that nobody else does: The slick, easy to use and stable OS X, but on top of that, you can boot into Windows to play games or whatever. It makes Apple hardware worth paying a premium to get.

If Apple chooses to let OS X run on anything and everything, it hurts Linux and Windows more, but it probably hurts Apple too. There’s a lot of hardware out there, and a lot of it isn’t any good. Apple probably doesn’t want that support nightmare.

I think this will narrow the gigahertz gap and, consequently, the speed gap. I think it will help Apple’s market share, especially if they allow Windows to run on the hardware. I don’t see it having a devastating effect on any other operating system, though. It will hurt marginal PC manufacturers before it hurts software companies.


This article on Windows installation at Firing Squad preaches all the same things I was preaching nearly six years ago in my Windows 9x book.

Where to find the stuff has almost all changed, and most of the old utilities don’t work anymore, but these are exactly the same concepts I yammered on and on about. Funny, I’ve been told system optimization is a waste of time…

Incidentally, this is the second article on optimization that I’ve seen in less than a month. The other one read an awful lot like a Windows XP translation of an article I published in Computer Shopper UK back in 2000, which in turn was a shortened version of one of the chapters in the same book.

So I guess people don’t just throw their 2-gigahertz computers away and buy new ones when they start to seem slow?

It really makes me wonder what would have happened if, after the book received a gushing review in Canada and was perpetually sold out in stores up north, those 3,000 copies of the book that languished in a warehouse in Tennessee had made their way into those stores.

That’s OK. That was five years ago, nothing can change it, and I really don’t have any desire to be a computer author anymore. I find the only way to really know a lot about computers is to work with them for 40-60 hours a week in a production environment. Labs don’t cut it–you can never underestimate the effect of 1,000+ users hammering on what you built. Never. And if you spend those hours working, that doesn’t leave enough time to write books and release them in a timely fashion.

So rather than write mediocre computer books or send myself to an early grave by working full time in addition to writing for 30-45 hours a week, I’d rather have a life, make a decent living, and not write computer books.

Things to look for in a flatbed scanner

David Huff asked today about scanners, and I started to reply as a comment but decided it was too long-winded and ought to be a separate discussion.

So, how does one cut through the hype and get a really good scanner for not a lot of money?

The short answer to David’s question is that I like the Canon Canoscan LIDE series. Both my mom and my girlfriend have the LIDE 80 and have been happy with it.

For the long answer to the question, let’s step through several things that I look for when choosing a scanner.

Manufacturer. There are lots of makers of cheap and cheerful scanners out there. Chances are there are some cheap and nasty ones too. Today’s cheap and nasty scanners will be a lot better than 1995’s crop of cheap and nasties, since the PC parallel port was a huge source of incompatibilities back then, but I want a scanner from a company with some experience making scanners and with good chances of still being around in five years.

Driver support. Much is made of this issue, but past track record isn’t much of an indicator of future results. HP and Umax infamously began charging for updated drivers, for example. But at least I could get a driver from HP or Umax, even if it cost money. My Acer scanner is forever tethered to a Windows 98 box because I can’t get a working driver for Windows 2000 or XP for it.

Umax used to have a stellar track record for providing scanner drivers, which was why I started buying and recommending them several years ago. I don’t know what their current policy is but I know some people have sworn them off because they have charged for drivers, at least for some scanners, in the recent past. But you can get newer drivers, in many cases, from Umax UK.

But that’s why I like to stick with someone like Canon, HP, Umax, or Epson, who’ve been making scanners for several years and are likely to continue doing so. Even if I have to pay for a driver, I’d rather pay for one than not be able to get one. Keep in mind that you’ll be running Windows XP until at least 2006 anyway.

Optical resolution. Resolution is overrated, like megahertz. It’s what everyone plays up. It’s also a source of confusion. Sometimes manufacturers play up interpolated resolution or some such nonsense. This is where the scanner fakes it. It’s nice to have, but there are better ways to artificially increase resolution if that’s what you’re seeking.

Look for hardware or optical resolution. Ignore interpolated resolution.

Back to that overrated comment… Few of us need more than 1200dpi optical resolution. For one thing, not so long ago, nobody had enough memory to hold a decent-sized 4800dpi image in memory in order to edit it. If you’re scanning images to put them on the Web, remember, computer screen resolution ranges from 75 to 96dpi, generally speaking. Anything more than that just slows download speed. For printing, higher resolution is useful, but there’s little to no point in your scanner having a higher resolution than your printer.

I just did a search, and while I was able to find inkjet printers with a horizontal resolution of up to 5760dpi, I found exactly one printer with a vertical resolution of 2400dpi. The overwhelming majority were 1200dpi max, going up and down.

Your inkjet printer and your glossy magazines use different measurements for printing, but a true 1200dpi is going to be comparable to National Geographic quality. If your photography isn’t up to National Geographic standards, megaresolution isn’t going to help it.

Bit depth. If resolution is the most overrated factor, bit depth is the most underrated. Generally speaking, the better the bit depth, the more accurate the color recognition. While even 24 bits gives more colors than the human eye can distinguish, there is a noticeable difference in accuracy between scans done on a 24-bit scanner and scans from a 36-bit scanner.

If you have to choose between resolution and bit depth, go for bit depth every time. Even if you intend to print magazines out of your spare bedroom or basement. After all, if the color on the photograph is off, nobody is going to pay any attention to how clear it is.

Size and weight. Some flatbed scanners are smaller and lighter than a laptop. If they can draw their power from the USB port, so much the better. You might not plan to take one with you, but it’s funny how unplanned things seem to happen.

Intel scraps its 4 GHz P4!

Intel has announced it’s scrapping its 4 GHz P4. That’s a big turnaround.

Intel got where it is today by cranking the megahertz, and then the gigahertz, just as high as it could and as quickly as it could, hoping competitors wouldn’t be able to keep up, and trumpeting clock speed as the only thing that really mattered.

When it designed the P4, it extended its pipeline to ridiculously long lengths, allowing it to pump up the clock rate, but the efficiency was so low that Intel had to be ashamed of it. The last of the P3s cleaned the P4’s clock. As did a number of AMD’s chips.

Now Intel is having difficulty reaching 4 GHz. AMD still has room to ramp up its speeds, but it hasn’t even reached 3 GHz yet; it’s been taking other approaches to increasing speed.

Now Intel’s taking yet another page from AMD’s book. First, Intel clones AMD’s 64-bit instruction set, next, Intel replaces clock speed with model numbers, and now it throws in the towel on the gigahertz race.

It’ll be interesting to see how Intel’s marketing adjusts. And while I don’t expect AMD to topple them any time soon, if ever, it’ll be interesting to see if AMD manages to turn this into another opportunity.
