Why IPO?

I’ve been reading a lot lately about Google and the anticipation surrounding its IPO, and I just can’t help but wonder something.
Why?

What do I mean, why? Just that. Why do an IPO? Why go public? What do they gain?

I remember, what seems like a million years ago, I saw a segment on 60 Minutes about software maker SAS. I vaguely remember SAS because they made a–what am I talking about? THE ONLY–highly regarded C compiler for the Amiga.

But 60 Minutes wasn’t talking about Amigas. They were talking about how in an era when companies are universally cutting benefits, working for SAS continues to be more like living on a resort. Need to see the doctor? The company doctor is down the hall. Free day care for your kids on campus. You’re encouraged to eat lunch with your family. You’re encouraged to work 35 hours a week. All sorts of exercise equipment, including a pool and a track. Massages. We’re talking the kinds of excesses Netscape was infamous for here. Once you work for SAS, you never, ever leave.

When the interviewer told the president he was crazy, he laughed and asked what’s crazy about treating your people well. And he pointed out the company has always been profitable and has never had to lay anyone off. His employees are happy and productive and they do good work. When the interviewer asked why, he said part of it is because there are no investors. He’s accountable only to himself. The investors don’t like the resort-on-campus because that costs money that could be going to dividends. He doesn’t have to worry about that.

Investors don’t think about much of anything but dividends. Except when the time comes to cash out the stock, which is often. Executives need to think long-term, but that’s hard when your main job is to please the investors, who come in with a Las Vegas mentality. And why should you be accountable to investors? Just because they have money doesn’t mean they know anything.

That’s not to say all investors are clueless people who make you question whether mammals really are the highest form of animal. Some companies do just fine in spite of their investors. But how many good companies turned bad once they had vast herds of greedy investors to answer to? Google is cool because it’s so anti-commercial, so unobtrusive, so… Well, have you ever wondered how Google makes its money? I have.

My fear is that the minute after some investor starts asking how Google makes its money, we’ll be seeing X10 popunders when we go there.

Yes, $15 billion is a lot of money. I’m sure Google could come up with lots of cool things to do if it had it. But I remember someone asking once what your soul is worth. Does $15 billion really seem like a lot when compared to the value of Google’s soul?

For some companies, the IPO is the next step on the way to greatness. But for a larger percentage, it seems to be the first step toward mediocrity. If I’d invented Google, I wouldn’t take that chance. Do the IPO after I retire.

But I didn’t invent Google, so I guess that’s not my decision, eh?

What needs to happen for Linux to make it on the desktop

I saw an editorial at Freshmeat that argued that there’s actually too much software for Linux. And you know what? It has a point.
I’m sure some people will be taken aback by that. The number of titles that run under Windows must number into six digits, and it’s hard to walk into a computer store and buy Linux software.

But I agree with his argument, or at least most of it. Back in my Amiga days, the first thing people used to ask me was, “What, do you not like software?” Then I asked why they felt the need to have their choice of 10 different word processors, especially when they’d just pirate Microsoft Word or WordPerfect anyway. (Let’s face it: one reason large numbers of people chose PCs over superior architectures in the early 90s was that they could pirate software from work. Not everyone. Maybe not even the majority. But a lot.) I argued that one competent software title in each category was all I wanted or needed. And for the most part, the Amiga had that, and the software was usually cheaper than the Mac or PC equivalent.

Linux is the new Amiga. Mozilla is a far better Web browser than IE, and OpenOffice provides most of the functionality of Microsoft Office XP–more than most people use, in fact. While it doesn’t always load the most complex MS Office documents correctly, it does a much better job of opening slightly corrupt documents, and most people don’t create very complex documents anyway. But let’s face it: its biggest problem is that it takes an eternity to load no matter how fast your computer is. If it loaded faster, people would be very happy with it.

But there is nothing that provides an equivalent to a simple database like Access or Filemaker. I know, they’re toys, and MySQL is far more powerful. But end users like dumb, brain-dead databases with clicky GUI interfaces on them that they can migrate to once they realize a spreadsheet isn’t intended to do what they’re trying to do with it. Everyone’s first spreadsheet is Excel. Then someday they realize Excel wasn’t intended to do what they’re using it for. But you don’t instantly dive into Oracle. You need something in between, and Linux doesn’t really have anything for that niche.
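To make that in-between niche concrete: what’s missing is a single-table database you can create, fill, and query in a few lines, with no server to administer. A minimal sketch using Python’s stdlib sqlite3 module (an engine that sits in exactly this niche, minus the clicky GUI Access puts on top):

```python
import sqlite3

# An in-memory database; pass a file path instead and it persists on
# disk, much like an Access .mdb file does.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE contacts (name TEXT, phone TEXT)")
con.executemany(
    "INSERT INTO contacts VALUES (?, ?)",
    [("Alice", "555-0100"), ("Bob", "555-0199")],
)

rows = con.execute("SELECT name FROM contacts ORDER BY name").fetchall()
print([name for (name,) in rows])  # → ['Alice', 'Bob']
```

The point of the niche is exactly this: no daemon to start, no accounts to grant, nothing to administer–just a table where the spreadsheet used to be.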

People are constantly asking me about a WYSIWYG HTML editor for Linux as well. I stumbled across one. Its name is GINF. Yes, another stupid recursive-acronym name. GINF stands for “GINF is not Frontpage.” How helpful. What’s wrong with a descriptive name like Webpage-edit?

More importantly, what was the first non-game application that caught your fancy? For most people I know, it was Print Shop, or one of the many knockoffs of Print Shop. People love to give and receive greeting cards, and when they can pick their own fonts and graphics and write their own messages, they love it even more. Not having to drive to the store and fork over $3.95 is just a bonus. Most IT professionals have no use for Print Shop, but Linux’s lack of alternatives in that department is hurting it.

Take a computer with a CPU on the brink of obsolescence, a so-so video chipset, 128 megs of RAM and the smallest hard drive on the market, preload Linux on it along with a fast word processor that works (AbiWord, or OpenOffice Writer, except it’s not fast), a nice e-mail client/PIM (Evolution), a nice Web browser (Mozilla), a Print Shop equivalent (bzzzt!), and a couple of card games (check Freshmeat), and you’d have a computer for the masses.

The masses do not need 385 text editors. Sysadmin types will war over vi and emacs until the end of time; one or two simple text-mode editors as alternatives will suffice, and one or two equivalents of Notepad for X will suffice.

Linux’s day will eventually arrive regardless, if only because Microsoft is learning what every monopolist eventually learns: Predatory pricing stops working once you corner the market. Then you have to raise prices or find new markets. Eventually you run out of worthwhile markets. So in order to sustain growth, you have to raise prices. Microsoft is running out of markets, so it’s going to have to raise prices. Then it will be vulnerable again, just like Apple and CP/M were vulnerable to Microsoft because their offerings cost more than Microsoft was willing to charge. And, as Microsoft showed Netscape, you can’t undercut free.

But that day will arrive sooner if it doesn’t take a week to figure out the name of the Linux equivalent of Notepad because there are 385 icons that vaguely resemble a notepad and most of them have meaningless names.

How IBM and DOS came to dominate the industry

Revisionist historians talk about how MS-DOS standardized computer operating systems and changed the industry. That’s very true. But what they’re ignoring is that there were standards before 1981, and the standards established in 1981 took a number of years to take hold.


Disadvantages of Windows 3.1

Note: I wrote this way back in 2003, so my advice as far as replacing Windows 3.1 is a bit dated, but the strengths and weaknesses remain valid. If you’re thinking of a new computer, please don’t run anything older than Windows 7.

I spotted a search in my log analysis for “disadvantages of windows 3.1,” which I found interesting. I can talk about that.
Someone asked for it, and I aim to please. So let’s head down memory lane.

In all fairness, let’s talk about what’s good about it first. The main thing is that it’ll run–or at least load and execute–on pretty much anything, as long as it’s old. It’s anything but ideal on a 286, but it’ll execute. And on a 386DX, plain old Windows 3.1 is reasonably zippy if you cut down the number of fonts it has, only load a few applications, and install 16 MB of RAM in it. On a 486 or a low-end Pentium, it’s plenty fast.

Windows 3.1 freeware doesn’t have much in the way of strings attached–no need to worry about spyware. That’s a good thing.

Fine. Now for the hatchet job. To be completely honest, I didn’t like Windows 3.1 in 1993 and 1994 when it was what everyone was using. I ran it for a few months and then went out and bought OS/2 and never looked back. So you’re getting a perspective from someone who’s been willing for a long, long time to run anything other than Windows 3.1. But I’ll do my best to be fair.

You may have trouble running it on newer hardware. Let’s face it, it came on the market 10 years ago and not many people use it anymore. There’s not a lot of demand for drivers, so it can be hard to find a modern video card with Windows 3.1 drivers. And not only does Windows 3.1 have spotty compatibility with new hardware, it’s very limited in its ability to take advantage of anything made since 1995 or so.

More importantly, modern operating systems give full pre-emptive multitasking, or in the case of Windows 95/98/ME, at least something that vaguely resembles it. Under pre-emptive multitasking, the OS decides what applications get CPU time and how much. In Win3.1’s cooperative multitasking, the apps just have a knock-down, drag-out fight for CPU time. If you send an application to the background, it’ll get some work done, but not as much as it would under a newer OS.
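The difference is easy to sketch. A cooperative scheduler can only run a task until the task volunteers to give the CPU back, so one greedy app starves everyone else. Here’s a toy round-robin using Python generators–purely illustrative, not how Windows 3.1 was actually implemented:

```python
log = []

def task(name, steps):
    for i in range(steps):
        log.append(f"{name}{i}")
        yield  # voluntary yield–like a Win3.1 app returning to its message loop

def run(tasks):
    # Cooperative round-robin: each task runs until *it* yields.
    # A task that loops forever without yielding would hang everything,
    # which is exactly the Win3.1 failure mode.
    while tasks:
        t = tasks.pop(0)
        try:
            next(t)
            tasks.append(t)
        except StopIteration:
            pass

run([task("A", 2), task("B", 2)])
print(log)  # → ['A0', 'B0', 'A1', 'B1']
```

A pre-emptive kernel removes the “try” from “try that with Windows 3.1”: the timer interrupt forces the switch whether the app cooperates or not.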

My biggest beef with Windows 3.1 was its crashes. If you just run an app or maybe two all the time, it works reasonably well. But I’m the kind of guy who always has three or four or twelve apps open–the first multitasking systems I ever used, Unix and AmigaOS, had no problem doing that–and if you try that with Windows 3.1 for very long, you’ll see a lot of blue screens.

I wasn’t a fan of the Windows 3.1 Program Manager interface. I’m not in love with the Explorer interface of newer versions either, but it’s easier to use and faster to navigate than Progman was.

And although its software selection is pretty good, I guess Windows 3.1 now falls victim to the same argument I heard time and time again against my preferred alternative operating systems: What, don’t you like software? Sometimes Windows 3.1’s available offerings are adequate and sometimes they aren’t: Microsoft Office 6.0 is certainly adequate for 99% of all people’s needs. If you dig deep enough (I found a copy here), you can find Internet Explorer 5.0 for Windows 3.1. It’s not the best browser in the world but it’s the best one you’ll find for Win3.1 and it may be good enough for you. Sticking with Windows 3.1 limits you to a much smaller selection of software than newer operating systems. At this point, ironically, even Linux, which was once notorious for its lack of software that Joe Sixpack would want to use, now has a better selection of mainstream software than Windows 3.1 had.

At this point in time it’s hard to recommend Windows 3.1. PCs capable of running Windows 95 adequately are very, very cheap (I see 133 MHz Pentium computers sell for $35 when people are willing to mess with them, a 66 MHz 486 will run Windows 95 decently, and just about anyone who works in the computer field can find one of those to give you for free if you ask nicely enough), and although support for Windows 95 is starting to dry up, it’s much easier to find hardware and software compatible with Win95 than it is for Windows 3.1. Windows 98 is better still, but I definitely recommend a 200 MHz Pentium and more than 32 MB of RAM for Win98. Still, that’s doable.

And if you’re thinking that Windows 3.1 is adequate for you and you’re not totally strapped for cash, you might want to give the $199 Wal-Mart PCs running Lindows a look. Lindows is basically Linux with a pretty graphical user interface, and it’s perfectly fine for word processing, web browsing and e-mail. The budget Wal-Mart PC is hardly a barn burner, but it’s much faster than any computer you’re likely to be running Windows 3.1 on, and since it will be much newer, the hardware itself will also be a lot more reliable. Double check with your ISP before you buy one to make sure you can get connected (they’re probably getting used to that question by now), but if you can get connected, think about it.

Fare thee well, goodnight, and goodbye to my friend, OS/2

The Register: IBM has finally brought the Great Rebellion [OS/2] to a close.
The Register was the only online obituary that mentioned eComStation, a third-party OS/2 derivative that everyone forgets about. Interestingly, the product literature never mentions OS/2 by name, only bragging about its technology licensed from IBM.

The Reg also talked about OS/2 version 3 being positioned as a gamer’s OS. Maybe that’s ironic, coming from the suits at IBM, and that wasn’t how I saw it–I switched from Windows 3.1 to OS/2 because, coming from an Amiga, I was used to being able to multitask freely without a lot of crashes. Windows 3.1 crashed pretty much every day if I tried to do that. OS/2 knocked that number down to about once a year, and usually those lockups happened when I was running Windows apps.

Even though I never really thought of it that way, OS/2 was great for games. Since it used virtual DOS machines, each environment had its own memory management, so you could fine-tune it and avoid shuffling around boot disks or struggling to master the DOS 6.0 boot menus. Pretty much no matter what you did, you got 600K or more of conventional memory to work with, and with some fine-tuning, you could bring that total much higher than you could usually attain with DOS. Since 600K was almost always adequate, most games just ran, no sweat.

The other thing I remember is the speed at which DOS games ran. Generally, running a game under OS/2 gained you a speed grade: a game running under OS/2 on a DX2/50 felt like it was running under DOS on a DX2/66. An OS/2 wizard could usually squeeze yet more performance out of the game with some tweaking.

I have fond memories of playing Railroad Tycoon, Civilization, and Tony LaRussa Baseball 2 on my Compaq 486 running OS/2 v3.

And there was another really nice thing about OS/2. When I bought a shiny new Pentium-75 motherboard and CPU and a new case, I pulled the hard drive out of the Compaq and dropped it into the Pentium. It just worked. All I had to do was load drivers for my new video card, since it used a different chipset than my 486.

And the cult of OS/2 won’t go away just yet. The talk over at os2voice.org has me almost ready to install OS/2 again.

apt-get install aclue

My boss called a meeting mid-week last week, and if all goes well, there’ll be some changes at work. That’s a very good thing.
I deliberately don’t write about work very often, and only in vague terms when I do, because some things I wrote about work in the past came back to bite me.

I’ve thought blogs were a very useful tool for a long time. When I started my career in 1997, I found myself gravitating towards some embryonic blog-like sites that offered technical information. Eventually enough people egged me into starting one myself. I found myself posting the solutions to my technical problems there, since searching there was much easier than with any tools we had at work. It’s a good way to work in the public eye and solicit ideas and feedback.

Well, my boss took notice. I blog, and so does one of my coworkers (I hesitate to mention him by name, as it might give away my employer, which I’d still rather not do). He visits from time to time, though the only time he’s tried to post a comment, my DSL connection went down (he naturally asked what I was doing to sabotage IE).

At the meeting, where we were talking about new ways to do things, he asked me point-blank to “Set up a weblog like you and [the guy in the cube next to me] have.”

So this morning I asked my mentor in the cube next to me for a MySQL account on one of our Linux servers. Then I installed Movable Type, mostly because both of us have heard great things about it but neither of us (so far) has been willing to risk everything by switching to it. (I know it’s not free for commercial use; call this “evaluation.” For all I know we’ll end up using b2, which is under the GPL, because for internal, intranet purposes, I don’t know that MT offers anything that b2 doesn’t. But if the boss decides he wants us to go live with MT, we’ll fork over the $150.)

The idea is, we can all log onto the blog at the end of the day and write down any significant things we did. Along the way, hopefully we’ll all learn something. And, as far as I can tell, we won’t block our clients from seeing the blog either. That way they can catch a glimpse into what we do. They won’t understand it all (I know I won’t understand all the VMS stuff on there, and the VMS guys may not understand all the NT stuff) but they’ll see something.

We talked about the cluetrain philosophy a little bit. Essentially, both of us understand it as the idea of being completely open, or at least as open as possible, with the customer. Let them see the internal operations. Let them make suggestions. Let them participate in the design of the product or service.

And I think that’s good up to a point.

Robert Lutz, one of the executives who turned Chrysler around before Daimler-Benz bought the automaker and ran it into the ground, wrote a marketing book called Guts: The Seven Laws of Business That Made Chrysler the World’s Hottest Car Company. I’ve got a copy of it on my shelf at work. One of the chapters of the book is titled, “The Customer Isn’t Always Right.” He argued that customers will follow trends and not necessarily tell the truth. Put out a survey asking people if they’d like a heated cupholder in their car, and most of them will say, yes, they’d love a heated cupholder. Everybody knows that a heated cupholder is a useless gadget no one will use, it won’t work right, and it’ll increase the cost of the car without adding any value, but nobody wants to look cheap.

Lutz argued that experts should make decisions. Since cars are the love of Lutz’s life, Lutz knows how to make killer cars. Lutz observed that the redesigned Dodge Ram pickup elicited extreme reactions. People either loved it or hated it. 70% of respondents loved it; 30% of respondents said they’d never go near the thing. Lutz argued that their then-current design had roughly 30% marketshare, so if half the people who said they loved it bought one, they’d gain 5%. So they brought it to market, and gained marketshare.
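The arithmetic behind that bet is worth spelling out: half of the 70% who loved the design is 35% of the market, against the 30% Dodge already had.

```python
current_share = 0.30  # Dodge's existing share of the truck market
loved = 0.70          # fraction of respondents who loved the new design
conversion = 0.50     # Lutz's assumption: half the lovers actually buy one

projected = loved * conversion   # 35% of the market
gain = projected - current_share
print(f"{gain:.0%}")  # → 5%
```

The interesting part isn’t the computation, of course; it’s that Lutz was willing to write off the 30% who hated the truck to get it.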

I suspect the biggest reason why the cluetrain philosophy works is that it helps to make you experts. See enough opinions, and you’ll learn how to recognize the good ones. When you’re clueless, the cluetrain people are right and you look like geniuses. Eventually, you stop being clueless, and at that point, Lutz is right.

The main reason I’m excited about having a blog in place at work isn’t because blogs in IT are trendy and popular and glitzy. (I’d still be using an Amiga if I could get a 68060 accelerator and a Zorro II Ethernet board without spending a grand.) I’m excited about blogs because I think it’ll get us a clue.

My boss typed apt-get install aclue at work today. I don’t think that’ll get us anything. But if that blog doesn’t get us a clue, I don’t think anything will.

Old computer magazines

I guess I need a “retro” category here. Anyway, I found this on Slashdot this morning: The Computer Magazine Archive. Don’t let the URL fool you–it’s not just Atari stuff.


The pundits are wrong about Apple’s defection

Remember the days when knowing something about computers was a prerequisite for writing about them?
ZDNet’s David Coursey continues to astound me. Yesterday he wondered aloud what Apple could do to keep OS X from running on standard PCs if Apple were to ditch the PowerPC line for an x86-based CPU, or to keep Windows from running on Apple Macs if they became x86-based.

I’d link to the editorial but it’s really not worth the minimal effort it would take.

First, there’s the question of whether it’s even necessary for Apple to migrate. Charlie pointed out that Apple remains profitable. It has 5% of the market, but that’s beside the point. They’re making money. People use Apple Macs for a variety of reasons, and those reasons seem to vary, but speed rarely seems to be the clinching factor. A decade ago, the fastest Mac money could buy was an Amiga with Mac emulation hardware–an Amiga clocked at the same speed would run Mac OS and related software about 10% faster than the real thing. And in 1993, Intel pulled ahead of Motorola in the speed race. Intel had 486s running as fast as 66 MHz, while Motorola’s 68040 topped out at 40 MHz. Apple jumped to the PowerPC line, whose clock rate pretty much kept up with the Pentium line until the last couple of years. While the PowerPCs would occasionally beat an x86 at some benchmark or another, the speed was more a point of advocacy than anything else. When a Mac user quoted one benchmark only to be countered by another benchmark that made the PowerPC look bad, the Mac user just shrugged and moved on to some other advocacy point.

Now that the megahertz gap has become the gigahertz gap, the Mac doesn’t look especially good on paper next to an equivalently priced PC. Apple could close the gigahertz gap and shave a hundred bucks or two off the price of the Mac by leaving Motorola at the altar and shacking up with Intel or AMD. And that’s why every pundit seems to expect the change to happen.

But Steve Jobs won’t do anything unless he thinks it’ll get him something. And Apple offers a highly styled, high-priced, anti-establishment machine. Hippie computers, yuppie price. Well, that was especially true of the now-defunct Flower Power and Blue Dalmatian iMacs.

But if Apple puts Intel Inside, some of that anti-establishment lustre goes away. That’s not enough to make or break the deal.

But breaking compatibility with the few million G3- and G4-based Macs already out there might be. The software vendors aren’t going to appreciate the change. Now Apple’s been jerking the software vendors around for years, but a computer is worthless without software. Foisting an instruction set change on them isn’t something Apple can do lightly. And Steve Jobs knows that.

I’m not saying a change won’t happen. But it’s not the sure deal most pundits seem to think it is. More likely, Apple is just pulling a Dell. You know the Dell maneuver. Dell is the only PC vendor that uses Intel CPUs exclusively. But Dell holds routine talks with AMD and shows the guest book signatures to Intel occasionally. Being the last dance partner gives Dell leverage in negotiating with Intel.

I think Apple’s doing the same thing. Apple’s in a stronger negotiating position with Motorola if Steve Jobs can casually mention he’s been playing around with Pentium 4s and Athlon XPs in the labs and really likes what he sees.

But eventually Motorola might decide the CPU business isn’t profitable enough to be worth messing with, or it might decide that it’s a lot easier and more profitable to market the PowerPC as a set of brains for things like printers and routers. Or Apple might decide the gigahertz gap is getting too wide and defect. I’d put the odds of a divorce somewhere below 50 percent. I think I’ll see an AMD CPU in a Mac before I’ll see it in a Dell, but I don’t think either event will happen next year.

But what if it does? Will Apple have to go to AMD and have them design a custom, slightly incompatible CPU as David Coursey hypothesizes?

Worm sweat. Remember the early 1980s, when there were dozens of machines that had Intel CPUs and even ran MS-DOS, yet were, at best, only slightly IBM compatible? OK, David Coursey doesn’t, so I can’t hold it against you if you don’t. But trust me. They existed, and they infuriated a lot of people. There were subtle differences that kept IBM-compatible software from running unmodified. Sometimes the end user could work around those differences, but more often than not, they couldn’t.

All Apple has to do is continue designing their motherboards the way they always have. The Mac ROM bears very little resemblance to the standard PC BIOS. The Mac’s boot block and partition table are all different. If Mac OS X continues to look for those things, it’ll never boot on a standard PC, even if the CPU is the same.
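Those on-disk differences are concrete things a boot loader can check. A PC BIOS expects the 0x55AA boot signature at the end of the first sector, while a Mac-partitioned disk of that era opens with an Apple Driver Descriptor Map (signature “ER”) followed by Apple Partition Map entries (signature “PM”). A rough sketch of telling the two apart from a disk’s first two 512-byte sectors:

```python
def disk_flavor(first_two_sectors: bytes) -> str:
    """Guess the partitioning scheme from the first 1024 bytes of a disk.

    Classic Mac disks start with an Apple Driver Descriptor Map ("ER")
    and put Apple Partition Map entries ("PM") in the sectors after it;
    PC MBR disks end sector 0 with the boot signature 0x55AA.
    """
    if first_two_sectors[0:2] == b"ER" and first_two_sectors[512:514] == b"PM":
        return "Apple Partition Map"
    if first_two_sectors[510:512] == b"\x55\xaa":
        return "PC MBR"
    return "unknown"

# A fabricated PC-style sector 0, just for illustration:
mbr = bytes(510) + b"\x55\xaa" + bytes(512)
print(disk_flavor(mbr))  # → PC MBR
```

An OS that refuses to boot unless its own signatures are present is all the copy protection Apple needs, no custom CPU required.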

The same differences that keep Mac OS X off of Dells will also keep Windows off Macs. Windows could be modified to compensate for those differences, and there’s a precedent for that–Windows NT 4.0 originally ran on Intel, MIPS, PowerPC, and Alpha CPUs. I used to know someone who swore he ran the PowerPC versions of Windows NT 3.51 and even Windows NT 4.0 natively on a PowerPC-based Mac. NT 3.51 would install on a Mac of comparable vintage, he said. And while NT 4.0 wouldn’t, he said you could upgrade from 3.51 to 4.0 and it would work.

I’m not sure I believe either claim, but you can search Usenet on Google and find plenty of people who ran the PowerPC version of NT on IBM and Motorola workstations. And guess what? Even though those workstations had PowerPC CPUs, they didn’t have a prayer of running Mac OS, for lack of a Mac ROM.

Windows 2000 and XP were exclusively x86-based (although there were beta versions of 2000 for the Alpha), but adjusting to accommodate an x86-based Mac would be much easier than adjusting to another CPU architecture. Would Microsoft go to the trouble just to get at the remaining 5% of the market? Probably. But it’s not guaranteed. And Apple could turn it into a game of leapfrog by modifying its ROM with every machine release. It already does that anyway.

The problem’s a whole lot easier than Coursey thinks.

I shoulda stayed home and read a book!

The last few days have been nuts. I’ve been wrestling with tape drives, trying to get them to work on a brain-dead operating system from a company in Redmond whose project is headed up by a potty-mouthed ex-DEC employee. Its initials are N and T.
And, riddle me this, someone, please. On Unix, I just hook up the tape drive, then I type this:


tar -cf /dev/tape /home

Badda bing, badda boom, I got me a backup of all my user data, assuming the drive is good. One command, keyed in. One command that’s no harder to remember than the phone number of that pretty girl you met last week. (Or wish you met last week, whatever the case may be.) What’s hard about that?
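For comparison, even scripting that backup in a general-purpose language stays short. A sketch with Python’s stdlib tarfile module, using a throwaway directory in place of /home so it runs anywhere:

```python
import pathlib
import tarfile
import tempfile

# The Unix one-liner `tar -cf /dev/tape /home` in Python: open a
# destination (a tape device node, or here an ordinary file) and add a
# directory tree recursively. A scratch tree stands in for /home.
work = pathlib.Path(tempfile.mkdtemp())
(work / "home").mkdir()
(work / "home" / "letter.txt").write_text("dear mom")

with tarfile.open(work / "backup.tar", "w") as tape:
    tape.add(str(work / "home"), arcname="home")

with tarfile.open(work / "backup.tar") as t:
    print("home/letter.txt" in t.getnames())  # → True
```

One open, one add. The Unix design makes the tape drive just another file, which is why neither version needs 47 services restarted first.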

In NT, you plug in the drive, you load device drivers, you load your backup software, it doesn’t recognize it, so you stop and start 47 services, then it finally recognizes the drive, and then you stumble around the backup software trying to figure out just how you tell it to make you a tape. By the time you figure all this out, in Unix, you’d have finished the backup.

Ugh. So, when I get home, I don’t want to have much of anything to do with these brain-dead machines infected with a virus written in Redmond. And the virus from Cupertino isn’t any better. I don’t have much appetite for my computers that run Linux either, because, well, they remind me of the crap spewing out of Redmond and Cupertino. It’s kind of like a messy breakup, you know? You meet a girl who’s nothing like the last girl, but you don’t want to have anything to do with her because she’s female, breathes oxygen, and she’s carbon-based, so there’s the off chance she might remind you of that last disaster.

Hence the mail piling up in my inbox and the lack of updates for a couple of days.

So what have I been doing?

I’ve been reading books. I finished Dave Barry Turns 40 a couple of nights ago. It wasn’t as good as his later books, but it had a few howlers and part of a chapter that was actually sincere and serious and really made me think. It was about his mother after his dad died. They lived their lives together in this brick house he built himself, and after he died in 1984, she would write on her calendar, on April 24, “Dave died today, 1984. Come back Dave.” And on the day of their anniversary, she would write, “Married Dave, 1942. Best thing that ever happened to me.”

Finally, the house turned out to be too much for her to handle on her own, so she sold it and moved away.

And he went on for another page or two, talking about the last years of her life, trying to relate to her and failing miserably, as she wandered from place to place, living with relatives, never finding a place to call home, because what she really wanted was that brick house back with Dave Sr. in it.

As she died, she had that smile that all mothers have, that smile that tries to reassure her boy that everything’s going to be OK.

The story had a flashbulb effect on me. Partly because it came from Dave Barry, the guy who went on and on about cell phones, and how people who get cell phones have no escape at all, and sometimes they’re trapped in their cars for months, stuck on the phone, surviving on drive-thru food and peeing in the ashtray.

I can’t say I read very many things that jar me, but that short essay definitely did, especially the insight it gave on his parents’ relationship. How many people feel that way about the person they married 42 years ago? All too few, in this day and age. And since it came from the person I expected it from the least, it made it all the more jarring.

Since then, I’ve been reading White Palace. I understand it was made into a movie in the early 90s. It takes place in St. Louis. It’s a book about a relationship, and the relationship has absolutely zero substance. Sex sex sex sex sex sex sex. And more sex. (I wonder what that’s going to do to my Google rankings…) I really don’t want to like the book, especially after having my world rocked by a short essay that Dave Barry snuck into a comedy book and apologized about.

But I learned something.

The book has no plot. Guy meets girl in a bar. Guy and girl begin torrid affair. It’s a cheesy romance-novel plot. You find better plots lying outside on the sidewalk or in the parking lot.

The book does have compelling characters. The main character is 27 and his beloved wife died tragically when they were both 25. I’m 27 so I can relate to the guy on that level. And all of us have lost someone that we miss. And there’s a lot more about the guy too. I won’t give it all away. His (ahem) girlfriend has more substance than a plastic blow-up doll, although it would have been very easy not to give the character any substance. She’s in her early 40s, she drinks a lot, and she forgets to pay her bills. (At least she has priorities.) She works in a fast-food joint, and at least once in the book, she stops dead in her tracks, looks the main character in the eye, and asks, “Why are you so good to me?”

Heart-wrenching line, that.

OK, so the book’s got good, well-developed characters. It also has a good setting. It takes place in St. Louis, and you can tell from the way he describes it all that he’s actually lived here. The main character lives in Kirkwood, and any St. Louisan instantly draws a mental picture. She lives in Dogtown, and any St. Louisan instantly draws a mental picture. He draws in places that St. Louisans are familiar with. He talks about Tony’s restaurant, and the book’s name comes from a fast-food joint that litters the St. Louis landscape (without infringing on a trademark). He even works in Concordia Seminary, and Cindy’s Motel. Any St. Louisan will instantly love the book because it describes home. I wonder how many St. Louisans utter aloud the words, “Where’d you go to high school?” while reading it.

He made St. Louis real, and he made it compelling.

Great characters, great setting… He didn’t need a plot.

And now I find myself itching to write fiction. I get that bug every couple of years. I wrote 100 pages’ worth of novel while I was in college. It was the opposite of White Palace. It had a good plot. Maybe even better than good, but I can’t be objective about my own work. The very few people I’ve described it to found it riveting, though. But the characters were awful, and so was the setting.

That manuscript is lost, as far as I know. Some version of it might be on my Amiga’s hard drive, but I wouldn’t hold my breath. No great loss. I intend some day to revisit that plot, plop it down in a compelling setting, and drop some compelling characters into it. There’s really only one question.

Have I lived enough yet to pull it off?

Who knows. Right now, who cares? I’m gonna go read some more. I think the UV from this monitor is getting to my head.

How to get my job

I’ve had a couple of people ask me in the past couple of weeks how to break into the computer field. It was a tough question. I got into fixing these things because I couldn’t find a repair shop in St. Louis that I felt I could trust, so I started trying to fix them myself. Sure, I might break something beyond repair, but we’d once paid for a repair that cost more than replacing the unit outright would have. So what did I have to lose, right?
I took my Commodore 128 apart a few times. Usually it was for an upgrade, but once it was to clean the keyboard because keys weren’t working anymore. It was an adventure, and I had to learn how to solder first. My dad’s friend Norb taught me how. He was a building inspector. No wonder I still solder like a plumber, even to this day. So I de-soldered the six connections needed to get into the keyboard, removed the dozens of tiny screws, cleaned up the printed circuit board, put it back together, re-soldered those connections, reassembled the computer, and held my breath. It worked. Cool. It didn’t impress the girls, but it saved me at least 50 bucks.

It was my uncle’s approach. I remember riding with him to an auto parts store once, then watching him work on his truck. “I don’t know what I’m doing when it comes to cars,” I said.

“I don’t let that stop me,” he said. “I just have to do it.”

His truck cost more than any computer I’ve ever owned.

Later on, I got an upgrade ROM for my Amiga 2000. So my dad came home one day to find me hovering above my Amiga, which was sprawled across his OMT table. The cover was off, the power supply was out, and the drive cage was out, and there I was, slowly prying out a chip with a screwdriver. Dad gave me a nervous look. “You gonna be able to get that thing back together?” he asked me. “Sure,” I said. I didn’t tell him how many times I’d had it apart before. So he stood there and watched me as I finished extracting the chip, popped the new one into place, and re-installed the power supply and drive cage.

Eventually I got smart and realized I shouldn’t be experimenting on computers I cared about. XT clones cost about 20 bucks when people wouldn’t just give them to you, so I got a couple. I ripped them apart, figured out how a computer was really put together, and reassembled them. And yes, I even took parts from one and put them in the other to see what would happen. I was pretty sure it would work. It did. Eventually I did something stupid (I don’t remember what anymore) and killed at least one of those XT clones, but it wasn’t important. I’d learned a lot from them, and I was only out 20 bucks. That’s assuming I wasn’t given the thing outright–I don’t remember that detail anymore either.

I needed that skill the next year. I was living in a fraternity house, and the power supply died in the house computer. I knew enough by then to diagnose it, and I headed off to the local computer shop for parts. They didn’t have any power supplies that would fit, and the motherboard was nonstandard. But they had a lineup of barebones systems sprawled across the floor. A bare 386 cost about $200. I knew the rest of the system worked. So I talked it over with the treasurer, then came back with a house check and bought a 25 MHz 386DX. I took it home, popped the case on the house computer, pulled out the video card and all the I/O cards, installed them in the 386, and found the computer wouldn’t recognize the hard drive. We eventually worked through that one (it turned out we had one of the very few 8-bit IDE drives ever made, and that 8-bit controller did not get along with our 386 one bit) and we got a working system up and going.

By the time I graduated there were at least half a dozen guys in that house capable of doing that job. Times changed (swapping a motherboard was much more of an endeavor in 1993 than it was in 1997, because by then so many components that had once been discrete and configured by jumpers were integrated and configurable through the BIOS Setup), and I’d like to think most of them learned at least a little something from me.

That summer, I got a job selling computers. An opportunity arose when the store technician developed a difficulty showing up for work. They never fired the guy, but since he was only there half the time, I got to be the tech the other half. When he was there, I learned a lot from him.

The next school year, I got wind of a job opportunity on campus. The journalism department had a batch of 300 new IBM PC 330s and 350s. Every last one of them needed to be unboxed and upgraded with extra memory and a NIC, then plugged into the network, where one of the more experienced techs could do a push install of OS/2. I got the job, and I learned a ton from those guys. These are guys who had seen prototypes of the IBM PS/2 Model 80, and who occasionally had to whip out a soldering gun and make a change to the motherboard with an engineer from IBM on the phone. You bet they had a lot to teach me.

That part-time job eventually grew into a full-time job when those guys recognized that I was willing to work hard and willing and able to learn.

That approach worked really well for me. But I had the advantage of being young and being able to wait for opportunities and take them as they came. I also had the advantage of growing up with the things (the schools I went to had computers and taught computer classes, all the way back to when I was in the second grade) and messing with them for the majority of my life.

Realistically, I don’t think that approach would work for an adult with minimal computer skills and a family to support. Or at least it wouldn’t work on a short timeframe. I’ve tried to teach 24-year-olds starting from ground zero how to do this. It didn’t work very well.

It’s a lot easier to teach someone how to write.