Memoirs of writing a book

Book memoirs. I got e-mail yesterday asking me to reflect back on dealing with a publisher while writing a book. I’ll never talk publicly about specifics, and I’m not even positive how much I’ve told my closest friends, for that matter. But it got me thinking.
And, as it turns out, it was two years ago this month that I sent my manuscript off to O’Reilly, and it was in late October that I got a set of PDF files to read and correct. In hindsight, I should have asked for a hardcopy, because I would have found more mistakes. But in hindsight, I’d do a lot of things differently.

First, I’d ask for more money. That’s not as much about greed as it is about raising the stakes. Supposedly, my advance on my first book was on the high side for first books in the computer field. Whether that’s true or whether that was my agent trying to stroke my ego after the fact, I’m not sure. Fact is, I took their first offer, and I’m pretty sure I took it the same day. Big mistake. I was staying up nights wondering if O’Reilly was interested in me. Nothing a first-time author can do will make an editor do the same thing, but the author needs to give the publisher a little time to wonder. I’m sure if I’d been sitting at my desk when the offer came in, I’d have responded immediately.

Don’t do that. Sleep on it. Then, I’m inclined to suggest you should make a counter-offer. What are they going to do, withdraw the offer if they don’t like your counter-offer? Get real. If they offer you an $8,000 advance and you demand $120,000, they’ll be insulted, yes. They’ll probably tell you you’re being unrealistic. Me, I hate haggling as much as I hate schmoozing, so I can’t give you any meaningful advice on how to dicker. It’s like asking a girl out. You go with your instincts and hope you don’t send off unwelcome I-want-you-to-have-my-children signals. Just as I wouldn’t trust anyone who claimed to have a sure-fire way of asking girls out, I wouldn’t trust anyone who claimed to have bulletproof negotiating technique.

I wondered if any publisher would be interested in me, and that was why I bit so quickly. Once again, a dating analogy helps. If one girl seems semi-interested in you, chances are there’s another girl somewhere who’d be semi-interested in you. Maybe I didn’t realize it at the time, but having O’Reilly interested in me was like having a Prom Queen candidate come sit down at my lunch table. Any overconfident and annoying stud knows when that happens, he’s got a chance with any unattached pretty girl in the room. And there are a lot of pretty girls who were never a candidate for Prom Queen. Likewise, there are a lot of good publishers who aren’t O’Reilly.

I found that out after publishing the first book. Macmillan wouldn’t give me the time of day before then. They were an early candidate for my second. IDG wanted to do my third book, but they wanted it to be a Dummies book and I wanted it to be a standalone, so that one never got beyond proposal stage. No Starch and Sybex also expressed interest in my work at one time or another. That experience made me realize that I didn’t have to marry O’Reilly. Just one date was enough to bring plenty of other suitors.

The issue of representation comes up. I had authors tell me I was a fool for writing my first book without an agent. That makes sense. An agent is better-equipped to play hardball than you are. He knows what his other clients get. He knows what else the publisher is working on. He knows what the publisher’s competition is working on. He’s emotionally detached from the work, so he can afford to make an editor sweat a little. And he knows what risks are worth taking and what he has to do beforehand. (I just realized I implied there are no female literary agents. There are. Every agent I’ve worked with happens to be male.)

But there’s something to remember about agents. Your agent doesn’t just work for you. Your agent has to maintain a good working relationship with every publisher in the business in order to stay in business. So when things get really ugly, your agent might not stand beside you the way you’d like. And no, I’d really rather not elaborate on that, other than to say I worked with an agency, but recently I’ve negotiated all of my magazine contracts myself, sealed deals my agent never would have, and I feel like I got a fair deal on them considering the amount of work required on my part.

I guess the other mistake I made was not talking about the book enough. Sure, all my friends knew about it. Pastor announced it in front of the congregation a couple of times (and he still introduces me as “Author Dave Farquhar” sometimes but not as often now that there’s another author in our congregation who’s published a lot more books than I have), so it seemed like the whole church knew about it. That was the problem. Everyone knew what I thought of the book and its prospects, except my publisher. Yeah, my editor and I talked about the prospective market, and we argued about the title. I backed down way too quickly–I still hate that title, and it takes an awful lot for me to hate something enough to put it in blinky text. Sorry Netscape users.

My editor and I should have talked a lot more about it. We talked some during the negotiating period. We talked briefly about the title once the ink was dry on the contract. I wrongly spent the majority of my non-asleep time just working on it. The result was a critically acclaimed book that sold about as many copies as it would have if I’d hawked it myself on a street corner. I should have sat down for three hours, written down every possible title that came to mind, and e-mailed it to my editor. I should have had the courage to suggest that hey, I know it’s not The O’Reilly Way, but this book is targeting consumers, not sysadmins, so why not do a more consumer-oriented cover? Sure, sysadmins will buy it based on the publisher, but there are a lot of consumers out there, and they don’t know Tim O’Reilly from Adam Osborne (sorry, that was bad).

A guy who’ll post every bizarre idea that’s ever crept into his head on his web site should have been willing to send a few bizarre ideas to Cambridge and Sebastopol and be shot down. The idea isn’t so much to get your way as it is to make everyone else think. There’s a fine line between insanity and genius, and it doesn’t matter much which side of the line you’re on, as long as someone involved in the project ends up on the right side.

So yeah. I should have been talking to my editor about the book. I should have found out who the marketing director was and talked to him or her once or twice a month, so that each of us knew what the other was thinking. I still don’t know who the marketing director for Optimizing Windows was. I never did.

Once the thing was on the shelves, I made another mistake. One of the marketing gals got me a gig on a talk radio show. I don’t remember which one anymore. The host flaked out–dropped off the face of the earth the week before my scheduled appearance. He wouldn’t return my calls or hers. As soon as the possibility of being on the show arose, I should have asked for a phone number. It was fine for her to make the arrangements, but I should have talked to the guy–or at least someone with the show–at least once. That way he’d have known I was serious, and that he’d better be serious too.

A little while later, ZDTV’s The Screen Savers came calling. They wanted me to appear, but they wanted me to pay all my own expenses. Well, flying to San Francisco from St. Louis on short notice isn’t cheap. Neither are the other accommodations. I asked her if the appearance would sell enough books to justify the expenses. She talked it over with the rest of her cohorts and said it’d be a ton of fun and I’d get to meet a bunch of interesting people and my Web site counter would go ballistic, but I’d be lucky to sell a couple hundred books based on the appearance.

I turned them down, and rightly so. They had nothing to lose if they bumped me, and then I’d be out the money and everything else. But in retrospect, once again, I should have talked to someone at ZDTV. That way, we’d have at least gotten a feel for how serious the other was.

It seems to me that authors who sell lots of books spend an awful lot more time on the phone than I did. I assumed that if I put my time and effort into making a quality product, it’d sell itself. I underestimated how many titles it was competing with. There are plenty of mediocre (or worse) books that outsold mine, and by a landslide. And it’s pretty obvious why. Their authors were more serious about selling books than I was.

I guess there’s one other question worth answering. What publishers should you work with? I’m hesitant about big publishers. They tend to pay less, and they market whatever they feel like marketing. For every Dan Gookin (DOS for Dummies) and Andy Rathbone (Windows for Dummies), there are plenty of authors who sold fewer books than I did. The stakes aren’t high enough. When you publish hundreds of books a year, only a few of them can be bestsellers, and the potential blockbusters seem to get all of the marketing effort. They can afford to take a chance on a long shot, and you might get your hopes too high. Or they can afford to pick up a book just to keep a competitor from getting it, which is what I suspect happened with my second ill-fated book. That’s not to say I wouldn’t work with a big publisher, but it wouldn’t be my first choice.

Unfortunately, I think O’Reilly’s gotten too big. The stakes weren’t high enough with Optimizing Windows. A company like No Starch publishes a dozen titles a year. Realistically, every title, or nearly every title, has to make money when you publish in those quantities. I think that’s part of the reason why O’Reilly doesn’t release nearly as many new titles now as they did two or three years ago, and why they’ve pulled out of certain markets. Too many titles were losing money, and the big sellers weren’t making up the difference.

If I had it to do all over again, I’d probably still go with O’Reilly on my first title, for the same reason that everyone wants one date with the prom queen. Everyone craves prestige, and the sooner you get it, the better off you think you are. For the second book that never happened? Lots of possibilities, but No Starch seems attractive. No Starch is a cool company, and they’re smaller. Presumably, they’re nicer because they have to be. And while it was cool to be seen with the beauty queen, when there’s another girl who seems nicer, the smart guy will go see what spending time with that nice girl is like.

So when will I write another book and do it right this time? Keep in mind writing a book is like a bad relationship. It has a lot of high points, but nothing feels better than the moment it’s over. And keep in mind that after one bad relationship, I waited four years before starting my next one.

I feel this sudden urge to prove I really exist…

Do one thing every day that scares you.
Sing.
Don’t be reckless with other people’s hearts.
Don’t put up with people who are reckless with yours.
–Mary Schmich, “Everybody’s Free to Wear Sunscreen”

I want to prove I really exist, and I’m trying to figure out how I can do it. What are the tell-tale signs of a hoax? Lack of pictures and a claim of hating to have your picture taken. Well, I hate having my picture taken. Gatermann’s got an album full of pictures of me holding my hands in front of my face. He collects ’em or something. I know of four pictures of me floating around on the Web, total, and two of them were scans off newsprint.

Another sign: Lots of people claiming to have talked to me via e-mail or even over the phone, but not in person. Dan Bowman and I have talked a lot, and I consider him a close friend. Other Daynoters or Webloggers? Tom Syroid and I used to talk on the phone. But that’s it. I’ve had conversations over e-mail with Doc Jim, and with JHR, and with Matt Beland, and with Brian Bilbrey. But who’s seen me in person? Well, Steve DeLassus and Tom Gatermann, both of whom I claim to have known for more than 10 years, but I could have fabricated them too.

Debilitating problem? Well, carpal tunnel syndrome is very small potatoes compared to leukemia, but it is a death sentence for a writer. I disappeared for about six months over it.

Really, it’s pretty hard to prove I’m not a hoax. I can link to my old writings from college that are online, circa 1996 (I published under “Dave Farquhar” in those days), and of course there’s that O’Reilly book and those Computer Shopper UK articles. Those will establish a consistency of writing style. My relatives that I mention don’t Weblog, and their writing styles are pretty distinct from mine–both my mom and sister are pretty good writers but I’ve got a lot of quirks they don’t. And neither has made many appearances on these pages.

I’m going to hold back a lot of personal details, because someone I hadn’t spoken to in about 10 months freaked me out back in January and, after reading my weblogs in their entirety, recited to me virtually every detail of my life based on what I’d written and a few educated guesses. Some of the details were wrong, but not enough of them were.

But if anyone really wants to check, I was born in Kansas City, Mo. I lived a lot of places, but most notably in Farmington, Mo., from 1983 to 1988, and in Fenton, Mo., from 1988 to 1993 (and I continued to call Fenton my home through 1996 when I was in college). I graduated from Lutheran High School South, St. Louis, in 1993. I graduated from the University of Missouri-Columbia, with a degree in journalism (no minor) in 1997. I was employed by the University of Missouri in 1997 and 1998, so I’m even listed in the 1998 issue of the Official Manual of the State of Missouri. All of this should be pretty easily verifiable.

Or you can just take me at my word. It comes down to honesty, and futility. Why would anyone hoax a 20-something systems administrator? And why would they publish a book and a bunch of magazine articles under my name? It would be pointless. A pile of computer tips isn’t a compelling enough story to fake.

So what is compelling? A struggle. This past weekend’s struggle with a system upgrade showed I was human and don’t really care if people think I’m a computer genius or not. I guess that’s kind of compelling, because most of us can’t get our computers working quite right. Netscape cofounder Marc Andreessen endeared himself to thousands when he admitted in a magazine interview that his home PC crashes a lot and he never did get his printer working right. But an underdog is better. Noah Grey is a whole lot more compelling than me, because we’ve all felt a little shy sometimes, so his agoraphobia is something we can somewhat relate to. He can reach out to the world and we can share a little in his struggle and root for him. And Kaycee Nicole Swenson, well, she was just too good to be true–a 19-year-old who was wise and mature well beyond her years, a great writer, insightful, broken-hearted, sincere… Every male over 35 wanted her to be his daughter. As for the males under 35, she’d have made a great kid sister. But I suspect a good percentage of them would have wanted to date her, or someone just like her.

I don’t remember if this was exactly how she put it, but an old classmate once observed that the Internet allows us to safely pick our friends from a pool of millions, and usually we can find people who at least seem to be a whole lot more interesting (or better matches for us) than the people we can meet face-to-face, and we can quickly and painlessly get new ones and dispose of them on a whim. She wrote those words in 1997, but aren’t they a perfect description of Kaycee and the rest of the Weblogging phenomenon?

Steve DeLassus raised an interesting point this afternoon. He asked why a 19-year-old dying of leukemia or complications from leukemia would weblog at all. Wouldn’t she have better things to do? That’s an honest question, but I know if something like that were happening to me, I’d weblog. It’s cathartic, for one thing. When I was struggling with depression, I wrote about it in my newspaper column. I found it a whole lot easier to just pour my heart and soul into my word processor than to talk to someone about what I was feeling. I needed to get it out of my system, but you never know how people are going to react. When you can detach yourself from the words, it doesn’t matter. Some will scoff, but you won’t know. Some will totally understand, and you won’t know. Others will totally get it, and they’ll reach out to you, and then it’s all totally worth it. You know there’s something to them, because they had to make an effort to find your words, probably, and then they had to make an effort to communicate with you. You find special people that way.

Yeah, it’s kinda selfish. But it’s safe, and when you’re vulnerable, you need safe.

I’ve shed zero light on the whole Kaycee Nicole hoax. I know a lot of people are hurting. I never got attached to her, because I only read her a couple of times a month. Over the weekend, I went back to Week 1 and started reading from there, to see what I missed. I guess I figured catching the reruns was better than missing it entirely. And I started to understand her appeal a bit more. And now I understand the hurt. It’s not nice to play with people’s hearts.

And some people will probably put up their walls and vow never to be hurt that way again. It’d be hard to blame them.

But I hope they don’t. Because the only thing worse than the feeling after someone played with your heart is the feeling of being alone.


Two phone calls, one weird, one sad

My phone rang at about 1 p.m. this afternoon. I picked up. “Hello?”
“Hi!” Some unfamiliar female voice was overly happy to hear mine.

“Hi!” I said back, figuring I’d play along and try to buy some time to figure out who on earth this was.

“Dave?” she asked.

“Uhh, yee-ah,” I said, slowly.

“How are you!?” she asked forcefully, still way too happy.

I paused and analyzed the voice. Adult. Pre-middle age, probably in the 20s or 30s. Female. African-American. I ran through the list of people I know matching that description. No match. “Umm, I’m sorry, but I have no idea who this is,” I said.

“Yes you do!” Her enthusiasm was undiminished.

“Umm,” I said. I know, I know, I’m just a stupid male, but I honestly was drawing a blank.

“It’s your lover,” she said, tenderly and huskily enough to really freak me out.

Now, I haven’t had a date in eight months, so the likelihood of any female believing herself to be my lover is, well, really remote. Besides, I’m of the persuasion that the act that most people associate with the endearing term “lover” ought to wait until after marriage. So, very obviously, one of the people in this conversation was mistaken, and I was pretty sure it wasn’t me.

“It’s your lover,” she said again, pretty much the same way. I was starting to wonder if this wasn’t a practical joke someone was pulling on me. I know more than a few pranksters, after all. I decided to play it safe.

“I… don’t have… one,” I said finally.

She laughed. “Sure you do, Dave!” And she said her name. I didn’t know anybody by that name.

“I’m pretty sure you’ve got the wrong number. This is Dave Farquhar.”

She let out a very embarrassed laugh. “Oh, I’m so sorry,” she said.

“That’s OK,” I said. “Don’t worry about it.”

She laughed some more. She was still laughing when I hung up the phone.

I laid the phone down on its cradle, and it wasn’t 30 seconds later that it was ringing again. I picked up again. “Hello?”

It was the same woman again. Her mood changed quickly. Very quickly. She verified the phone number she dialed. Then she turned desperate. “Are you visiting from out of town?” Nope, this is my phone number, and I live alone. “Did you just get this number?” Nope, I’ve had it for more than two years. I could hear the hurt. She didn’t have to tell me the story. I could pretty much put it together myself. Boy meets girl. Boy sleeps with girl, probably promising something more later. Boy gets what he wants. Boy makes up a phone number and gives it to girl so he won’t have to deal with commitment.

Scumbag.

Mac mice, PC data recovery

A two-button Mac mouse!? Frank McPherson asked what I would think of the multibutton/scroll wheel support in Mac OS X. Third-party multibutton mice have been supported via extensions for several years, but not officially from Ye Olde Apple. So what do I think? About stinkin’ time!

I use 3-button mice on my Windows boxes. The middle button double-clicks. Cuts down on clicks. I like it. On Unix, where the middle button brings up menus, I’d prefer a fourth button for double-clicking. Scroll wheels I don’t care about. The page up/down keys have performed that function just fine for 20 years. But some people like them; no harm done.

Data recovery. One of my users had a disk yesterday that wouldn’t read. Scandisk wouldn’t fix it. Norton Utilities 2000 wouldn’t fix it. I called in Norton Utilities 8. Its disktool.exe includes an option to revive a disk, essentially by doing a low-level format in place (presumably it reads the data, formats the cylinder, then writes the data back). That did the trick wonderfully. Run Disktool, then run NDD, then copy the contents to a fresh disk immediately.

So, if you ever run across an old DOS version of the Norton Utilities (version 7 or 8 certainly; earlier versions may be useful too), keep them! It’s something you’ll maybe need once a year. But when you need them, you need them badly. (Or someone you support does, since those in the know never rely on floppies for long-term data storage.) Recent versions of Norton Utilities for Win32 don’t include all of the old command-line utilities.

Hey, who was the genius who decided it was a good idea to cut, copy and paste files from the desktop? One of the nicest people in the world slipped up today copying a file. She hit cut instead of copy, then when she went to paste the file to the destination, she got an error message. Bye-bye file. Cut/copy-paste works fine for small files, but this was a 30-meg PowerPoint presentation. My colleague who supports her department couldn’t get the file back. I ride in on my white horse, Norton Utilities 4.0 for Windows in hand, and run Unerase off the CD. I get the file back, or so it appears. The undeleted copy won’t open. On a hunch, I hit paste. Another copy comes up. PowerPoint chokes on it too.

I tried everything. I ran PC Magazine’s Unfrag on it, which sometimes fixes problematic Office documents. No dice. I downloaded a PowerPoint recovery program. The document crashed the program. Thanks guys. Robyn never did you any harm. Now she’s out a presentation. Not that Microsoft cares, seeing as they already have the money.

I walked away wondering what would have happened if Amiga had won…

And there’s more to life than computers. There’s songwriting. After services tonight, the music director, John Scheusner, walks up and points at me. “Don’t go anywhere.” His girlfriend, Jennifer, in earshot, asks what we’re plotting. “I’m gonna play Dave the song that he wrote. You’re more than welcome to join us.”

Actually, it’s the song John and I wrote. I wrote some lyrics. John rearranged them a little (the way I wrote it, the song was too fast–imagine that, something too fast from someone used to writing punk rock) and wrote music.

I wrote the song hearing it sung like The Cars (along the lines of “Magic,” if you’re familiar with their work), but what John wrote and played sounded more like Joe Jackson. Jazzy. I thought it was great. Jennifer thought it was really great.

Then John tells me they’re playing it Sunday. They’re what!? That will be WEIRD. And after the service will be weird too, seeing as everybody knows me and nobody’s ever seen me take a lick of interest in worship music before.

I like it now, but the lyrics are nothing special, so I don’t know if I’ll like it in six months. We’ll see. Some people will think it’s the greatest thing there ever was, just because two people they know wrote it. Others will call it a crappy worship song, but hopefully they’ll give us a little credit: At least we’re producing our own crappy worship songs instead of playing someone else’s.

Then John turns to me on the way out. “Hey, you’re a writer. How do we go about copyrighting this thing?” Besides writing “Copyright 2000 by John Scheusner and Dave Farquhar” on every copy, there’s this.  That’s what the Web is for, friends.

~~~~~~~~~~

Note: I post this letter without comment, since it’s a response to a letter I wrote. My stuff is in italics. I’m not sure I totally agree with all of it, but it certainly made me think a lot and I can’t fault the logic.

From: John Klos
Subject: Re: Your letter on Jerry Pournelle’s site

Hello, Dave,

I found both your writeup and this letter interesting. Especially interesting are both your reaction and Jerry’s reaction to my initial letter, which had little to do with my server. To restate my feelings, I was disturbed about Jerry’s column because it sounded so damned unscientific, and I felt that he had a responsibility to do better.
His conclusion sounded like something a salesperson would say, and in fact did sound like things I have heard from salespeople and self-promoted, wannabe geeks. I’ve heard all sorts of tales from people like this, such as the fact that computers get slower with age because the ram wears out…

Mentioning my Amiga was simply meant to point out that not only was I talking about something that bothered me, but I am running systems that “conventional wisdom” would say are underpowered. However, based upon what both you and Jerry have replied, I suppose I should’ve explained more about my Amiga.

I have about 50 users on erika (named after a dear friend). At any one moment, there are anywhere from half a dozen to a dozen people logged on. Now, I don’t claim to know what a Microsoft Terminal Server is, nor what it does, but it sounds something like an ’80s way of Microsoft subverting telnet.

My users actually telnet (technically, they all use ssh; telnet is off), they actually do tons of work in a shell, actually use pine for email and links (a lynx successor) for browsing. I have a number of developers who do most of their development work in any of a number of languages on erika (Perl, C, C++, PHP, Python, even Fortran!).

Most of my users can be separated into two groups: geeks and novices. Novices usually want simple email or want to host their domain with a minimum of fuss; most of them actually welcome the simplicity, speed, and consistency of pine as compared to slow and buggy webmail. Who has used webmail and never typed a long letter only to have an error destroy the entire thing?

The geeks are why sixgirls.org got started. We all had a need for a place to call home, as we all have experienced the nomadic life of being a geek on the Internet with no server of our own. We drifted from ISP to ISP looking for a place where our Unix was nice, where our sysadmins listened, and where corporate interests weren’t going to yank stuff out from underneath us at any moment. Over the years, many ISPs have stopped offering shell access and generally have gotten too big for the comfort of geeks.

If Jerry were replying to this now, I could see him saying that shells are old school and that erika is perhaps not much more than a home for orphans and die-hard Unix fans. I used to think so, too, but the more novice users I add, the more convinced I am that people who have had no shell experience at all prefer the ease, speed, and consistency of the shell over a web browser type interface. They’re amazed at the speed. They’re surprised over the ability to instantly interact with others using talk and ytalk.

The point is that this is neither a stopgap nor a dead end; this IS the future.

I read your message to Jerry and it got me thinking a lot. An awful lot. First on the wisdom of using something other than what Intel calls a server, then on the wisdom of using something other than a Wintel box as a server. I probably wouldn’t shout it from the mountaintops if I were doing it, but I’ve done it myself. As an Amiga veteran (I once published an article in Amazing Computing), I smiled when I saw what you were doing with your A4000. And some people no doubt are very interested in that. I wrote some about that on my Weblogs site (address below if you’re interested).

I am a Unix Systems Administrator, and I’ve set up lots of servers. I made my decision to run everything on my Amiga based upon several criteria:

One, x86 hardware is low quality. I stress test all of the servers I build, and most x86 hardware is flawed in one way or another. Even if those flaws are so insignificant that they never affect the running of a server, I cannot help but wonder why my stress testing code will run just fine on one computer for months and will run fine on another computer for a week, but then dump a core or stop with an error. But this is quite commonplace with x86 hardware.

For example, my girlfriend’s IBM brand FreeBSD computer can run the stress testing software indefinitely while she is running the GIMP, Netscape, and all sorts of other things. This is one of the few PCs that never has any problems with this stress testing software. But most of the other servers I set up, from PIIIs, dual processor PIIIs and dual Celerons, to Cyrix 6×86 and MII, end up having a problem with my software after anywhere from a few days to a few weeks. But they all have remarkable uptimes, and none crash for any reason other than human error (like kicking the cord).

However, my Amigas and my PowerMacs can run this software indefinitely.

So although I work with x86 extensively, it’s not my ideal choice. So what else is there? There’s SPARC, MIPS, m68k, PowerPC, Alpha, StrongARM… plenty of choices.

I have a few PowerMacs and a dual processor Amiga (68060 and 200 mhz PPC 604e); however, NetBSD for PowerMacs is not yet as mature as I need it to be. For one, there is no port of MIT pthreads, which is required for MySQL. Several of my users depend on MySQL, so until that is fixed, I can’t consider using my PowerMac. Also, because of the need to boot using Open Firmware, I cannot set up my PowerMac to boot unattended. Since my machine is colocated, I would have to be able to run down to the colocation facility if anything ever happened to it. That’s fine if I’m in the city, but what happens when I’m travelling in Europe?

SPARC is nice, but expensive. If I could afford a nice UltraSPARC, I would. However, this project started as a way to have a home for geeks; coming up with a minimum of $3000 for something I didn’t even plan to charge for wasn’t an option.

Alpha seems too much like PC hardware, but I’d certainly be willing to give it a try should someone send me an old Alpha box.

With MIPS, again, the issue is price. I’ve always respected the quality of SGI hardware, so I’d definitely set one up if one were donated.

StrongARM is decent. I even researched this a bit; I can get an ATX motherboard from the UK with a 233 mhz StrongARM for about 310 quid. Not too bad.

But short of all of that, I had a nice Amiga 4000 with a 66 mhz 68060, 64 bit ram, and wide ultra SCSI on board. Now what impresses me about this hardware is that I’ve run it constantly. When I went to New Orleans last year during the summer, I left it in the apartment, running, while the temperatures were up around 100 degrees. When I came back, it was fine. Not a complaint.

That’s the way it’s always been with all of my Amigas. I plug them in, they run; when I’m done, I turn off the monitor. So when I was considering what computer to use as a server when I’d be paying for a burstable 10 Mbps colocation, I wanted something that would be stable and consistent.

Hence Amiga.

One of my users, after reading your letter (and, I guess, Jerry’s), thought that I should mention the load average of the server; I assume this is because of the indirectly stated assumption that a 66 mhz 68060 is just squeaking by. To clarify that, a 66 mhz 68060 is faster per mhz than any Pentium by a measurable margin when using either optimised code (such as a distributed.net client) or straight compiled code (such as LAME). We get about 25,000 hits a day, for a total of about 200 megs a day, which accounts for one eighth of one percent of the CPU time. We run as a Stratum 2 time server for several hundred computers, we run POP and IMAP services, sendmail, and we’re the primary nameserver for perhaps a hundred machines. With a distributed.net client running, our load average hovers around 1.18, which means that without the dnet client, we’d be idle most of the time.

If that weren’t good enough, NetBSD 1.5 (we’re running 1.4.2) has a much improved virtual memory system (UVM), improvements and speedups in the TCP stack (and complete IPv6 support), scheduler enhancements, good softdep support in the filesystem (as if two 10k rpm 18 gig IBM wide ultra drives aren’t fast enough), and more.

In other words, things are only going to get better.

The other question you raise (sort of) is why Linux gets so much more attention than the BSD flavors. I’m still trying to figure that one out. Part of it is probably due to the existence of Red Hat and Caldera and others. FreeBSD gets some promotion from Walnut Creek/BSDi, but one only has to look at the success of Slackware to see how that compares.

It’s all hype; people love buzz words, and so a cycle begins: people talk about Linux, companies spring up to provide Linux stuff, and people hear more and talk more about Linux.

It’s not a bad thing; anything that moves the mainstream away from Microsoft is good. However, the current trend in Linux is not good. Red Hat (the company), arguably the biggest force in popularising Linux in the US, is becoming less and less like Linux and more and more like a software company. They’re releasing unstable release after unstable release with no apologies. Something I said a little while ago, and someone has been using as his quote in his email:

In the Linux world, all of the major distributions have become companies. How much revenue would Red Hat generate if their product was flawless? How much support would they sell?

I summarise this by saying that it is no longer in their best interest to have the best product. It appears to be sufficient to have a working product they can use to “ride the wave” of popularity of Linux.

I used Linux for a long time, but ultimately I was always frustrated with the (sometimes significant) differences between the distributions, and sometimes the differences between versions of the same distribution. Why was it that an Amiga running AmigaDOS was more consistent with Apache and Samba docs than any particular Linux? Where was Linux sticking all of these config files, and why wasn’t there documentation saying where the stuff was and why?

When I first started using BSD, I fell in love with its consistency, its no bull attitude towards ports and packages, and its professional and clean feel. Needless to say, I don’t do much linux anymore.

It may well be due to the people involved. Linus Torvalds is a likeable guy, a smart guy, easily identifiable by a largely computer illiterate press as an anti-Gates. And he looks the part. Bob Young is loud and flamboyant. Caldera’s the company that sued Microsoft and probably would have won if it hadn’t settled out of court. Richard Stallman torques a lot of people off, but he’s very good at getting himself heard, and the GPL seems designed at least in part to attract attention. The BSD license is more free than the GPL, but while freedom is one of Stallman’s goals, clearly getting attention for his movement is another, and in that regard Stallman succeeds much more than the BSD camp. The BSD license may be too free for its own good.

Yes, there aren’t many “figureheads” for BSD; most of the ones I know of don’t complain about Linux, whereas Linux people often do complain about the BSD folks (the major complaint being the license).

I know Jerry pays more attention to Linux than the BSDs partly because Linux has a bigger audience, but he certainly knows more about Linux than about any other Unix. Very soon after he launched his website, a couple of Linux gurus (most notably Moshe Bar, himself now a Byte columnist) started corresponding with him regularly, and they’ve made Linux a reasonably comfortable place for him, answering his questions and getting him up and going.

So then it should be their responsibility, as Linux advocates, to give Jerry a slightly more complete story, in my opinion.

As for the rest of the press, most of them pay attention to Linux only because of the aforementioned talking heads. I have a degree in journalism from supposedly the best journalism school in the free world, which gives me some insight into how the press works (or doesn’t, as is usually the case). There are computer journalists who get it, but a good deal of them are writing about computers for no reason in particular, and their previous job and their next job are likely to be writing about something else. In journalism, if three sources corroborate something, you can treat it as fact. Microsoft-sympathetic sources are rampant, wherever you are. The journalist probably has a Mac sympathy since there’s a decent chance that’s what he uses. If he uses a Windows PC, he may or may not realize it. He’s probably heard of Unix, but his chances of having three local Unix-sympathetic sources to use consistently are fairly slim. His chances of having three Unix-sympathetic sources who agree enough for him to treat what they say as fact (especially if one of his Microsofties contradicts it) are probably even more slim.

Which furthers my previous point: Jerry’s Linux friends should be more complete in their advocacy.

Media often seems to desire to cater to the lowest common denominator, but it is refreshing to see what happens when it doesn’t; I can’t stand US news on TV, but I’ll willingly watch BBC news, and will often learn more about US news than if I had watched a US news program.

But I think that part of the problem, which is compounded by the above, is that there are too many journalists who are writing about computers, rather than computer people writing about computers.

After all, which is more presumptuous: a journalist who thinks that he/she can enter the technical world of computing and write authoritatively about it, or a computer person who attempts to be a part-time journalist? I’d prefer the latter, even if it doesn’t include all of the accoutrements that come from the writings of a real journalist.

And looking at the movement as a whole, keep in mind that journalists look for stories. Let’s face it: A college student from Finland writing an operating system and giving it away and millions of people thinking it’s better than Windows is a big story. And let’s face it, RMS running around looking like John the Baptist extolling the virtues of something called Free Software is another really good story, though he’d get a lot more press if he’d talk more candidly about the rest of his life, since that might be the hook that gets the story. Can’t you see this one now?

Yes. Both of those stories would seem much more interesting than, “It’s been over three years and counting since a remote hole was found in OpenBSD”, because it’s not sensationalistic, nor is it interesting, nor can someone explain how you might end up running OpenBSD on your appliances (well, you might, but the fact that it’s secure means that it’d be as boring as telling you why your bathtub hasn’t collapsed yet).

Richard Stallman used to keep a bed in his office at the MIT Artificial Intelligence Lab.

He slept there. He used the shower down the hall. He didn’t have a home outside the office. It would have distracted him from his cause: Giving away software.

Stallman founded the Free Software movement in 1983. Regarded by many as the prophet of his movement (and looking the part, thanks to his long, unkempt hair and beard), Stallman is both one of its most highly regarded programmers and perhaps its most outspoken activist, speaking at various functions around the world.

Linux was newsworthy, thanks to the people behind it, way back in 1993 when hardly anyone was using it. Back then, they were the story. Now, they can still be the story, depending on the writer’s approach.

If there are similar stories in the BSD camp, I’m not aware of them. (I can tell you the philosophical differences between OpenBSD,  NetBSD and FreeBSD and I know a little about the BSD directory structure, but that’s where my knowledge runs up against its limits. I’d say I’m more familiar with BSD than the average computer user but that’s not saying much.) But I can tell you my editor would have absolutely eaten this up. After he or she confirmed it wasn’t fiction.

The history is a little dry; the only “juicy” part is where Berkeley had to deal with a lawsuit from AT&T (or Bell Labs; I’m not doing my research here) before they could make their source free.

Nowadays, people are interested because a major layer of Mac OS X is BSD, and is taken from the FreeBSD and NetBSD source trees. Therefore, millions of people who otherwise know nothing about BSD or its history will end up running it when Mac OS X Final comes out in January; lots of people already are running Mac OS X Beta, but chances are good that the people who bought the Beta know about the fact that it’s running on BSD.

And it’s certainly arguable that BSD is much more powerful and robust than Windows 2000. So there’s a story for you. Does that answer any of your question?

Yes; I hope I’ve clarified my issues, too.

Neat site! I’ll have to keep up on it.

Thanks,
John Klos

Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like MacOS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner–it was a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line. That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though it’s the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself if I had the hardware available or was willing to pay for the hardware necessary.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft’s done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence and the public knows this. And Intel and AMD have done a good job of marketing their high-end CPUs, resulting in people tending to lay blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PC parts, but most people would be surprised to hear you can still get parts at all, let alone that some are interchangeable.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix NetFinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i`m having trouble with my canon bjc-210printer…

i`m having trouble with my canon bjc210 printer it`s printing every thing all red..Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge while others make it a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com page).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can have precisely the effect you describe at times. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend them for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
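For what it’s worth, the MTU tweaking programs of that era generally did their work by writing a MaxMTU value under the NetTrans keys in the Win9x registry, so checking for that value is one way to tell whether a tweak is still in effect. Here is a rough sketch of that check in Python, assuming the standard winreg module; the instance numbers vary from machine to machine, and on a real Win98 box you would just browse to the key in regedit:

import winreg
# Rough sketch: look for a leftover MaxMTU tweak under the Win9x NetTrans keys.
# Assumption: this path only exists on Win9x; deleting MaxMTU (or setting it
# back to 1500) undoes the tweak.
NETTRANS = r"System\CurrentControlSet\Services\Class\NetTrans"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NETTRANS) as base:
    index = 0
    while True:
        try:
            instance = winreg.EnumKey(base, index)   # "0000", "0001", ...
        except OSError:
            break                                    # no more instances
        with winreg.OpenKey(base, instance) as key:
            try:
                value, _ = winreg.QueryValueEx(key, "MaxMTU")
                print(instance, "MaxMTU =", value)   # a tweak is in effect
            except FileNotFoundError:
                print(instance, "MaxMTU not set (default 1500)")
        index += 1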
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel and add back in TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and eliminate the corrupted TCP/IP possibility. These are remote, but possible. Though your reinstall should have eliminated that possibility…
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple weeks ago so I don’t rule them out). Also try swapping in a different cable or NIC in your wife’s machine. Cables of course go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a longshot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is your Web browser itself. Whichever browser you’re using, have you by any chance tried installing and testing the other one to see if it has the same problems?
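One quick way to narrow that down is to time the name lookup separately from the page fetch. Here is a rough Python sketch of the idea; the hostname is just a placeholder:

import socket
import time
import urllib.request
# Rough sketch: if the DNS lookup alone takes several seconds, the problem is
# the DNS settings, not the connection itself. Hostname is a placeholder.
host = "www.example.com"
start = time.time()
addr = socket.gethostbyname(host)                  # DNS lookup only
print("DNS lookup:", addr, round(time.time() - start, 2), "seconds")
start = time.time()
with urllib.request.urlopen("http://" + host + "/") as resp:
    resp.read()                                    # full page fetch
print("Page fetch:", round(time.time() - start, 2), "seconds")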
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, or when it does perform well, it seems to be by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

Protecting your privacy online

If you’re concerned about Amazon, or online privacy in general… On a serious note, Amazon’s policies are gathering attention. As one who, as Pournelle puts it, “makes a living showing off” (and I have more or less since the age of 16), I’ve never worried about privacy. I quickly got used to the idea that if I drove down to Rally’s for a burger, there was a decent chance that someone who knew who I was would see me doing it, and that didn’t bother me much. Once I started seriously writing about computers, I couldn’t go into computer stores without getting a bunch of questions, not to mention introductions (“Hey! This is Dave Farquhar, the computer columnist for the Missourian!”) And of course people wanted to know what I was buying and what I thought of it and/or what I was planning to do with it. That didn’t bother me much either. If people like the stuff I write and respect my opinion enough to care that I like Rally’s hamburgers and Maxtor hard drives, well, that’s a high compliment.
It was a little different after I moved to St. Louis–I had a big crowd of people to lose myself in, but I still have far less privacy than the Average Joe.

Privacy? Never had it. Never really wanted it. But, as one of my friends at work is so fond of pointing out, “We’re not all like you.”

So. How to solve the Amazon (or other Web site) problem if you’re not like me? Spread misinformation. How? Easy. Go get Proxomitron, which, in addition to blocking ads, offers to reject all cookies for you. It also offers to lie about your referring page (it always says you came from a Shonen Knife fan site), your browser version, browser type, and even your OS (the default is Win67, which makes for some good questions. Windows 1967? Windows 2067? 67-bit Windows?). If you’re paranoid that too many people will use Proxomitron and see the pattern, you can edit the filters yourself. (Try telling ’em you’re running Internet Explorer 7.0 under CP/M 2.2. That’ll get a laugh.) It’s a nice tool.
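Proxomitron does this at the proxy level with its own filter language, but the underlying trick is just rewriting the headers your browser sends. Here is a rough Python sketch of the same idea (not Proxomitron’s actual filter syntax), with made-up header values:

import urllib.request
# Rough sketch: send deliberately bogus identifying headers so the site's logs
# collect misinformation instead of facts. The header values are made up.
req = urllib.request.Request(
    "http://www.example.com/",
    headers={
        "User-Agent": "Mozilla/4.0 (compatible; MSIE 7.0; CP/M 2.2)",
        "Referer": "http://www.example.com/shonen-knife-fan-page/",
    },
)
with urllib.request.urlopen(req) as resp:
    page = resp.read()
print(len(page), "bytes fetched with spoofed headers")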

Remember: to the people collecting it, incorrect information is far worse than no information. If you want to stop people from gathering information, the trick isn’t to refuse. It’s to give them misinformation. I’m a professional information gatherer. Trust me on this.

Optimizing Windows startup

Mail. More Windows optimization questions.

From: ChiefZeke
Subject: Re: Items to consider
To: Dave Farquhar

Dave,

Would things be a bit faster if the user opted to start programs via the ‘RUN’ function of the Registry rather than via the Startup folder? I have seen this option mentioned in a couple of magazine articles.

Jerry

I imagine they would be slightly faster, since the file and path names would all be stored together in registry keys, as opposed to individual shortcut icons, one per program, that each have to be located and read separately.

You might also use run= strings in win.ini instead–I suspect that technique would be faster still, being a flat text file rather than a convoluted database.
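For reference, here’s a minimal sketch of what the two non-Startup-folder approaches look like. The program path below is made up; substitute whatever you actually load at boot. A .reg file you could merge to add an entry to the Run key (the value name is arbitrary, and backslashes get doubled in .reg syntax):

REGEDIT4

; hypothetical example: load a utility at startup from the registry Run key
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Run]
"SomeUtility"="c:\\utils\\someutil.exe"

The win.ini equivalent goes in the [windows] section; run= starts programs normally and load= starts them minimized, with multiple programs separated by spaces on the same line:

[windows]
run=c:\utils\someutil.exe
load=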

Now, whether doing this would make any noticeable difference on a modern PC is another question. We may be talking shaving fractions of a second off your boot time. I imagine the difference would be more noticeable on marginal machines (though I’m not very eager to re-commission my 486SX/20 to try it). I just saw a 486DX4/75 laptop today that takes 1.5 minutes to boot Windows even without any items in any of the startup places and a fully optimized msdos.sys–a decked-out modern system similarly configured could boot in about 15 seconds. I can’t imagine your system needing much more than 20 seconds to go from POST to desktop (I’m not familiar with modern Western Digital drives like you have, but I imagine their performance must be comparable to the Quantums and Maxtors I use).
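While I’m on the subject of msdos.sys, here’s a sketch of the kind of [Options] entries I mean. Treat it as an illustration, not a prescription: your file will already have other lines, it carries read-only/hidden/system attributes you’ll need to clear and restore, and the block of x-filled filler lines at the bottom has to stay so the file remains over 1K.

[Options]
BootDelay=0
Logo=0
BootMenu=0
DrvSpace=0
DblSpace=0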

This trick dates back to the Win3.1 days, and it was a really good idea way back then–the startup group actually consumed system resources, plus valuable entries in the Windows directory, so eliminating the startup group and placing items in win.ini could seriously improve a system’s performance back then. Today, Win9x has much better resource management, and hard drives and CPUs are much faster, so you don’t hear about it as much anymore.

Very little else for today. I found my copy of the Lost Treasures of Infocom (both volumes) this week, including Bureaucracy, a text adventure I was never able to beat. I found a walk-through that got me past the part that had me stuck.

It’s a whole lot faster on my PC than it was on my Commodore 128 (which was the machine I originally bought it for, what, 11 years ago?). Amazing how much fun a 12K executable paired up with a 240K data file can provide… (And I’m running this on a dual-processor machine with 96 MB RAM and an 8.4-gig hard drive, both due to be upgraded? Something’s wrong here…)

Macintosh buying advice

What’s up with someone asking me for Mac advice? Yeah, Dan Bowman is in the process of selling his soul to (or at least buying a computer from) some egotist in Cupertino.

From: “Bowman, Dan”
Subject: Macs
To: Dave Farquhar
Dead serious request:

We keep getting hammered by graphic artists and printers; the Mac is ubiquitous in this arena locally. I’ve proposed we purchase a Mac for the GM to use (he’s a passable artist and knows what he wants and is not afraid to do it his way).

What configuration (for that matter, what machine) should I look at to price this? We’re bidding another contract, and the cost of the machine would likely be saved twice over in artist fees and the GM’s time (time he could spend just doing it).

Any bets on programs?

Networking issues?

Thanks. Not my idea of fun, but in this case it’s the right tool for the job if he can make it work.

Dan

I can’t recommend packages; they’ve gotta be whatever he’s comfortable working with. Rent some time at Kinko’s if need be to determine that. I definitely suggest avoiding Adobe PageMaker, because they’re abandoning the thing. Let me take back what I just said. If you can avoid using Adobe products, do it, because the company’s policies… Umm, just take every bad thing I’ve ever said about Microsoft, multiply it by about 10, and you’ve got Adobe. You may not be able to avoid Photoshop, but avoid the rest of it if you can. Macromedia and Quark, between the two of them, make just about everything you need.

If he wants to use a jillion fonts, you need a font management program, because the self-styled King of Desktop Publishing can’t juggle more than 254 fonts, I believe. I’m not certain on the number. Extensis Suitcase will do the job.

Get AlSoft Disk Warrior, Micromat Tech Tool Pro, and Symantec Norton Utilities. Once a month (or whenever you have problems), run Apple’s Disk First Aid (comes with the system), then Disk Warrior, Tech Tool Pro, and Norton Disk Doctor, in that order. Fix all problems. They’ll find a bunch. Also get Font Agent, from Insider Software, and run it once a month. It’ll want to delete any bitmapped fonts over 12 point. Don’t do that, but let it do everything else it wants. That helps a ton.

You’ll spend $500 on utilities software, but if you want your bases covered, you need them. Get them, use them, and you won’t have problems. Neglect to get them, and there’ll be no end to your problems, unless he never uses it.

Hardware: Get a 400-MHz G4, 256 MB RAM, and an IDE disk (poorly threaded, cooperative-multitasking OSs don’t know what to do with SCSI). Frequently you can get a better price by getting the smallest disk possible, then buying a Maxtor drive at your local reseller. I know they were charging $150 a month ago to upgrade a 10-gig disk to a 20-gig disk, and you can buy a 20-gig disk for $150. Video, sound, etc. aren’t options. If 450 MHz is the slowest you can get, get that. MacOS doesn’t do a good enough job of keeping the CPU busy to warrant the extra bucks for a higher-end CPU. You’ll want the memory because you have to assign each app’s memory usage (it’s not dynamic like Windows), and it’s not a bad idea to assign 64 MB to a killer app. I also hear that G4s are totally unstable with less than 256 megs. I can’t confirm that. We’ve got G4s with more and we’ve got G4s with less, but I haven’t seen both in the hands of a power user yet.

Networking: NT’s Services for Macintosh are worthless. Don’t use NT for a print server for a Mac (it’ll ruin the prints), and don’t use it as a file server if you can help it (it’ll crash all the time). Linux isn’t much better, but it’s better. (It’ll just crash some of the time, but at least you can restart the daemons without rebooting.) I don’t know if MacOS 9 can talk to printers through TCP/IP or if they still have to use AppleTalk. AppleTalk is an ugly, nasty, very chatty protocol–it makes ugly, nasty NetBEUI look beautiful–but it’s what you get. Turn on AppleTalk on one of your network printers and print to it that way. One Mac and one printer won’t kill a small network, though a big enough network of Macs can keep a 10-megabit network totally overwhelmed with worthless chatter. Killer DTP apps don’t like their PostScript to be reinterpreted, and that’s one of the things NT Server does to mung up the jobs. So that’s the only workaround.

Multitasking: Don’t do it. When I use a Mac like an NT box, keeping several apps and several documents open at once, it’ll crash once a day, almost guaranteed. Don’t push your luck. It’s an Amiga wannabe, not a real Amiga. (Boy, I hope I’ve got my asbestos underwear handy.)

Attempting to optimize Windows with explicit paths

An interesting idea, this. But I’m not sure it’s worth the required time investment to see if it makes a difference for you.

From: ChiefZeke
Subject: Items to consider
To: dfarq@swbell.net

Dave,

A few more items to consider:

The various *.ini files usually point to files to load with entries like oemfonts.fon=vgaoem.fon. Would it not be better to edit all the files so that the full path is used instead, as in:
oemfonts.fon=c:\windows\fonts\vgaoem.fon ?

Also, when Folder Options – File Types – Registered File Types is reviewed, many items are listed with commands similar to rundll setup.dll ***. Again, would it not be better for the user to edit the complete listing so that the complete path is used, as in:
c:\windows\rundll.exe c:\windows\system\setup.dll *** ?

While I’m well aware of the tedium involved in doing the necessary editing, I would think the end result would be worth it.

Jerry

Since Windows only looks in \Windows\Fonts for fonts, I don’t see how specifying a pathname there would help matters, and it might hurt. And I believe the ini files look for device drivers and the like in \Windows\System and possibly \Windows\System32 exclusively.

The registered file types suggestion is an interesting idea. Since Windows traverses the path (normally C:\Windows;C:\Windows\System;C:\Windows\Command) looking for that stuff, putting a pathname in front of something that lives in C:\Windows\Command or C:\Windows\System should, in theory, let Windows find the file slightly faster. How much faster depends on how full those directories are, of course.
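If anyone wants to see where those commands actually live without clicking through Folder Options, they’re the default values of keys like HKEY_CLASSES_ROOT\<type>\shell\open\command. Here’s a hypothetical before-and-after sketch in .reg export notation (backslashes doubled), using the text-file association as the example; the exact entries on your machine will differ, so export the key before you touch it.

; before: Windows has to walk the path to find the program
[HKEY_CLASSES_ROOT\txtfile\shell\open\command]
@="notepad.exe %1"

; after: explicit path, no search needed
[HKEY_CLASSES_ROOT\txtfile\shell\open\command]
@="c:\\windows\\notepad.exe %1"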

I wouldn’t start editing without first making a full backup of the \Windows tree (or at the very least, a backup copy of the registry). I fear it might be an awful lot of work for very little gain. I’m always interested in even small speedups, and I’m sure I’ll end up trying it at some point (when I’m not banging my head against the wall learning NFS, NIS and NDS so I can write about them).
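Two quick ways to get that registry backup before experimenting, for what it’s worth: regedit can export the whole thing to a text file you can merge back later, and Windows 98 has scanreg, which stashes dated cabinet copies of system.dat and user.dat.

regedit /e c:\regback.reg
scanreg /backup

(The regedit export works on both 95 and 98; scanreg is a 98-only tool, and its backups land in \Windows\Sysbckup.)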

Proceed with caution, but if you try it I’m of course very interested in the results.

From: ChiefZeke
Subject: Re: Items to consider
To: Dave Farquhar

Dave,
It wasn’t only the .FON files I was talking about. I was also thinking of the .DRV, .ACM, etc. files. In fact, I’ve already edited SYSTEM.INI and WIN.INI to add the path in all those places that I’ve determined warrant it.

Also, while it took about three hours, I’ve edited the entries for registered file types, and that went smoothly. I feel there is no need to back up anything, at this time, to accomplish that task. When you’re doing the editing, the path and filename are checked and any errors get a ‘beep’. Further, long filenames are also ‘beeped’ if they are not enclosed in quotation marks.

Since judgments about how fast our computers really are tend to be subjective, I will confess I noticed no difference in speed during Windows startup or program loading.

Jerry

Thanks for investigating that. It’s hard to know which tricks are going to make a difference and which ones won’t. I suspect specifying a path would help really slow systems with extremely crammed system directories more than modern systems with optimized directories.

Underachieving Win9x network performance

David Yerka asked what can cause really slow network performance in Windows 95/98. I mailed him, suggesting maybe someone had run MTUSpeed or some similar utility on the machine to optimize dialup performance. LAN performance tends to go into the toilet after doing that. (Voice of experience speaking… My Win95 box was a real dog until that light went off–long after my book was on store shelves, of course.) He responded with some useful information.

From: David M. Yerka
Subject: Re: Slow Win9x network performance
To: Dave Farquhar

Hi Dave:

Thanks for the reply; you win the big bucks! That is exactly what is going on. Apparently Win9x only sets the MaxMTU in one place:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\0000

and while additional information makes this key appear to belong only to dialup networking, apparently it is also the place where Win9x picks up the settings for the network. You were right, too. I remembered (actually, before I got your email) that someone had used MTUSpeed on this machine to optimize dialup before I convinced my clients to get a “webramp appliance” to do sharing. Unfortunately, it appears that even if you tell MTUSpeed to “remove all settings,” it leaves the MaxMTU setting at, say, 576 (which is usually the best for dialup ISPs). You must explicitly change the setting in MTUSpeed to 1500 and reboot BEFORE having MTUSpeed “remove all settings.”

Interestingly, I found that you could sort of hack the registry with a combination of steps and seemingly get both optimizations: stick a string value of MaxMTU="1500" in the key below:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\netservice000

Use MTUSpeed to set the MTU to 1500, then reboot.

Edit the first key (…\NetTrans\0000) to MaxMTU="576" and reboot.

Checking with MTUSpeed (and don’t under any circumstances change anything) shows the MTU to be 576, while network performance approaches 950K on a 10BaseT UTP network.

Isn’t Windows just wonderful and weird… or something!

Thanks again, David Yerka
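For anyone who’d rather fix this directly than wrestle with MTUSpeed, here’s a minimal sketch of the registry change that puts MaxMTU back where a LAN wants it. I’m assuming the TCP/IP-to-Ethernet binding is instance 0000 under NetTrans; on a machine with dialup networking installed there will be more than one instance, so check which one is yours (and back up the registry) before merging anything. MaxMTU is a string value, not a DWORD.

REGEDIT4

; assumes your TCP/IP-to-Ethernet binding is NetTrans instance 0000; verify first
[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\0000]
"MaxMTU"="1500"

Reboot afterward; Win9x only reads the setting at startup.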