Scanner troubleshooting secrets

~Mail Follows Today’s Post~

Scanner wisdom. One of the things I did last week was set up a Umax scanner on a new iMac DV. The scanner worked perfectly on a Windows 98 PC, but when I connected it to the Mac it developed all sorts of strange diseases–not warming up properly, only scanning 1/3 of the page before timing out, making really loud noises, crashing the system…

I couldn’t resolve it, so I contacted Umax technical support. The tech I spoke with reminded me of a number of scanner tips I’d heard before but had forgotten, and besides that, I rarely if ever see them in the scanner manuals.

  • Plug scanners directly into the wall, not into a power strip. I’ve never heard a good explanation of why scanners are more sensitive to this than any other peripheral, but I’ve seen it work.
  • Plug USB scanners into a powered hub, or better yet, directly into the computer. USB scanners shouldn’t need power from the USB port, since they have their own power source, but this seems to make a difference.
  • Download the newest drivers, especially if you have a young operating system like MacOS 9, Mac OS X, Windows ME, or Windows 2000. It can take a little while for the scanner drivers to completely stabilize. Don’t install off the CD that came with the scanner, because it might be out of date. Get the newest stuff from the manufacturer’s Web site.
  • Uninstall old drivers before installing the new ones. This was the problem that bit me. The new driver didn’t totally overwrite the old one, creating a conflict that made the scanner go goofy.
  • Buy your scanner from a company that has a track record of providing updated drivers. Yes, that probably means you shouldn’t buy the $15 scanner with the $25 mail-in rebate. Yes, that means don’t buy HP. Up until a couple of years ago, getting NT drivers out of HP was like pulling teeth; now HP is charging for Windows 2000 drivers. HP also likes to abandon and then pick back up Mac support on a whim. Terrible track record.

Umax’s track record is pretty darn good. I’ve downloaded NT drivers for some really ancient Umax scanners after replacing old Macs with NT boxes. I once ran into a weird incompatibility with a seven-year-old Umax scanner–the machine was a B&W G3 with a wide SCSI controller (why, I don’t know) running Mac OS 8.6. Now that I think about it, I think the incompatibility was with the controller card. The scanner was discontinued years ago (before Mac OS 8 came out), so expecting them to provide a fix was way out of line. That’s the only problem I’ve ever had with a Umax that they didn’t resolve, so when I spec out a scanner at work, Umax is always on my short list.

And here’s something I just found interesting. Maybe I’m the only one. But in reading the mail on Jerry Pournelle’s site, I found this. John Klos, administrator of sixgirls.org, takes Jerry to task for saying a Celeron can’t be a server. He cites his 66 MHz 68060-based Amiga 4000, which apparently acts as a mail and Web server, as proof. Though it’s the most powerful m68k-based machine ever made, its processing power pales next to any Celeron (save the original cacheless Celeron 266 and 300).

I think the point he was trying to make was that Unix plays by different rules. Indeed, when your server OS isn’t joined at the hip to a GUI and a Web browser and whatever else Gates tosses in on a whim, you can do a lot more work with less. His Amiga would make a lousy terminal server, but for serving up static Web pages and e-mail, there’s absolutely nothing wrong with it. Hosting a bunch of Web sites on an Amiga 4000 just because I could sounds very much like something I’d try myself, if I had the hardware available or were willing to pay for the necessary hardware.

But I see Jerry Pournelle’s point as well.

It’s probably not the soundest business practice to advertise that you’re running off a several-year-old sub-100 MHz server, because that makes people nervous. Microsoft has done a pretty admirable job of pounding everything slower than 350 MHz into obsolescence, and the public knows it. Intel and AMD have done a good job of marketing their high-end CPUs, so people tend to lay blame at the CPU’s feet if it’s anything but a recent Pentium III. And, well, if you’re running off a shiny new IBM Netfinity, it’s very easy to get it fixed, or if need be, to replace it with another identical one. I know where to get true-blue Amiga parts, and I even know which ones are interchangeable with PC parts, but you might well be surprised to hear that you can still get parts at all, let alone that some are interchangeable.

But I’m sure there are far, far more sub-100 MHz machines out there in mission-critical situations functioning just fine than anyone wants to admit. I know we had many at my previous employer, and we have several at my current job, and it doesn’t make me nervous. The biggest difference is that most of them have nameplates like Sun and DEC and Compaq and IBM on them, rather than Commodore. But then again, Commodore’s reputation aside, it’s been years since I’ve seen a computer as well built as my Amiga 2000. (The last was the IBM PS/2 Model 80, which cost five times as much.) If I could get Amiga network cards for a decent price, you’d better believe I’d be running that computer as a firewall/proxy and other duties as assigned. I could probably get five years’ uninterrupted service from old Amy. Then I’d just replace her memory and get another ten.

The thing that makes me most nervous about John Klos’ situation is the business model’s dependence on him. I have faith in his A4000. I have faith in his ability to fix it if things do go wrong (anyone running NetBSD on an Amiga knows his machine better than the onsite techs who fix Netfinity servers know theirs). But there’s such a thing as too much importance. I don’t let Apple certified techs come onsite to fix our Macs anymore at work, because I got tired of them breaking other things while they did warranty work and having to fix three things after they left. I know their machines better than they do. That makes me irreplaceable. A little job security is good. Too much job security is bad, very bad. I’ll be doing the same thing next year and the year after that. It’s good to be able to say, “Call somebody else.” But that’s his problem, not his company’s or his customers’.

~~~~~~~~~~

From: rock4uandme
To: dfarq@swbell.net
Sent: Wednesday, October 25, 2000 1:22 PM
Subject: i’m having trouble with my canon bjc-210 printer…

i’m having trouble with my canon bjc210 printer it’s printing every thing all red.. Can you help???
 
 
thank you!!    john c
 
~~~~~~~~~

Printers aren’t my specialty and I don’t think I’ve ever seen a Canon BJC210, but if your printer has replaceable printheads (some printers make the printhead part of the ink cartridge while others make it a separate component), try replacing them. That was the problem with the only Canon printer I’ve ever fixed.
 
You might try another color ink cartridge too; sometimes those go bad even if they still have ink in them.
 
If that fails, Canon does have a tech support page for that printer. I gave it a quick look and it’s a bit sketchy, but maybe it’ll help. If nothing else, there’s an e-mail address for questions. The page is at http://209.85.7.18/techsupport.php3?p=bjc210 (to save you from navigating the entire www.ccsi.canon.com site).
 

I hope that helps.

Dave
 
~~~~~~~~~~
 

From: Bruce Edwards
Subject: Crazy Win98 Networking Computer Problem

Dear Dave:

I am having a crazy computer problem which I am hoping you or your readers
may be able to give me a clue to.  I do have this posted on my daily
journal, but since I get very little traffic, I thought your readership or
yourself may be able to help.  Here’s the problem:

My wife’s computer suddenly and inexplicably became very slow when accessing
web sites and usually when accessing her e-mail.  We access the internet
normally through the LAN I installed at home.  This goes to a Wingate
machine which is connected to the aDSL line allowing shared access to the
internet.

My computer still sends and receives e-mail and accesses the web at full
speed.  Alice’s computer now appears to access the web text at about the
speed of a 9600 baud modem with graphics coming down even more slowly if at
all.  Also, her e-mail (Outlook Express) usually times out when going
through the LAN to the Wingate machine and then out over the internet. 
The LAN is working since she is making a connection out that way.

File transfer via the LAN between my PC and hers goes at full speed.
Something is causing her internet access to slow to a crawl while mine is
unaffected.  Also, it appears to be only part of her internet access.  I can
telnet out from her computer and connect to external servers very fast, as
fast as always.  I know telnet is just simple text, but the connection to
the server is very rapid too while connecting to a server via an http
browser is much much slower and then, once connected, the data flows so slow
it’s crazy.

Also, dial-up and connect to the internet via AOL and then use her mail
client and (external to AOL) browser works fine and is as speedy as you
would expect for a 56K modem.  What gives?

I tried reinstalling windows over the existing set-up (did not do anything)
and finally started over from “bare metal” as some like to say.  Reformat
the C drive.  Reinstall Windows 98, reinstall all the drivers, apps, tweak
the configuration, get it all working correctly.  Guess what?  Same slow
speed via the aDSL LAN connection even though my computer zips out via the
same connection.  Any suggestions?

Sincerely,

Bruce W. Edwards
e-mail:  bruce@BruceEdwards.com
Check www.BruceEdwards.com/journal  for my daily journal.

Bruce  🙂
Bruce W. Edwards
Sr. I.S. Auditor  
~~~~~~~~~~

From: Dave Farquhar [mailto:dfarq@swbell.net]
Sent: Monday, October 23, 2000 6:16 PM
To: Edwards, Bruce
Cc: Diana Farquhar
Subject: Re: Crazy Win98 Networking Computer Problem

Hi Bruce,
 
The best thing I can think of is your MTU setting–have you run any of those MTU optimization programs? Those can have precisely the effect you describe at times. Try setting your MTU back to 1500 and see what that does. While I wholeheartedly recommend them for dialup connections, MTU tweaking and any sort of LAN definitely don’t mix–to the point that I almost regret even mentioning the things in Optimizing Windows.
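
(For reference, and in case it saves someone a support call: on Windows 9x the MTU override lives in the registry, not in any control panel. Treat this as a sketch rather than gospel–the 0000 instance number varies from machine to machine depending on which NetTrans binding carries TCP/IP, so check the neighboring keys before editing. Setting MaxMTU back to the Ethernet default looks like this; deleting the MaxMTU value entirely also restores the default.)

```reg
REGEDIT4

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\0000]
"MaxMTU"="1500"
```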
 
Short of that, I’d suggest ripping out all of your networking protocols and adapters from the Network control panel, then adding back TCP/IP and only the other things you absolutely need. This’ll keep Windows from getting confused and trying to use the wrong transport, and it eliminates the possibility of a corrupted TCP/IP stack. Both causes are remote, but possible–though your reinstall should have eliminated the corruption possibility…
 
If it’s neither of those things, I’d start to suspect hardware. Make sure you don’t have an interrupt conflict (rare these days, but I just saw one a couple weeks ago so I don’t rule them out). Also try swapping in a different cable or NIC in your wife’s machine. Cables of course go bad more frequently than NICs, though I’ve had horrible luck with cheap NICs. At this point I won’t buy any ethernet NIC other than a Bay Netgear, 3Com or Intel.
 
I hope that helps. Let me know how it goes for you.

Dave 
~~~~~~~~~~
From: Bruce Edwards

Hi Dave:
 
Thank you for posting on your web site. I thought you would like an update.
 
I verified the MTU setting was still at 1500 (it was).  I have not used one of the optimizing programs on this PC.
 
I removed all the adapters from the PC via the control panel.  Rebooted and only added back TCP/IP on the Ethernet card. 
 
I double checked the interrupts in the control panel, there do not appear to be any conflicts and all devices report proper function.
 
I still need to 100% verify the wiring/hubs.  I think they are O.K. since that PC, using the same adapter, is able to file share with other PCs on the network.  That also implies that the adapter is O.K.
 
I will plug my PC into the same hub and port as my wife’s using the same cable to verify that the network infrastructure is O.K.
 
Then, I’ll remove the adapter and try a different one.
 
Hopefully one of these things will work.
 
Cheers,
 
Bruce
~~~~~~~~~~

This is a longshot, but… I’m wondering if maybe your DNS settings are off, or if your browser might be set to use a proxy server that doesn’t exist. That’s the only other thing I can think of that can cause sporadic slow access, unless the problem is your Web browser itself. Whichever browser you’re using, have you by any chance tried installing and testing the other one to see if it has the same problems?
 
In my experience, IE 5.5 isn’t exactly the greatest of performers, or when it does perform well, it seems to be by monopolizing CPU time. I’ve gotten much better results with IE 5.0. As for Netscape, I do wish they’d get it right again someday…
 
Thanks for the update. Hopefully we can find an answer.

Dave 
~~~~~~~~~~ 

10/28/2000

~Mail Follows Today’s Post~

Microsoft hacked! In case you haven’t heard, some hackers in St. Petersburg, Russia, had access to Windows source code for three months and the intrusion was only discovered this week. This could end up being a very good thing for you and me, believe it or not. (And this is even assuming the hackers didn’t fix any of the bugs they found.) As security consultant Andrew Antipass told Wired magazine, “It is interesting in a kind of cruel way that Microsoft has been eaten by the monsters it created.”

Microsoft has always been oblivious to security in their products. The only way they were going to learn was to be bitten, and hard. Now something has happened that calls into question their network infrastructure, the security of their products (which they’ve tried to present as more secure than Unix), and even the integrity of the code they’ve produced in the last three months. Microsoft can say what they want about it being impossible to change the code–of course they’re going to say that. Will the public believe it? Some will believe anything Microsoft says. Others wisely will believe exactly the opposite of anything Microsoft says. Still others (like me) will believe the worst no matter what Microsoft says.

If this incident doesn’t force Microsoft to start taking security seriously, nothing will.

The downside, however, is that if the hackers did indeed get Windows and/or Office source code, vulnerabilities become potentially easier to spot (not that access to Linux source code has significantly increased the number of vulnerabilities–remember, most hackers are script kiddies at best, writing in batch languages, and aren’t any more proficient in C++ than you or me).

All of this overshadowed Microsoft’s Internic entry being hacked (Apple’s entry got hacked too, though less creatively), which you can read about in The Register.

Enough of this computer junk. Let’s talk about music.

Review of U2’s All That You Can’t Leave Behind. If U2 were to call it quits right now and we had to pick out U2’s defining album, this would be it. That’s not to say it’s their best album–it’s awfully difficult to match the raw energy and wonder of Boy, the raw power of Achtung Baby, and if this one sells like The Joshua Tree, it’ll only be because there are so many fewer bands making good music in 2000 than there were in 1987.

That said, U2 seems to have finally answered the quintessential question of how to sound like U2 without sounding like they’re selling out. For the past 13 years, every time U2 released an album, people expressed disappointment that it didn’t sound like The Joshua Tree. But others would point to the inevitable single track on each album that did sound like The Joshua Tree and wave it in the band’s face: Can’t you do anything original?

You can divide U2’s music into roughly three phases: 1979-1983 (Boy through War), 1984-1989 (Unforgettable Fire through Rattle and Hum), and 1991-1997 (Achtung Baby through Pop). Although the band has reinvented itself with every album — sometimes for the better and sometimes not — they tend to hold on to their sound for a couple of albums at a time before they make major changes.

But the album title might as well be referring to those sonic changes: This album manages to incorporate all of those previous sounds while not sounding too much like any of the previous albums. You could take the defining song off any previous U2 album, drop it randomly into this album’s mix, and it would manage to fit.

The Edge’s jangly guitar? It’s there. Larry Mullen’s precise, militaristic drumming? It’s there. Adam Clayton’s low, thumping bass? It’s there. And of course, there’s also Bono’s wailing vocals. The experimentation? That’s there too, and that’s the bit that always scares people.

Make no bones about it: U2 is a rock band, and this is a rock record. But listen closely, and the experimental elements are still there. The synths are there. The sequencers are there. So is the drum machine. In fact, they lead off the album. But whereas in the past they have sometimes been the focus, now they complement the band’s sound rather than defining it.

They pull out a few unexpected tricks as well. Listening to “When I Look At the World” for the first time, I half expected to hear Frankie Valli filling in on lead vocals. Bono’s soaring falsetto doesn’t reach as high anymore as Valli did in his prime, but the crafty veteran vocalist makes what he has left work. Meanwhile, in the tracks “New York” and “Grace,” Bono manages to out-Lou Reed the real article, though as a closer “Grace” just isn’t up there with U2’s great closing tracks of the past (War’s “40,” Achtung Baby’s “Love is Blindness,” Pop’s “Wake Up Dead Man,” or The Joshua Tree’s “Mothers of the Disappeared”).

The biggest surprise is track 8. In that track, titled “In A Little While,” U2 finally succeeds in sounding soulful. So much of Rattle and Hum was contrived, a bunch of Irish guys in their late 20s trying to sound like B.B. King or Bob Dylan, and they clearly hadn’t lived long enough yet to pull it off. Now in their early 40s, they nail it.

The opening track and first single, “Beautiful Day,” is a good introduction to the album. That song’s sonic elements are for the most part present throughout. Like most U2 songs, to the casual listener it sounds good immediately. As one who picks apart lyrics, I initially didn’t like the song because it seemed too superficial. So what if it’s a beautiful day? Even a no-talent Kurt Cobain wannabe like Gavin Rossdale can say that! Only upon closer listening does the real meaning surface: the story of someone who has lost everything, yet never felt better. That sounds a lot like me. It probably sounds like you too, or someone you know.

So, how’s it stack up? Most people rank War and The Joshua Tree as U2’s finest albums. I buck convention and place Achtung Baby (their amazing 1991 comeback) and Boy (their 1980 international debut) ahead of those two. At the bottom, I’d rank October, Rattle and Hum, Zooropa, The Unforgettable Fire, and Pop. All That You Can’t Leave Behind definitely blows away the lesser five albums.

However, the album falls a bit flat after the first four delightful tracks. It picks back up again for a track or two here and there, but the immediate greatness that grabbed you when you first heard The Joshua Tree or Achtung Baby just isn’t there throughout. And the superstrong emotions that drove and held together those great albums aren’t here.

This is probably the album of the year, and many bands go their entire careers without recording anything as good as U2’s worst. This effort borders on greatness, but just doesn’t quite manage to cross over.

Strong points: The first four tracks.
Weak points: “Peace on Earth;” “Wild Honey;” “In a Little While,” though good, doesn’t seem to fit (seems to be there only to settle a bet from 1989); “Grace” is a good track but ill-suited to end a U2 album.

And a survey. I’m considering a one-day-per-page format, like Frank McPherson and Chris Ward-Johnson use. When marking up by hand, a weekly format is much easier. When using Manila, it really makes no difference.

If you don’t want to join the site in order to vote, feel free to just e-mail me. Members can vote here. (I do wish there was an option to open discussions and voting to non-members. I can see why some people would want to require membership, but that should be optional. So it goes.)

~~~~~~~~~~

From: “Dan Bowman” <DanBowman@att.net>
To: <dfarq@swbell.net>
Sent: Saturday, October 28, 2000 11:03 AM
Subject: Music reviews

Now I’m really glad I spent my discretionary funds yesterday! After reading your review (the class is watching a movie while I make copies),
I’m real tempted to pick up the album.
Then again, I have a Costco run set for after class…

Have a great weekend,

dan

~~~~~~~~~~

Don’t look too hard; it’s not available until Tuesday. I got a chance to hear it a few times and I got sick of seeing reviews from people who just listened to the 15-second clips available on the music store sites and said, “This is the best U2 album ever!” based on that, so I wrote it up. Good practice anyway. I think it’s been 3 1/2 years since I’ve written a music review of any significant length.

I wanted to strike a balance between “this may be the year’s best album” and “this is the best album ever!” because it’s not (it’s not even U2’s best).  Hopefully I did that. But I remember when I wrote up a review of Pearl Jam’s No Code in 1996, people said I was too harsh and shouldn’t have compared it to the past (though looking at that album’s longevity or lack thereof, I’m inclined to think I was right).

Reviews are tricky business but I want to stay in practice, so I may start doing a review a week just to get and stay sharp.

Mac emulation and insights

I’m scaring myself. I’ve been playing around with Mac emulation on my PC at home (I can get an old Quadra or something from work for nothing or virtually nothing, but finding space to set it up properly in these cramped quarters would be an issue, especially since I’d have to give it its own keyboard and mouse and possibly its own monitor). My Celeron-400 certainly feels faster than the last 68040 I used, and I greatly prefer my clackety IBM keyboard and my Logitech mouse over anything Apple ever made, so this emulation setup isn’t bad. I’ve got MacOS 8.0 running on my Celeron 400, though on an 040 (especially an emulated 040), 7.6.1 would be much better, if by some chance I can track down an installation CD for it.

Of course, there’s the issue of software. A lot of the ancient 68K Mac software is freely available (legally) these days, and it raises the old “Are we better off now than we were 10 years ago?” question. I don’t know. I still think the software of yesterday was much leaner and meaner and less buggy. By the same token, programs didn’t necessarily work together like they do today, and the bundles of today were virtually unheard of.

Software ran anywhere from $99 to $999, and it typically did one thing. More, an outliner from Symantec (not to be confused with the Unix paging utility), made charts and outlines. That was it. And it cost around $100. The functionality that’s in MS Office today would have cost many thousands of dollars in 1990. Of course, the very same argument could be made for hardware. You couldn’t get the functionality available in a $399 eMachine for any price in 1990–there were very high-end machines in 1990 with that kind of CPU power, of course, but the applications weren’t there, because you don’t buy a supercomputer to run a word processor.

Messing around with this old Mac software gave me some insights into the machine. One of the freely available packages is Think Pascal. In high school, we did computer applications on Macs and programming (at least in the advanced programming classes I was taking) on IBM PCs. So I know Pascal, but this was my first exposure to it on the Mac. Reading some of the preliminary documentation on programming a Mac in Think Pascal gave me some insight into why the Mac has (and always had) such a rabid following. I don’t really find the Mac any easier to use than Windows (and there are some things I have to do that are far easier in Windows), but I won’t deny the Mac is a whole lot easier to program. Implementing “Hello, World!” in Think Pascal on a Mac is much easier than implementing it in C on Windows, and the Think Pascal version of “Hello, World!” makes more sense to me than even the Visual Basic version on Windows. It’s more complicated than the main() { printf("Hello, World!\n"); } you would use in DOS or Unix, but if you use all the available tools and put the dialog boxes and buttons in resources, it’s not much more complex. Programmers can rough in the GUI elements and get on with the code while handing the GUI work off to artsy people, and afterward it’s easy to use ResEdit or another resource editor to drop in the final GUI elements.
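
For illustration, here’s roughly what that minimal Think Pascal program looks like. I’m going from memory, so treat it as an approximation rather than verified code; the point is that the environment initializes the Toolbox and supplies a Text window for you:

```pascal
program Hello;
begin
  { Think Pascal's runtime opens its Text window automatically; }
  { WriteLn output lands there, with no Toolbox setup code.     }
  WriteLn('Hello, World!');
end.
```

Getting a real window, menu bar, and buttons takes more than this, of course, but that’s the part you rough into resources and polish in ResEdit later.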

And, bite my tongue, it would appear that programming the Mac was easier than programming the Amiga as well. I wrote plenty of command-line tools for the Amiga but I never mastered the GUI on that platform either.

I’m not saying anyone can program a Mac, but having attempted unsuccessfully to learn how to program effectively in Windows, I can say people who wouldn’t program in Windows can (and probably do, or at least did back in the day) program the Mac. My friends Tom Gatermann, Tim Coleman and I stand no chance whatsoever of being able to develop a decent Windows app, but we would have made a decent Mac development team with Tom and Tim handling the GUI and me writing code and all of us contributing ideas.

The next time I need a computer to do something for me that I can’t find a readily made program to do, I’m apt to load up Think Pascal on a Mac emulator and take a crack at it myself. My simple mind can handle programming that platform, and I suspect some of the innovative programs that appeared on the Mac first may have originally been written by people like me who have ideas but don’t think like a traditional programmer.

———-

From: Robert Bruce Thompson

“I can count on one hand the number of people I know who’ve ever built anything from discrete components, myself included…”

You’re hanging out with way too young a crowd. I’m only 47, and I used to build stuff from discrete components, including ham transmitters, receivers, amplifiers, and so on using *tubes*. You probably wouldn’t recognize a tube if it bit you, so I’ll explain that they were glass things kind of like light-bulbs. They were available in hundreds of types, which one used for various purposes–diodes, triodes, and so on. When they were running, they lit up with an orange light. Very pretty. And they did burn out frequently, just like light bulbs.

And I’ll bet that if I were pressed hard enough, I could even remember the resistor color codes.

Geez.

———-

Too young and too lazy. But I do know what tubes are–they’re still used in audio equipment, for one, because they give a richer tone than transistors. And I remember when I was really young, there was a drugstore we used to go to that still had a tube tester in back.

But I remember the eyebrows I raised in high school when I was building something that needed a particular logic gate, and I couldn’t quickly locate the appropriate chip. I had a book that told how to build the gate using discrete components, so I did it. Actually I raised eyebrows twice–once for building the thing that required the chip in the first place, and once for making the chip stand-in.

Binary file editing and hardware compatibility

Binary file editing. I’ve recovered many a student’s term paper from munged disks over the years using Norton Disk Edit, from the Norton Utilities (making myself a hero many times). Usually I can only recover the plain text, but that’s a lot better than nothing. Rebuilding an Excel spreadsheet or a QuarkXPress document is much harder–you have to know the file formats, which I don’t.

But at any rate, I’ve on a number of occasions had to run NDE to recover meeting minutes or other documents at work. The sheer number of times I have to do this made me adamantly opposed to widespread use of NTFS at work. Sure, the extra security and other features are nice, but try telling that to an irate user who just lost the day’s work for some reason. The “technical superiority” argument doesn’t hold any water there.

Enter WinHex (www.winhex.com). Now it doesn’t matter so much that the powers that be at work didn’t listen to my arguments. 🙂 (NDE from vanilla DOS would still be safer, since the disk will be in a suspended state, but I guess you could yank the drive and put it in another PC for editing.)

For those who’ve never done this before, you can recover data using a brute force method of searching for known text strings that appeared in the file. For example, I once worked on recovering a thesis that contained the line “I walk through a valley of hands.” Chances are, if I search for that, I’m gonna find the rest of the document in close proximity. A Windows-based editor makes this kind of data recovery very nice–search for the string, keeping Notepad open, then copy and paste the strings as you find them.
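
If you’d rather script the brute-force pass than click through it, the idea reduces to a few lines. Here’s a rough sketch in Python (an illustration of the technique only–the carve_around name and the sample data are mine; WinHex or NDE does the same job interactively):

```python
def carve_around(image: bytes, needle: bytes, window: int = 64) -> list[str]:
    """Find every occurrence of `needle` in a raw disk image and return
    the printable text around each hit, dots standing in for binary noise."""
    hits = []
    start = 0
    while True:
        pos = image.find(needle, start)
        if pos == -1:
            break
        lo = max(0, pos - window)
        hi = min(len(image), pos + len(needle) + window)
        chunk = image[lo:hi]
        # Keep printable ASCII; everything else becomes '.' like a hex editor's text pane.
        hits.append("".join(chr(b) if 32 <= b < 127 else "." for b in chunk))
        start = pos + len(needle)
    return hits

if __name__ == "__main__":
    # Simulated munged disk: the thesis sentence buried between garbage sectors.
    image = (b"\x00\xf7" * 256
             + b"I walk through a valley of hands."
             + b"\x1b\x07" * 256)
    for snippet in carve_around(image, b"valley of hands"):
        print(snippet)
```

Search for a phrase you know appeared in the document, copy out whatever readable text turns up in the neighborhood, and repeat with other phrases until the document is reassembled.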

Knowledge of the underlying filesystem (FAT or NTFS) is helpful but not essential, as is knowledge of the file format involved. If worse comes to worst, you can recover the strings out of the file and keep the app open to re-enter them (being aware that you run the risk of overwriting the data, of course).

I found some useful links on the WinHex site detailing certain file formats.

This is a program I suspect I’ll be buying soon, since my need for it is probably more a matter of when rather than if.

———-

From: “James Cooley”

Subject: Tip for tat?

Hi Dave,

I waded through all your views (That’s where all those hits came from!) and I like your style and learned a great deal. Here’s another tip I didn’t see mentioned: in autoexec.bat, add the following:

set temp=C:\temp
set tmp=C:\temp
set tmpdir=C:\temp

You could use the ramdisk drive you mention, of course. I don’t know if this speeds things up, but it sure helps minimize the clutter from most installs when you clean the temp directory periodically. I use C:\temp2 for those disposable downloads because some programs hate extracting into their own directory. Norton Anti-Virus comes to mind: if you run the updates from C:\temp it hangs.

I ordered _UNIX in a Nutshell_ from a recommendation on your site, but got a 500-page tome instead of the 92 pages you mentioned. If you recall the O’Reilly book I’m talking about, could you give me the exact name so I needn’t hunt it down again?

Hope your hands are healing.

Regards,

Jim

———-

Thanks. I’m glad you enjoyed it (but isn’t that an awful lot of reading?).

I’ve seen the tmpdir trick; fortunately, not a whole lot of programs use it anymore, but it is useful. Thanks.

And yes, as you observe it’s a good idea to use a separate dir for program installs. I try to avoid hanging it directly off the root for speed considerations (a clean root dir is a fast root dir)–I usually stick it on the Windows desktop out of laziness. That’s not the best place for it either, but it’s convenient to get to.

The 92-page book is Learning the Unix Operating System, by Jerry Peek and others. It’s about $12. The 500-page Unix in a Nutshell is useful, but more as a reference. I’ve read it almost cover-to-cover, but I really don’t like to read the big Nutshell books that way. Information overload, you know?

———-

From: “al wynn”

Subject: MAX screen resolution for Win95/98/2000

Do you know the MAXIMUM screen resolutions for Win95/98/2000 (in pixels)? Which operating systems can support a dual-monitor setup?

NEC 15′ MultiSync CRT monitors max out at (1280 x 1024 @ 66Hz); for 17′ CRT’s, it’s usually (1600 x 1200 @76Hz). Do you know any 15′ and 17′ models that can handle denser resolutions ? (like (1792 x 1344 @68Hz) or (1920 x 1440 @73Hz) ?

Also, which manufacturer/model do you prefer for flat-panel LCDs? Which 15″ or 17″ LCD models boast the highest resolution?

———-

I believe Windows’ limit is determined by the video drivers. So, if a video card ships someday that supports some obnoxious resolution like 3072×2560, Windows should support it. That’s been the case in the past, usually (and not just with the Windows platform–it holds true for other systems as well).

Windows 98 and 2000 support dual monitors.

I’ve never seen a 15″ monitor that does more than 1280×1024, and never seen a 17″ that does more than 1600×1200. I find anything higher than 1024×768 on a 15″ monitor and higher than 1152×864 on a 17″ strains my eyes after a full day of staring at it.

As for flat-panels, I don’t own one so I can’t speak authoritatively. I’d probably buy an NEC or a Mitsubishi if I were going to get one. The price difference between an off-brand flat-panel and a big name is small enough (relative to price), and the price high enough, that I’d want to go with a company I know makes quality stuff–I’m not gonna pay $800-900 for something only to have it break after two years. I’m totally sold on NEC: I bought a used NEC Multisync II monitor in 1990 that was built in 1988, and it finally died this year.

A 15″ flat-panel typically does 1024×768, while a 17″ does 1280×1024.

“Apple lost,” Steve Jobs says

Apple obsession continues. See if you can guess who said the following:

The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That’s over. Apple lost.

Ars Technica readers may already know the answer. The answer (drum roll) is, none other than Steve Jobs, in an interview that appeared in the Feb. 1996 issue of Wired. Jobs was, at the time, CEO of NeXT, maker of overstyled and overpriced Unix boxes (though by then they were out of the hardware business and just selling NeXTStep, their Unix variant). Apple, of course, bought NeXT a few months after Jobs said this, and in a strange turn of events, Jobs ended up becoming Apple’s CEO.

It was an interesting interview. In it, Jobs said he didn’t think there was any way Microsoft could seize control of the Web. They’ve tried, and they’ve succeeded far more than Jobs probably anticipated–witness the large number of sites that only look right in Internet Explorer. But I found I agreed with a surprisingly large percentage of the things he said, particularly when he talked about things other than computers.

Here’s the link if you’re interested.

——-

From: Scott Vogt

Subject: Win2k On A Maxtor..

Dave,

I am running Windows 2000 with SP1 on a Maxtor 40gig 7200rpm drive with no troubles at all.

Great site, Glad to see you back!

Scott

———-

Thanks, both for the answer and the compliment.

Sound cards, hard drives, and initial dual G4 impressions

The underwhelming dual G4. I had a conversation Tuesday with someone who was thinking about ditching his PII to get a dual G4 because he thought it would be faster. I guess he thought if he got VirtualPC or SoftWindows, a dual G4/500 would run like a dual PIII/500 or something, plus give him access to all the Mac software. Nice try.
I’m sure one of these dual G4s would make an outstanding Linux box, but the loss of binary compatibility with all the x86 software is a real drawback. Sure, you can recompile, but there are those instances where that isn’t an option. And under Mac OS 9, that second CPU sits idle most of the time. Photoshop and a couple of other apps use it, but the OS doesn’t–certainly not to the extent that Windows NT or a Unix variant will use a second CPU.

I’m also very disappointed with the hardware. The dual G4 I’m setting up right now has a 124-watt power supply in it. Yes, 124 watts! Now, PPC chips use less power than Intel or AMD CPUs, and the G4 uses a microATX-like architecture, but Apple knows full well that graphics professionals are going to buy these things and stick four internal hard drives, a Zip, a DVD-RAM, and a gigabyte of RAM inside. Do that, and you don’t have much punch left to power such “non-essentials” as the video card, extra disk controller, and CPUs… This will cause problems down the line. It would seem Apple is paying for the extra CPU by cutting corners elsewhere, rather than by raising the price dramatically.

The G4 remains an excellent example of marketing. IBM could invent sushi, but they’d market it as raw, dead fish (which is why they’ve become a non-contender in the PC arena that they created, with the possible exception of the ThinkPad line) while Apple continues to sell sand in the desert. Remarkable.

AMD pricing. The Duron-600 is a great buy right now; according to Sharky Extreme’s CPU pricing, it’s as low as $51. My motherboard vendor of choice, mwave.com, has the Duron-600 with a Gigabyte 7ZX-1 and fan for $191. Outstanding deal. I’d get a PC Power and Cooling fan for it to replace whatever cheapie they’re bundling.

I prefer Asus motherboards to everything else, but the performance difference between the Gigabyte and Asus offerings is really close (Asus wins some benchmarks by a hair, Gigabyte wins others, with Asus a bit better overall, but we’re talking differences of 1-4 percent, barely noticeable). The Gigabyte boards cost about $30 less than the Asus. If I were getting a Duron for a value system, I’d go Gigabyte; if I were building a Thunderbird-based performance system, I’d go Asus.

I plan to see how Naturally Speaking fares on my Celeron; if it’s not quick enough for me I’ll probably retire my trusty K6-2/350 and replace the board with a Duron or Thunderbird.

Voice recognition. I got my Andrea ANC-600 mic on Monday. Since Naturally Speaking and the SB Live! card hadn’t even shipped yet, I went ahead and put the ANC-600 on my Celeron-400 (still equipped with an ESS sound card) and fired up ViaVoice. The ANC-600 eliminated the background noise and increased accuracy noticeably. ViaVoice still tended to mess up a word per sentence, but at least it was in the neighborhood (it had real problems with past/present tense) and its speed was a little better, though it still tended to drag behind me. The SB Live! should help that; as should the newer software’s reliance on newer processor architecture (ViaVoice 97 was designed with the Pentium-MMX in mind, rather than the PII/Celeron or something newer). I await Naturally Speaking’s arrival with much, much greater confidence now.

———-
From: Dan Bowman

Maxtor HDDs

And the CompUSA down the street always has a good deal on them…

This week, Office Depot is selling Maxtor 15gig drives for $99. That’s a “Warlock’s Mirror” for a little over $200 with tax.

dan

———-

Thanks.

Killing a process in Unix

My Linux gateway likes to fall off the Internet occasionally. I think it’s Southwestern Bell’s fault, because it always seems to happen right after it tries to renew its DHCP lease. Rebooting fixes the problem, but I wanted a cleaner way.
Here it is. Do a tail /var/log/messages to get the PID for pumpd. [Or, better, use the command pidof [program name] –DF, 5/25/02] Do a kill -9 [PID] to eliminate the problem process. (This process tends to keep the network from restarting.) Then, do a /etc/rc.d/rc3.d/S10network restart to stop and restart the network. [Better: use /etc/init.d/network restart, which is runlevel independent and works on more than just Red Hat-derived distros. –DF, 5/25/02] Try pinging out just to make sure the Internet’s working again, and bingo. Back in business.

I don’t know that this is the best or most elegant way of doing it, but it works and it’s much faster than waiting for that old 486 clunker to do a warm boot.
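The sequence above can be sketched as a short shell script. This is a sketch under assumptions: that the DHCP client shows up in the logs as pump (yours may log as pumpd, so check /var/log/messages first) and that the init script lives at /etc/init.d/network, per the runlevel-independent note above. Adjust the names for your setup.

```shell
#!/bin/sh
# Sketch of the recovery sequence. Assumes the DHCP client logs
# syslog lines like "pump[1234]: ..." to /var/log/messages and
# that /etc/init.d/network exists (Red Hat-style init scripts).

# Pull the PID out of a syslog line such as:
#   May 25 03:14:15 gateway pump[1234]: renewed lease
pid_from_log() {
  echo "$1" | sed -n 's/.*pump\[\([0-9]*\)\].*/\1/p'
}

# On the gateway, as root:
#   pid=$(pid_from_log "$(grep pump /var/log/messages | tail -1)")
#   kill -9 "$pid"                # or simply: kill -9 $(pidof pump)
#   /etc/init.d/network restart   # stop and restart the network
#   ping -c 3 www.example.com     # confirm the Internet is back
```

The function just isolates the PID parsing; the commented commands at the bottom are the actual fix and need root on the gateway.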

LoveLetter ruins my day

I hate viruses. So. I stumble in to work Thursday. I make the mistake of checking my mail before I’ve had my coffee. Mail from a VIP. “Please kindly check this …” I read no further. I spy an attachment, so I do exactly–in my mind–what it asks. I open the attachment in Notepad and look at it. Hmm. A VBscript program written by someone who doesn’t like school. Hmm. Wait, why’s this thing messing with the registry? Why’s this thing making copies of some files and deleting others? Crap! This is a virus! Who else did she send this to? Meanwhile a neighbor’s jabbering away at me about something or another. “Shuddup!” I tell him as I print it out. I print the code (4 pages I think), grab it, circle a couple of offending lines of code, then rush upstairs. Yep, you’ve got it. We were infected with the now-notorious “Iloveyou” virus.
Yeah, loser. I love you too, but only because Jesus says I have to love my enemies. So, God bless you, whoever you are. You’re gonna need that and more. Bad.

I located two infected computers, then I called the wisest, coolest head in the organization (our Unix ubermeister) for advice on how to proceed. This was a good 2-3 hours before Symantec had a fix posted on its Web site. He said he and one of our ace programmers had dissected the code and determined all of the changes it makes. He had registry entries to fix and files to look for. Armed with that info, I was able to put out the fire pretty quickly (silently reminding myself that using Netscape and Eudora instead of Internet Exploiter and Outlook sometimes really has its advantages), but it turned into a very draining day.

FTP batch files

I found this question on the Sys Admin magazine forum:
Can I create a batch file (or something else) to allow me to execute my file transfer from a Tru64 UNIX box to an NT box without having to type each command? This is what I’m doing now to transfer a recompiled database from UNIX to NT. At the NT machine:

C:> cd\sandgis
sandgis> erase *.*
are you sure? Y
sandgis> ftp 000.00.000.0
name: *******
passwd: ********
ftp> cd /data/sandgis
ftp> prompt off
ftp> bin
ftp> mget *.*
ftp> cd /apps/sandcauv
ftp> mget par*.*
ftp> quit
sandgis> cd info
sandgis\info> erase *.*
are you sure? Y
sandgis\info> ftp 000.00.000.0
name: ****
passwd: *****
ftp> cd /data/sandgis/info
ftp> bin
ftp> prompt off
ftp> mget *.*
ftp> quit
(this is half of it)

Well, you get the idea… I can get a batch file to work until it goes into FTP, then it stops. Since I’ve got to do this on five NT machines twice a week and the total file size is near half a gig, this is very time-consuming.

And here’s the response I submitted:

Put your pre-FTP commands in a batch file, as it sounds like you already have, then add the -s:[textfile] parameter to your ftp statement, where [textfile] is a text file containing your FTP commands, e.g. ftp -s:ftp1.txt 000.00.000.0.

The contents of ftp1.txt, based on your example:
name
password
cd /data/sandgis
prompt off
bin
mget *.*
cd /apps/sandcauv
mget par*.*
quit

Anything you put in a file specified by the -s parameter gets fed to your FTP client.

So, you’ll need a batch file, plus a text file for each FTP session, which could turn into a real mess of files, but it’s a whole lot better than typing all that garbage twice a week.
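To tie it together, here’s a sketch of the NT-side wrapper batch file. The IP address, directories, and the ftp1.txt/ftp2.txt names are placeholders carried over from the example above; del /Q stands in for erase so the “Are you sure?” prompt doesn’t stall the script.

```batch
@echo off
rem Sketch only: paths, IP, and script file names are placeholders
rem from the session shown above.
cd \sandgis
del /Q *.*
ftp -s:ftp1.txt 000.00.000.0
cd \sandgis\info
del /Q *.*
ftp -s:ftp2.txt 000.00.000.0
```

Once that works on one machine, you could even schedule it with NT’s at command and take yourself out of the loop entirely.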