Binary file editing and hardware compatibility

Binary file editing. I’ve recovered many a student’s term paper from munged disks over the years using Norton Disk Edit, from the Norton Utilities (making myself a hero many times). Usually I can only recover the plain text, but that’s a lot better than nothing. Rebuilding an Excel spreadsheet or a QuarkXPress document is much harder–you have to know the file formats, which I don’t.
But at any rate, I’ve had to run NDE on a number of occasions to recover meeting minutes or other documents at work. The sheer number of times I’ve had to do this made me adamantly opposed to widespread use of NTFS at work. Sure, the extra security and other features are nice, but try telling that to an irate user who just lost the day’s work for some reason. The “technical superiority” argument doesn’t hold any water there.

Enter WinHex (www.winhex.com). Now it doesn’t matter so much that the powers that be at work didn’t listen to my arguments. 🙂 (NDE from vanilla DOS would still be safer, since nothing would be writing to the disk while you work, but I guess you could yank the drive and put it in another PC for editing.)

For those who’ve never done this before, you can recover data with a brute-force method: searching for known text strings that appeared in the file. For example, I once worked on recovering a thesis that contained the line “I walk through a valley of hands.” Chances are, if I search for that, I’m gonna find the rest of the document in close proximity. A Windows-based editor makes this kind of data recovery very nice–search for the string with Notepad open, then copy and paste the text as you find it.
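For what it’s worth, the same brute-force trick works from a Unix command line if you can get the damaged disk (or an image of it) onto a Unix box. A minimal sketch; the device name, search string, and offset here are placeholders, not anything from an actual recovery:

# Search the raw partition for the known phrase. -a treats the binary
# data as text; -b prints the byte offset of each match.
grep -a -b 'I walk through a valley of hands' /dev/hda1

# Carve out a chunk around the offset grep reported, to examine in an
# editor. (123456789 is a made-up offset; bs=1 is slow but simple.)
dd if=/dev/hda1 of=recovered.txt bs=1 skip=123456789 count=65536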

Knowledge of the underlying filesystem (FAT or NTFS) is helpful but not essential, as is knowledge of the file format involved. If worst comes to worst, you can pull the strings out of the file and retype them into the app as you go (being aware that you run the risk of overwriting the very data you’re recovering, of course).

I found some useful links on the WinHex site detailing certain file formats.

This is a program I suspect I’ll be buying soon, since my need for it is probably more a matter of when rather than if.

———-

From: “James Cooley”

Subject: Tip for tat?

Hi Dave,

I waded through all your views (That’s where all those hits came from!) and I like your style and learned a great deal. Here’s another tip I didn’t see mentioned: in autoexec.bat, add the following:

set temp=C:\temp
set tmp=C:\temp
set tmpdir=C:\temp

You could use the ramdisk drive you mention, of course. I don’t know if this speeds things up, but it sure helps minimize the clutter from most installs when you clean the temp directory periodically. I use C:\temp2 for those disposable downloads because some programs hate extracting into their own directory. Norton Anti-Virus comes to mind: if you run the updates from C:\temp it hangs.

I ordered _UNIX in a Nutshell_ from a recommendation on your site, but got a 500-page tome instead of the 92 pages you mentioned. If you recall the O’Reilly book I’m talking about, could you give me the exact name so I needn’t hunt it down again?

Hope your hands are healing.

Regards,

Jim

———-

Thanks. I’m glad you enjoyed it (but isn’t that an awful lot of reading?)

I’ve seen the tmpdir trick; fortunately not a whole lot of programs use that variable anymore, but it’s still useful. Thanks.

And yes, as you observe, it’s a good idea to use a separate dir for program installs. I try to avoid hanging it directly off the root for speed considerations (a clean root dir is a fast root dir)–I usually stick it on the Windows desktop out of laziness. That’s not the best place for it either, but it’s convenient to get to.

The 92-page book is Learning the Unix Operating System, by Jerry Peek and others. It’s about $12. The 500-page Unix in a Nutshell is useful, but more as a reference. I’ve read it almost cover-to-cover, but I really don’t like to read the big Nutshell books that way. Information overload, you know?

———-

From: “al wynn”

Subject: MAX screen resolution for Win95/98/2000

Do you know the MAXIMUM screen resolutions for Win95/98/2000 (in pixels)? Which operating systems can support a dual-monitor setup?

NEC 15″ MultiSync CRT monitors max out at 1280 x 1024 @ 66Hz; for 17″ CRTs, it’s usually 1600 x 1200 @ 76Hz. Do you know of any 15″ or 17″ models that can handle higher resolutions, like 1792 x 1344 @ 68Hz or 1920 x 1440 @ 73Hz?

Also, which manufacturer/model do you prefer for flat-panel LCDs? Which 15″ or 17″ LCD models boast the highest resolution?

———-

I believe Windows’ limit is determined by the video drivers. So if a video card someday ships that supports some obnoxious resolution like 3072×2560, Windows should support it. That’s usually been the case in the past (and not just with the Windows platform–it holds true for other systems as well).

Windows 98 and 2000 support dual monitors.

I’ve never seen a 15″ monitor that does more than 1280×1024, and never seen a 17″ that does more than 1600×1200. I find anything higher than 1024×768 on a 15″ monitor and higher than 1152×864 on a 17″ strains my eyes after a full day of staring at it.

As for flat-panels, I don’t own one, so I can’t speak authoritatively. I’d probably buy an NEC or a Mitsubishi if I were going to get one. The price difference between an off-brand flat-panel and a big name is small enough (relative to the total price), and the price high enough, that I’d want to go with a company I know makes quality stuff–I’m not gonna pay $800-900 for something only to have it break after two years. I’m totally sold on NEC: I bought a used NEC MultiSync II monitor in 1990 that was built in 1988, and it finally died just this year.

A 15″ flat-panel typically does 1024×768, while a 17″ does 1280×1024.

“Apple lost,” Steve Jobs says

Apple obsession continues. See if you can guess who said the following:

The desktop computer industry is dead. Innovation has virtually ceased. Microsoft dominates with very little innovation. That’s over. Apple lost.

Ars Technica readers may already know the answer. The answer (drum roll) is none other than Steve Jobs, in an interview that appeared in the Feb. 1996 issue of Wired. Jobs was, at the time, CEO of NeXT, maker of overstyled and overpriced Unix boxes (though by then they were out of the hardware business and just selling NeXTStep, their Unix variant). Apple, of course, bought NeXT a few months after Jobs said this, and in a strange turn of events, Jobs ended up becoming Apple’s CEO.

It was an interesting interview. In it, Jobs said he didn’t think there was any way Microsoft could seize control of the Web (they’ve tried, and they’ve succeeded far more than Jobs probably anticipated–witness the large number of sites that only look right in Internet Explorer), but I found I agreed with a surprisingly large percentage of the things he said–particularly when he talked about things other than computers.

Here’s the link if you’re interested.

——-

From: Scott Vogt

Subject: Win2k On A Maxtor..

Dave,

I am running Windows 2000 with SP1 on a Maxtor 40gig 7200rpm drive with no troubles at all.

Great site, Glad to see you back!

Scott

———-

Thanks, both for the answer and the compliment.

Sound cards, hard drives, and initial dual G4 impressions

The underwhelming dual G4. I had a conversation Tuesday with someone who was thinking about ditching his PII to get a dual G4 because he thought it would be faster. I guess he thought if he got VirtualPC or SoftWindows, a dual G4/500 would run like a dual PIII/500 or something, plus give him access to all the Mac software. Nice try.
I’m sure one of these dual G4s would make an outstanding Linux box, but the loss of binary compatibility with all the x86 software is a real cost. Sure, you can recompile, but there are those instances where that isn’t an option. And under Mac OS 9, that second CPU sits idle most of the time. Photoshop and a couple of other apps use it, but the OS doesn’t–certainly not to the extent that Windows NT or a Unix variant will use a second CPU.

I’m also very disappointed with the hardware. The dual G4 I’m setting up right now has a 124-watt power supply in it. Yes, 124 watts! Now, PPC chips use less power than an Intel or AMD CPU, and the G4 uses a microATX-like architecture, but Apple knows full well that graphics professionals are going to buy these things and stick four internal hard drives, a Zip, a DVD-RAM, and a gigabyte of RAM inside. Do that, and you don’t have much punch left to power such “non-essentials” as the video card, extra disk controller, and CPUs… This will cause problems down the line. It would seem they’re paying for the extra CPU by cutting corners elsewhere, so they can avoid raising the price dramatically.
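Some rough arithmetic shows why that worries me. These per-device figures are ballpark assumptions for illustration, not measurements:

Four 7200rpm hard drives at ~10W each: ~40W
Zip plus DVD-RAM: ~15W
1GB of RAM: ~10W
Two G4 CPUs at ~10W each: ~20W
Video card and extra disk controller: ~20W
Motherboard, fans, and everything else: ~15W

That’s roughly 120 of the 124 watts spoken for, with no headroom at all–and drives briefly pull far more than their steady-state draw at spin-up.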

The G4 remains an excellent example of marketing. IBM could invent sushi, but they’d market it as raw, dead fish (which is why they’ve become a non-contender in the PC arena that they created, with the possible exception of the ThinkPad line) while Apple continues to sell sand in the desert. Remarkable.

AMD pricing. The Duron-600 is a great buy right now; according to Sharky Extreme’s CPU pricing, it’s as low as $51. My motherboard vendor of choice, mwave.com, has the Duron-600 with a Gigabyte 7ZX-1 and fan for $191. Outstanding deal. I’d get a PC Power and Cooling fan for it to replace whatever cheapie they’re bundling.

I prefer Asus motherboards to everything else, but the race between the Gigabyte and Asus offerings is really close (Asus wins some benchmarks by a hair, Gigabyte wins others, with Asus a bit better overall, but we’re talking differences of 1 to 4 percent at most, barely noticeable). The Gigabyte boards cost about $30 less than the Asus. I’m thinking if I were getting a Duron for a value system, I’d go Gigabyte; if I were looking for a Thunderbird-based performance system, I’d go Asus.

I plan to see how Naturally Speaking fares on my Celeron; if it’s not quick enough for me I’ll probably retire my trusty K6-2/350 and replace the board with a Duron or Thunderbird.

Voice recognition. I got my Andrea ANC-600 mic on Monday. Since Naturally Speaking and the SB Live! card hadn’t even shipped yet, I went ahead and put the ANC-600 on my Celeron-400 (still equipped with an ESS sound card) and fired up ViaVoice. The ANC-600 eliminated the background noise and increased accuracy noticeably. ViaVoice still tended to mess up a word per sentence, but at least it was in the neighborhood (it had real problems with past/present tense), and its speed was a little better, though it still tended to drag behind me. The SB Live! should help that, as should the newer software’s design for newer processors (ViaVoice 97 was written with the Pentium-MMX in mind, rather than the PII/Celeron or anything newer). I await Naturally Speaking’s arrival with much, much greater confidence now.

———-
From: Dan Bowman

Maxtor HDDs

And the CompUSA down the street always has a good deal on them…

This week, Office Depot is selling Maxtor 15gig drives for $99. That’s a “Warlock’s Mirror” for a little over $200 with tax.

dan

———-

Thanks.

Killing a process in Unix

My Linux gateway likes to fall off the Internet occasionally. I think it’s Southwestern Bell’s fault, because it always seems to happen right after it tries to renew its DHCP lease. Rebooting fixes the problem, but I wanted a cleaner way.
Here it is. Do a tail /var/log/messages to get the PID for pumpd. [Or, better, use the command pidof [program name] –DF, 5/25/02] Do a kill -9 [PID] to eliminate the problem process. (This process tends to keep the network from restarting.) Then, do a /etc/rc.d/rc3.d/S10network restart to stop and restart the network. [Better: use /etc/init.d/network restart, which is runlevel independent and works on more than just Red Hat-derived distros. –DF, 5/25/02] Try pinging out just to make sure the Internet’s working again, and bingo. Back in business.
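Strung together, the whole sequence looks something like this–a sketch, assuming the DHCP client shows up in the process list as pump (as on Red Hat); substitute whatever name appears in your logs:

kill -9 $(pidof pump)         # kill the hung DHCP client by name
/etc/init.d/network restart   # stop and restart networking
ping -c 3 www.example.com     # any reliable outside host will do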

I don’t know that this is the best or most elegant way of doing it, but it works and it’s much faster than waiting for that old 486 clunker to do a warm boot.

LoveLetter ruins my day

I hate viruses. So. I stumble in to work Thursday. I make the mistake of checking my mail before I’ve had my coffee. Mail from a VIP. “Please kindly check this…” I read no further. I spy an attachment, so I do exactly what it asks–in my own way. I open the attachment in Notepad and look at it. Hmm. A VBScript program written by someone who doesn’t like school. Hmm. Wait, why’s this thing messing with the registry? Why’s this thing making copies of some files and deleting others? Crap! This is a virus! Who else did she send this to? Meanwhile a neighbor’s jabbering away at me about something or other. “Shuddup!” I tell him as I print the code (four pages, I think), grab it, circle a couple of offending lines, then rush upstairs. Yep, you’ve got it. We were infected with the now-notorious “ILOVEYOU” virus.
Yeah, loser. I love you too, but only because Jesus says I have to love my enemies. So, God bless you, whoever you are. You’re gonna need that and more. Bad.

I located two infected computers, then I called the wisest, coolest head in the organization (our Unix ubermeister) for advice on how to proceed. This was a good 2-3 hours before Symantec had a fix posted on its Web site. He said he and one of our ace programmers had dissected the code and determined all of the changes it makes. He had registry entries to fix and files to look for. Armed with that info, I was able to put out the fire pretty quickly (silently reminding myself that using Netscape and Eudora instead of Internet Exploiter and Outlook sometimes really has its advantages), but it turned into a very draining day.

FTP batch files

I found this question on the Sys Admin magazine forum:
Can I create a batch file (or something else) to allow me to execute my file transfer from a Tru64 UNIX box to an NT machine without having to type each command? This is what I’m doing now to transfer a recompiled database from UNIX to NT. At the NT machine:

C:> cd\sandgis
sandgis> erase *.*
are you sure? Y
sandgis> ftp 000.00.000.0
name: *******
passwd: ********
ftp> cd /data/sandgis
ftp> prompt off
ftp> bin
ftp> mget *.*
ftp> cd /apps/sandcauv
ftp> mget par*.*
ftp> quit
sandgis> cd info
sandgis\info> erase *.*
are you sure? Y
sandgis\info> ftp 000.00.000.0
name: ****
passwd: *****
ftp> cd /data/sandgis/info
ftp> bin
ftp> prompt off
ftp> mget *.*
ftp> quit
(this is half of it)

Well, you get the idea… I can get a batch file to work until it goes into FTP, then it stops. Since I’ve got to do this on five NT machines twice a week and the total file size is nearly half a gig, this is very time-consuming.

And here’s the response I submitted:

Put your pre-FTP commands in a batch file, as it sounds like you already have. Then add the -s:[textfile] parameter to your FTP invocations, where [textfile] is a text file containing the commands for that FTP session, e.g. ftp -s:ftp1.txt 000.00.000.0.

The contents of ftp1.txt, based on your example:
name
password
cd /data/sandgis
prompt off
bin
mget *.*
cd /apps/sandcauv
mget par*.*
quit

Anything you put in a file specified by the -s parameter gets fed to your FTP client.

So, you’ll need a batch file, plus a text file for each FTP session, which could turn into a real mess of files, but it’s a whole lot better than typing all that garbage twice a week.
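To make that concrete, here’s a sketch of what the batch file for the first machine might look like. The /q switch on erase suppresses the “are you sure?” prompt so the script doesn’t stall, and ftp2.txt is a hypothetical second command file for the info-directory session:

@echo off
rem Wipe and refill \sandgis, then its info subdirectory.
cd \sandgis
erase /q *.*
ftp -s:ftp1.txt 000.00.000.0
cd info
erase /q *.*
ftp -s:ftp2.txt 000.00.000.0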