Category Archives: Retro Computing

How IBM and DOS came to dominate the industry

Revisionist historians talk about how MS-DOS standardized computer operating systems and changed the industry. That’s very true. But what they’re ignoring is that there were standards before 1981, and the standards established in 1981 took a number of years to take hold.

The reality is that in 1981, IBM and DOS were just one player among many, and while businesses did embrace IBM (or its clones) and MS-DOS, during the early 1980s you were just as likely to find a Z-80-based microcomputer running CP/M, a Tandy, or an Apple computer in a business setting. And in the late 1980s, the Macintosh was more common in businesses than it is today.

The first mass-produced PCs hit the market in 1977. While all of them claimed to be first, in reality the Apple II, Commodore PET, and Radio Shack (Tandy) TRS-80 Model I all came on the market at about the same time. (The Apple I, from 1976, was just a board, not a complete computer.) All three machines sold briskly, to a combination of hobbyists, businesses, and home users. The successors to those three machines would duke it out for years in the home. Business came to be dominated by computers based on Intel 8080 and Zilog Z-80 CPUs from a variety of manufacturers. Kaypro and Osborne were the most famous; Vector Graphic is the most criminally overlooked. All ran the CP/M operating system and were somewhat compatible with one another.

Apple went so far as to run an ad welcoming IBM to the microcomputer market, thanking them for bringing legitimacy to it. And of course, by 1982, you could buy IBM clones.

While there were plenty of clones available before 1985, proprietary offerings from other companies remained cheaper. In the mid 1980s, inexpensive clones manufactured in the Far East, such as the Leading Edge Model D (built by Daewoo) and the Blue Chip Personal Computer XT (manufactured by Hyundai), appeared, giving IBM compatibility at a price competitive with Commodore or Atari. Although forgotten today, the Model D got rave reviews from computer magazines and even consumer magazines like Consumer Reports for its high degree of compatibility at an unheard-of price. For $1,495, you got a 4.77 MHz 8088 processor, 640 KB RAM, and dual 360K floppies. For comparison’s sake, a Commodore 128 with a 2 MHz 8-bit 8502 processor, 128 KB RAM, and dual floppies would have cost $950 the same year.

And in late 1984, Tandy introduced the Tandy 1000. Tandy hoped it would ride the coattails of the IBM PCjr into the home. While the PCjr flopped, the jr-compatible 1000 sold well. Its successors sold even better.

Availability of IBM-compatible computers at Radio Shack spurred adoption, because at the time, Radio Shack was the only nationwide consumer electronics chain. And the Leading Edge Model D sold briskly as well, spurring something of a price war. Suddenly, IBM-compatible computers were cheap enough that people could think about buying one for the home, and you could buy them in stores that didn’t have commissioned salespeople wearing three-piece suits. Businesses bought them and people bought them for home as well. It was practical to buy a computer like the one you had at the office and take work home.

Prior to 1985, you didn’t see a lot of IBM and IBM-compatible computers in the home partly because there weren’t a lot of games for them. The price hurt too, but an Apple IIe wasn’t much less expensive than an IBM PC. There were people willing to spend money on a computer, but they wanted something other than a spreadsheet to play with. And while there were literally thousands of games available for an Apple II or a Commodore 64 (by the end of the decade both platforms had racked up libraries in excess of 10,000 titles), the count for the IBM PC in the early 1980s numbered in the hundreds. King’s Quest, Zork, and Ultima were all good games and available for PCs, but there really was very little outside of those series. And they were also available for the Apple II.

So there weren’t PCs in the home because there weren’t games for them. But there weren’t games for them because there weren’t PCs in the home.

Tandy wasn’t going to sell a computer without making sure there were games available for it, so that helped. And the Leading Edge Model D was so cheap that some people bought it without even researching what kinds of software ran on it. So by 1986 and 1987, lots of companies were making entertainment software for the growing installed base of IBM-compatible PCs.

Apple, Atari, and Commodore fans mostly ignored what was happening. They still had the bigger libraries and significant market share. I remember reading a lament in Run magazine, a Commodore rag, in 1986, complaining that the general press divided the computer field into Apple, IBM, and Tandy and ignored everyone else. But Commodore was still selling more than a million 8-bit computers a year, so they weren’t exactly hurting (yet). The August 1987 issue of Compute! told its readers they could expect to see a lot more entertainment software for IBM compatibles very soon. But its editorial content and advertising were pretty evenly divided among Apple, IBM and compatibles, and Commodore audiences. Atari got some print as well.

And in the August 1988 issue of Info, a Commodore and Amiga magazine, an editorial talked about the looming threat of Tandy. The Amiga was safe, for a time: it was competing against a 286-based Tandy, and it had more processing power and better graphics and sound. But the editorial warned that in a year, the Amiga would be competing against a 386-based Tandy with more processing power. (The Amiga would retain the edge in graphics and sound for several more years.)

And we all know what happened. The 8-bits from Apple and Commodore made it into the 90s but faded quickly. The Atari ST and Amiga lasted a bit longer–I still have Amiga software I bought at Babbage’s in Crestwood Plaza in 1991 and 1992. But by 1992, pretty much anyone who was buying a new computer was buying a Macintosh or a PC. The editorial in the June 1992 issue of AmigaWorld lamented that the majority of new Amigas were being sold to someone who already had one or six.

The PC, of course, steamrolled the Amiga and Atari ST, and by the late 1990s it had reduced the Macintosh’s market share to a single digit.

How to connect a C-64 to a modern TV’s S-Video input

In the 1980s, a computer monitor offered a clearer picture than a TV by eliminating the need to modulate/demodulate the video signal, which caused degradation. But in 2003, it’s next to impossible to find affordable composite monitors for 20-year-old computers, and when you can find them, their size pales in comparison to a $99 TV. Why bother with a really old, curvy 13″ monitor when you can retro-compute in luxury on a flat 19″ TV?
Fortunately, if a TV offers composite jacks, you can connect a computer directly to it. No tricks involved–you connect it just like you would a VCR.

But Commodore 8-bit computers (the 64, 128, and Plus/4) used a trick to get a clearer picture: they separated the chroma and luma signals. This is exactly what S-Video does today. So it’s possible to get an even better picture out of a Commodore if your TV has S-Video jacks.

Note: Older C-64s had a 5-pin video connector that only provided straight composite. Those connect just like a VIC-20. Don’t modify one of those to provide S-Video; the machine is worth much more unmodified.

By far the easiest way to connect a Commodore to S-Video is to buy a cable. They’re common on eBay for about $20.

You can also make your own if you want. Making video cables isn’t difficult, assuming you have decent soldering skills. Usual disclaimers apply: I make no guarantee as to the accuracy of this information. I believe my sources are accurate but I don’t have a working Commodore to try this on right now. Connecting the cables wrong should only result in lots of noise and lots of snow on your TV screen, but if you somehow mess up your computer, it’s not my responsibility.

Later C-64s, C-128s, and Plus/4s used an 8-pin DIN connector. S-Video uses a 4-pin mini-DIN connector, the same connector used on Macintosh keyboards from about 1986 to 1997.

If you already have a Commodore video cable, you can easily make an adapter. Get a 4-pin mini-DIN connector and two female RCA plugs, red and yellow. Connect S-Video pin 3 to the center of the yellow plug. Connect pin 2 to the outside of the yellow plug. Connect pin 4 to the center of the red plug, and pin 1 to the outside of the red plug.

If you don’t have a cable but can locate the appropriate connectors, you can make a cable like so:

   Commodore              S-Video

        2                  4   3
     4     5              2     1
   1    8    3
     6     7

  (solder side)        (solder side)

Commodore pin 1 goes to S-Video pin 3 (luma)
Commodore pin 6 goes to S-Video pin 4 (chroma)
Commodore pin 2 goes to S-Video pins 1 and 2 (ground)

Commodore pin 3 goes to the center of an RCA connector for audio. Connect the outside of the RCA connector to Commodore pin 2.

To make a straight composite cable for a C-64 or VIC-20 (the VIC had a 5-pin plug, and so did early 64s–the later plug is backwards compatible with this 5-pin plug), connect Commodore pin 4 to the center of a yellow male RCA plug. Connect Commodore pin 2 to the outside of the yellow plug. Connect Commodore pin 3 to the center of a white RCA male plug, and Commodore pin 2 to the outside of the white plug.
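If it helps to have the wiring above in one place, here is a small sketch that encodes both pinouts as lookup tables for double-checking a build before soldering. The table contents come straight from the text; the dictionary and function names are mine.

```python
# Pin mappings from the text above, encoded as a sanity-check table.
# Commodore side: 8-pin DIN (later C-64s, C-128, Plus/4).
# S-Video side: 4-pin mini-DIN (pin 3 = luma, pin 4 = chroma, pins 1/2 = ground).

SVIDEO_CABLE = {
    # Commodore DIN pin -> destination
    1: "S-Video pin 3 (luma)",
    6: "S-Video pin 4 (chroma)",
    2: "S-Video pins 1 and 2 (ground)",
    3: "RCA center (audio)",  # the RCA shield also returns to Commodore pin 2
}

COMPOSITE_CABLE = {
    # Straight composite for a VIC-20 or early C-64 (5-pin plug)
    4: "yellow RCA center (composite video)",
    2: "yellow/white RCA shield (ground)",
    3: "white RCA center (audio)",
}

def describe(cable):
    """Render a cable table as human-readable wiring instructions."""
    return [f"Commodore pin {pin} -> {dest}" for pin, dest in sorted(cable.items())]

for line in describe(SVIDEO_CABLE):
    print(line)
```

Printing `describe(COMPOSITE_CABLE)` gives the composite recipe the same way.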

Fare thee well, goodnight, and goodbye to my friend, OS/2

The Register: IBM has finally brought the Great Rebellion [OS/2] to a close.
The Register was the only online obituary that mentioned eComStation, a third-party OS/2 derivative that everyone forgets about. Interestingly, its product literature never mentions OS/2 by name; it only brags about technology licensed from IBM.

The Reg also talked about OS/2 version 3 being positioned as a gamer’s OS. That seemed ironic coming from the suits at IBM, and it wasn’t how I saw it–I switched from Windows 3.1 to OS/2 because, coming from an Amiga, I was used to being able to multitask freely without a lot of crashes. Windows 3.1 crashed pretty much every day if I tried to do that. OS/2 knocked that number down to about once a year, and usually those lockups happened when I was running Windows apps.

Even though I never really thought of it that way, OS/2 was great for games. Since it used virtual DOS machines, each environment had its own memory management, so you could fine-tune it and avoid shuffling around boot disks or struggling to master the DOS 6.0 boot menus. Pretty much no matter what you did, you got 600K or more of conventional memory to work with, and with some fine-tuning, you could bring that total much higher than you could usually attain with DOS. Since 600K was almost always adequate, most games just ran, no sweat.

The other thing I remember is the speed at which DOS games ran. Generally, running a game under OS/2 gained you a speed grade: a game running under OS/2 on a DX2/50 felt like the same game running under DOS on a DX2/66. An OS/2 wizard could usually squeeze still more performance out of the game with some tweaking.

I have fond memories of playing Railroad Tycoon, Civilization, and Tony LaRussa Baseball 2 on my Compaq 486 running OS/2 v3.

And there was another really nice thing about OS/2. When I bought a shiny new Pentium-75 motherboard and CPU and a new case, I pulled the hard drive out of the Compaq and dropped it into the Pentium. It just worked. All I had to do was load drivers for my new video card, since it used a different chipset than my 486.

And the cult of OS/2 won’t go away just yet. The talk I’ve been reading has me almost ready to install OS/2 again.

Old computer magazines

I guess I need a “retro” category here. Anyway, I found this on Slashdot this morning: The Computer Magazine Archive. Don’t let the URL fool you–it’s not just Atari stuff.
You can go into the Compute!’s Gazette section and download the disk for the November 1991 issue to find the only program I ever published, a C-64/128 sprite utility program called MOB Mover. The text for the accompanying article isn’t present, alas. I think I got a cool $175 for that project (I had to split it with my co-author). I have no idea what anyone would do with the program these days, but hey, it’s out there.

I spent a good deal of time (when I should have been fixing dinner) in the Creative Computing archive. I never saw a copy of the magazine when it was in print but I knew it was well-regarded. Browsing a few articles, I can see why. Take a gander at its review of the Apple Lisa and its preview of some weird computer called Amiga, which contained a rather amusing prediction: “For, regardless of the fact that the IBM standard is a decidedly mediocre one, the [IBM PCjr.] is bound to become the home standard.”

Well… Within a decade, that prediction had become half right.

A total blast from the past

I don’t remember how I stumbled across it, but the site tries to collect documents from the classic days of BBSing, which the curator defines as having ended in 1995. I wouldn’t have thought it was that recent. I was still BBSing in the summer of ’94, but by the fall of ’94 I’d discovered the Web, and I thought I was the last one to wake up to it.
I’d learned FTP and Gopher when I went to college in 1993, and I’d been using Usenet via local BBSs for even longer, but as everyone knows now, it was the Web that put the Internet on the map. I think a lot of people think the Web is the Internet.

Anyway, before the Internet, hobbyists would take computers, get a phone line, hook up a modem, and see who called. There were usually discussion boards, file transfers, and at least one online multiplayer game. The really big BBSs ran on 386s with hard drives, but an awful lot of the BBSs I called ran on 8-bit computers and stored their data on floppy drives. I remember one board I called used seven or eight floppy drives to give itself a whopping 6 or 7 megs of online storage. It was called The Future BBS, and the sysops’ real names were Rick and Jim (I don’t remember their handles). It ran on a Commodore 64 or 128 with, ironically, a bunch of drives that dated back to the days of the PET–Commodore had produced some 1-meg drives in the early 80s that would connect to a 64 or 128 if you added an IEEE-488 interface. Theirs was a pretty hot setup, and it probably filled a spare bedroom all by itself.

It was a very different time.

Well, most of the boards I called were clearinghouses for pirated software. It was casual copying; I didn’t mess with any of that 0-1 day warez stuff. We were curmudgeons; someone would wax nostalgic about how great Zork was and how they didn’t know what happened to their copy, then someone would upload it. I remember on a couple of occasions sysops would move to St. Louis and complain about how St. Louis was the most rampant center of software piracy they’d ever seen, but I see from the files on the site that that probably wasn’t true.

Besides illegal software, a lot of text files floated around. A lot of it was recipes. Some of them were “anarchy” files–how-to guides to creating mayhem. Having lots of them was a status symbol. Most of the files were 20K in length or so (most 8-bit computers didn’t have enough address space for documents much longer than that once you loaded a word processor into memory), and I knew people who had megabytes of them in an era of 170K floppies.

A lot of the stuff on the site is seedy. Seedier than I remember the boards I called being.

But a lot of the content is just random stuff, and some of it dates itself. (Hey, where else was I going to find out that the 1982 song “Pac-Man Fever” was recorded by Buckner & Garcia? I’d forgotten about that song, which is probably proof that God is merciful, but hey.)

Mostly I find it interesting to see what people were talking about 10 and 20 years ago. Some of the issues of yesterday are pretty much unchanged. Some of them just seem bizarre now. Like rumors of weird objects in Diet Pepsi cans.

Actually that doesn’t sound so bizarre. I’m sure there’s an e-mail forward about those in my inbox right now.

Optimizing BIOSes and optimizing DOS

Optimizing the BIOS. Dustin Cook sent in a link to Adrian’s Rojak Pot, which includes a BIOS tweaking guide. It’s an absolute must-read. I have a few minor quibbles with a couple of the things it says, particularly about shadowing and caching your ROMs with Windows 9x. It says you shouldn’t do it. It’s right, but for the wrong reason: the guide argues that Microsoft says not to do it with Windows NT, and Windows 9x “shares the same Win32 architecture.” It does and it doesn’t, and either way that’s flawed logic. Shadowing ROMs isn’t always a bad thing; on some systems it eats up some usable memory and on others it doesn’t, depending on the chipset and BIOS. But it’s pointless because Windows doesn’t use the BIOS for anything unless you’re in safe mode. Caching ROMs makes very little sense either; there’s only so much caching bandwidth to go around, so you should spend it on memory that’s actually being used for something productive. So never mind the architecture argument: you shouldn’t cache or shadow your ROMs because Windows will ignore them either way, and those facilities are better spent elsewhere. The same is true of Linux.

Still, in spite of this minor flaw I found in a couple of different spots, this is an invaluable guide. Perfect BIOS settings won’t make a Pentium-90 run like a Pentium III, but poor BIOS settings certainly can make a Pentium III run more like a 386DX-40. Chances are your BIOS settings aren’t that bad, but they can probably use some improvement. So if you want the best possible performance from your modern PC, visit Adrian’s. If you want to optimize your 386 or 486 or low-end Pentium, visit the site I mentioned yesterday.

Actually, it wouldn’t be a half-bad idea to take the downloadable versions of both guides, print them, and stick them in a binder for future reference. You never know when you might want to take them with you.

Optimizing DOS again. An awful lot of system speed is psychological. I’d say maybe 75% of it is pure psychology. It doesn’t matter so much whether the system really is fast, as long as it feels fast. I mentioned keyboard and screen accelerators yesterday. Keyboard accelerators are great for people like me who spend a lot of time in long text files, because you can scroll through them so much faster. A keyboard accelerator makes a big difference in how an old DOS system feels, and it can improve the responsiveness of some DOS games. (Now I’ve got your attention, I’m sure.)

Screen accelerators are a bit more of a stretch. Screen accelerators intercept the BIOS calls that write to the screen and replace them with faster, more efficient code. I’d estimate the speedup is anywhere from 10 to 50 percent, depending on how inefficient the PC’s BIOS is and whether it’s shadowing the BIOS into RAM. They don’t speed up graphics at all, just text mode, and then, only those programs that are using the BIOS–some programs already have their own high-speed text routines they use instead. Software compatibility is potentially an issue, but PC power users have been using these things since at least 1985, if not longer, so most of the compatibility issues have long since been fixed.

They only take a couple of kilobytes of memory, and they provide enough of a boost for programs that use the BIOS that they’re more than worth it. With keyboard and screen accelerators loaded in autoexec.bat, that old DEC 386SX/20 feels an awful lot faster. If I had a copy of a DOS version of Microsoft Word, I could use it for writing and it wouldn’t cramp my style much.

Optimizing DOS and the BIOS, plus new iMacs

Optimizing DOS (Or: A New Use for Ancient Equipment). I was thinking yesterday that I wished I had a computer that could just hold disk images and do data recovery. Then I remembered the DECpc 320P laptop lying under my desk. I cranked it up: MS-DOS 5, 20 MHz 386sx, 80-meg drive, 6 MB RAM, grayscale VGA display. So I installed Norton Utilities 8, the main thing I wanted to run (I had a retail box sitting on my shelf), and then of course I set out to optimize the machine. Optimizing DOS is really easy: it’s just a question of disk optimization and memory management. I cleaned up the root directory and pulled the extraneous files out of the C:\DOS directory (the .cpi files, all the .sys files, all the .bas files). Then I ran Speed Disk, setting it to sort directory entries by size in descending order, put directories first, and do full optimization. It took about 30 minutes. If I’d been really bored, I could have mapped out which executables are most important to me and put those first. Since DOS doesn’t track file access dates, it can’t automatically put your frequently accessed files first the way Speed Disk for Windows does.

Of course when I installed Norton Utilities 8, I installed NDOS, its COMMAND.COM replacement. Built-in command history, improved resident utilities, and thanks to its memory management, it actually uses far less conventional memory (but more memory total) than COMMAND.COM. That’s OK; with 6 MB of RAM I can afford to give up a fair bit of extended memory for better functionality.

Once I was happy with all that, I also attacked the startup files. I started off with a basic config.sys:

device=c:\dos\himem.sys
device=c:\dos\emm386.exe noems
dos=high,umb

Then I went into autoexec.bat, consolidated the PATH statements into one (it read PATH C:\WINDOWS;C:\DOS;C:\DOS\u;C:\MOUSE), and added the LH prefix to all lines that ran TSRs or device drivers (such as MOUSE.EXE). Upon further reflection, I should have moved the Mouse directory into C:\DOS to save a root directory entry.

I added the NCACHE2 disk cache to autoexec.bat: NCACHE2 /ext=4096 /optimize=s /usehigh=on /a a c /usehma=on /multi=on. That turns on multitasking, enables caching of both C: and A:, tells it to use 4 MB of extended memory, and has it use high memory. My goal was to use as much memory as prudently possible, since I’d be using this machine just for DOS (and mostly for running Norton Utilities).

I also set up a 512K RAMdisk in extended memory using RAMDRIVE.SYS (devicehigh=c:\dos\ramdrive.sys 512 128 4 /e). Then I added these lines to autoexec.bat:

md d:\temp
set tmp=d:\temp
set temp=d:\temp

Now when an app wants to write temp files, it does so to a RAMdisk. The other parameters tell RAMDRIVE.SYS to use 128-byte sectors to save space and to put just 4 entries in the root directory, also to save space. With DOS 5, that was the minimum. I don’t need more than one, since I’m making a subdirectory. I could just point the temp directory at the root of D:, but I’d rather have dynamic allocation of the number of directory entries. This setting is more versatile–if I need two big files in the temp directory, I’m not wasting space on directory entries, and if I need tons of tiny files, I’m guaranteed not to run out of entries.
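The space savings are easy to check with a little arithmetic, since FAT directory entries are 32 bytes each. This is just a back-of-the-envelope sketch of the RAMDRIVE.SYS parameters (the constants restate the `512 128 4` settings; the comparison against DOS defaults assumes the usual 512-byte sectors and 64 root entries):

```python
# Check the RAMDRIVE.SYS parameters (512 128 4): disk size in KB,
# sector size in bytes, number of root directory entries.
DISK_KB = 512
SECTOR_BYTES = 128
ROOT_ENTRIES = 4
DIR_ENTRY_BYTES = 32  # FAT directory entries are 32 bytes each

root_dir_bytes = ROOT_ENTRIES * DIR_ENTRY_BYTES        # 4 * 32 = 128 bytes
root_dir_sectors = -(-root_dir_bytes // SECTOR_BYTES)  # ceiling division: 1 sector

# Compare with typical defaults: 512-byte sectors, 64 root entries.
default_root_bytes = 64 * DIR_ENTRY_BYTES              # 2048 bytes
saved = default_root_bytes - root_dir_bytes

print(f"root directory: {root_dir_bytes} bytes ({root_dir_sectors} sector)")
print(f"vs. defaults: saves {saved} bytes of a {DISK_KB} KB disk")
```

So the tiny root directory fits in a single 128-byte sector and frees nearly 2K of a 512K disk compared to the defaults.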

It’s not a barn burner by any stretch, but it’s reasonably quick considering its specs. Now when someone trashes a floppy disk, I can just throw it in the 320P, run Disk Doctor and Disktool on it (and in a pinch, Norton Disk Editor), copy the data to the HD, then throw the recovered data onto a new, freshly formatted floppy. I’ll only use it a couple of times a year, but when I need such a beast, I need it badly. And if I have the need to run some other old obscure DOS program that won’t run on newer machines, the 320P can come to my rescue again too. It runs the software well, it boots in seconds–what more can I ask?

I could have done a couple more things, such as adding a screen accelerator and a keyboard accelerator. Maybe today, if I have time.

I was tempted to put Small Linux on it, but frankly, DOS 5 and Norton Utilities 8 are more useful to me. I’m not sure what I’d do with a non-networkable Linux box with only 6 MB RAM and a monochrome display.

A useful (but unfortunately dated) link. I stumbled across this yesterday: The BIOS Survival Guide, a nicely done guide to BIOS settings. Unfortunately, it stopped being maintained in 1997, so it’s most useful for tweaking very old PCs. Still, it’s better than nothing, most modern PCs still have most of these settings, and reading it gives you a prayer of understanding the settings in a modern PC.

If you want to optimize your BIOS, this is about as good a starting point as you’re going to find online for free. For more recent systems, you’ll be better served by The BIOS Companion, written by Phil Croucher (one of the co-authors of the Survival Guide). You can get a sample of the book online.

New iMac flavors. Steve Jobs unveiled the new iMacs this week. The new flavors: Blue Dalmation and Flower Power. Yes, they’re as hideous as they sound. Maybe worse. Check the usual news outlets. They’d go great in a computer room with a leopard-skin chair, shag carpet, and lava lamps. And don’t forget the 8-track cranking out Jefferson Airplane and Grateful Dead tunes.

I think the outside-the-box look of Mir, the PC Gatermann and I built as a Linux gateway (see yesterday), is far more tasteful–and that’s not exactly the best idea we ever had.

486s and Amigas and emulators, oh my

Recovering data from an old large hard drive out of a 486. Someone asked how. No problem.
What I do is put both drives in a new(er) system, each on its own IDE channel as master, then autodetect the old drive with the BIOS’s drive autodetection feature. But to be on the safe side, I don’t boot Windows. I don’t want anything to try to write to the old drive, because it may not work right the first time. Instead, hold down the Ctrl key while booting (if you have Win98; if you have Win95, start tapping the F8 key immediately after the BIOS boot screen comes up–if you get a keyboard error, hit F1 when prompted, then resume your attack on the F8 key). Select Safe Mode Command Prompt Only from the menu. That will put you at a C prompt.

Your old(er) drive will be drive D. If you had other partitions on the drive, they’ll show up further down the alphabet. You can work out exactly how your drives will be mapped if you remember your partitions (or if you’re already familiar with how drive letters get assigned).
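For reference, DOS assigns letters in two passes: first the primary partition of each physical disk, in disk order, then the logical drives in extended partitions, disk by disk. Here is a small sketch of that rule (the function and its input format are mine, a simplified model of the real assignment logic):

```python
# A sketch of MS-DOS drive-letter assignment.
# Pass 1: the primary partition of each physical disk gets a letter (C, D, ...).
# Pass 2: logical drives in extended partitions get letters, disk by disk.

def assign_letters(disks):
    """disks: list of (primary_count, logical_count) per physical disk.
    Returns {(disk_index, partition_label): letter}."""
    letters = iter("CDEFGHIJKLMNOPQRSTUVWXYZ")
    mapping = {}
    for i, (primaries, _) in enumerate(disks):      # pass 1: primaries
        if primaries:
            mapping[(i, "primary")] = next(letters)
    for i, (_, logicals) in enumerate(disks):       # pass 2: logicals
        for j in range(logicals):
            mapping[(i, f"logical{j+1}")] = next(letters)
    return mapping

# New drive: one primary. Old 486 drive: one primary plus two logicals.
mapping = assign_letters([(1, 0), (1, 2)])
# The old drive's primary becomes D:, its logicals E: and F:.
print(mapping)
```

This is why a second disk can shuffle the letters of logical drives you already had: the old drive’s primary partition cuts in line ahead of them.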

Now, execute DIR /S D: to see if it produces an error. If it doesn’t, try this to get your data (don’t type the comments):

MD C:\RECOVER (create a destination for your data)
SMARTDRV D (turn on read caching for drive D to speed things up–may not work, but does no harm)
XCOPY /S /E /V D:*.* C:\RECOVER (copy drive D in its entirety to the destination)

With any luck, that’ll safely spirit all your data away to the new drive. This is more convoluted than using Windows Explorer, but it’s safer. (See why I disagree with the people who say command lines are evil and obsolete and we shouldn’t have them anymore?)

If that succeeds, power down, disconnect the old drive, boot Windows, and check to make sure your data is intact and not corrupt. If it fails, reboot, go into the BIOS, and change the translation scheme for the old drive (you have a choice between Normal, Large, and LBA–LBA is usually the default). Lather, rinse, and repeat.

The good news is, I’ve used this method numerous times to move data from old 486s to newer machines, so chances of success, though not guaranteed, are pretty high.

Maybe I don’t want that Amiga 1200 after all… I went ahead and downloaded UAE 0.8.8 Release 8, then downloaded Amiga In A Box, which gives me a nice, souped-up Amiga setup without me having to remember all the nuances of the OS and tweak them myself (including some nice PD and shareware stuff already installed, configured and running). I fed it my Kickstart ROM image and my Workbench disk, it copied the files it needed, and voila, I had a working AGA-compatible Amiga!

The package even includes TCP/IP support. While Web browsing on a 33 MHz machine is a bit slow, I found performance to be almost as good as Netscape 4.x on a 90 MHz Power Macintosh 7200.

I benchmarked it, and on my Celeron-400 with a pathetic Cirrus Logic video card (I really need to get a cheap TNT2), it still compared favorably to a 33 MHz Amiga 4000/030. (My old beloved Amiga 2000 had a 25 MHz 68030 in it.) Since the Amiga’s biggest bottlenecks were its disk subsystem and video–they were comparable in speed to the PCs of 1990 and 1991–even a slow-sounding 33 MHz machine runs pretty nicely. I could probably crank out a little extra speed with some tweaking, which of course I’ll do at some point.

Then again, maybe I’ve finally found a use for a 1.2-GHz Athlon… (Besides voice recognition.)

If you have an old Amiga lying around and want some nostalgia, go get this. There’s a ton of legal Amiga software available online to experiment with. If you don’t have an Amiga but want to see what all the fuss is about, you can get Cloanto’s Amiga Forever package, which contains legal, licensed ROM and OS images. You’ve probably never heard of Cloanto, but they’re one of the largest remaining Amiga software publishers, and they’re reputable.

Now I just need to get TransWrite, the great no-nonsense word processor that I bought when I first got my A2000, running under UAE.

Overclocking Pentium-75s

I had an overclocking conversation at work today–a coworker wanted to overclock a laptop. I told him I didn’t think that was a good idea. Then this was waiting for me at home:

From: Curtis Horn
Subject: Pentium-75

Hello, I’m one of your readers, and I check your view for the tips you sometimes put up. I’m working on a Compaq Pentium 75 also, and maybe you can do what I did: overclock the chip to 90 MHz. The way I did this is by changing the bus speed from 50 MHz to 60 MHz. This has sped it up significantly, I think, because the memory is also sped up. You’ll also get a laugh out of this: it’s a Compaq 972, and it has 8 MB of memory–ON the motherboard!! I could not believe it, but it’s there. Luckily this leaves 4 SIMM slots open, so I can add 4 8 MB SIMMs. (16 MB SIMMs are way too expensive, and I have some 8 MB SIMMs laying around and can buy more for $10 each.) I convinced the person to buy a 5-gig Quantum drive, so they have something they can use when they upgrade. Well, hope the P75 you’re working on overclocks as easily as the one I have here.


Compaq used to put a fair bit of memory on the motherboard itself. My Presario 660 (a 486/66) has 4 MB on the board. There are a couple of Compaqs from the 900 series still floating around at work that have 8 MB on the motherboard as well. But it’s not a common practice anymore, and I don’t recall any other manufacturer who did that regularly–I remember Compaq doing it because I used to sell Compaqs by the truckload and frequently I ended up adding upgrades to them.

Bus speed isn’t nearly as important in the Pentium Pro/II/III/Celeron and AMD Athlon arena, but in Socket 7 and earlier, you’re right, it makes a huge difference. Remember, the bus speed determines the speed at which the CPU can access the memory and the cache, and as the Mendocino Celeron illustrated, cache speed is more important than CPU speed or cache size. In the early days of Tom’s Hardware Guide, Tom Pabst revealed that a Pentium-150 running at 75 MHz x 2 outran a Pentium-166, and a Pentium-166 running at 83 MHz x 2 outran a P200. So what was the point of buying a P200 if you weren’t going to overclock it, right? Ah, the good old days…. This, of course, was one reason Intel decided to start locking CPU multipliers.

The speed of the PCI bus was also tied to the bus speed. A good Pentium-100 could outrun a Pentium-120 because the Pentium-100 had a full 33 MHz PCI bus while the Pentium 120’s PCI bus ran at 30 MHz. The Pentium 75’s PCI bus ran at a pokey 25 MHz. Nobody wants to slow down their video and disk performance by 10 percent, let alone 25 percent.
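The arithmetic behind all of this is simple: the CPU clock is the bus speed times the multiplier, and on these Socket 5/7 boards the PCI bus ran at half the bus speed. A quick sketch (the helper function is mine; figures are the ones from the text, with the 66 MHz bus treated as exactly 66 rather than the nominal 66.6):

```python
# Socket 5/7 clock math: CPU = bus x multiplier, PCI = bus / 2.
def clocks(bus_mhz, multiplier):
    return {"cpu": bus_mhz * multiplier, "pci": bus_mhz / 2}

p75  = clocks(50, 1.5)   # CPU 75.0,  PCI 25.0 -- the pokey one
p90  = clocks(60, 1.5)   # CPU 90.0,  PCI 30.0
p100 = clocks(66, 1.5)   # CPU 99.0,  PCI 33.0 (66.6 MHz bus in practice)
p120 = clocks(60, 2.0)   # CPU 120.0, PCI 30.0

for name, c in [("P75", p75), ("P90", p90), ("P100", p100), ("P120", p120)]:
    print(f"{name}: CPU {c['cpu']} MHz, PCI {c['pci']} MHz")
```

The table makes the point about the P100 versus the P120 obvious: the P100’s faster bus gives it full-speed 33 MHz PCI, while the nominally faster P120 hobbles its peripherals at 30 MHz. The reader’s 50-to-60 MHz trick bumps both the CPU and the PCI bus at once, which is why it feels like such a big win.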

Overclocking P75s is risky business though. Intel never intended to make a P75. The problem was, they had terrible yields initially on their P90s, but they found a good percentage of the bad chips would run reliably at 75, so they created the P75 and phased out the P60 and P66. (The P66 was actually a better performer because of the bus speed.) The P75 sold like crazy, and Intel wasn’t going to can a best-seller, so once they got over the yield problems, they still marketed P75s. I’ve heard of people going as high as 133 MHz with P75s. I experimented once with a P75 and took it as high as 120 MHz, but couldn’t get 133 (I suspect people getting to that level may have been increasing the voltage). It didn’t run reliably at 120 MHz for long, though I know of people who swear up and down they got 75’s running at that speed reliably with no special tricks.

Overclocking an old chip like that is fine, as long as you’re aware of the risks and willing to live with them. I’d definitely put a heavy-duty CPU fan on it (like a PC Power and Cooling fan for a high-end K6-2). In my case, I’m more interested in having a PC that’s as reliable as possible. Her life’s plenty complicated enough without an overclocked P75 to deal with.

And we now have better ways to measure overclocking’s effects: Microsoft doesn’t have a dog in this fight, but even they see the weirdness it causes.

But thanks for the idea, and for the stroll down memory lane, definitely.