I’ve talked before about how to disable animation in Cyanogenmod 10.x, but I’ve done a few other things to conserve some scarce system resources on my gigahertz-ish, half-gig Nook Color. If you’re running Cyanogenmod on a phone that’s a couple of years old, these tricks can help you too.
The big-time gamers are all up in arms over John C. Dvorak’s assertion that the game industry is dying. But he’s right an awful lot more than he’s wrong.
The games aren’t nearly as original as they used to be. Let’s track the evolution of the first-person shooter. Games where you run around in a maze and shoot everything that moves aren’t new. Castle Wolfenstein was a huge hit for Muse Software way back in 1981. The premise was simple: You’re trapped in a castle full of Nazis and your job is to shoot everything that moves and escape. Simple enough.
Was it the first game of its type? I don’t know. I don’t even know for certain that it was the first popular game of its type. But it proves the idea is at least 24 years old as of this writing.
Eleven years later, Wolfenstein 3D was released. It took the same premise and put it in a 3D setting. Its inspiration was obvious. And like its famous predecessor, it pushed the limits of the time: You needed a pretty advanced CPU to play it, and the better your graphics and sound cards were, the better gaming experience you got. In the early 1990s I remember people bragging about the slowest computer they managed to get to run Wolf3D.
A year or so later, Doom was released. It was considered revolutionary. The graphics and sound were better, and it required a better computer, but as far as a plot went, all one had to do was replace the Nazis with monsters and give the main character a larger assortment of weapons.
And that’s pretty much where we stand today. There is no revolution here. Each generation adds more eye candy and another layer of complexity, but the basic premise hasn’t really changed since that 1981 game. Some people like that kind of thing and others don’t. Dvorak clearly doesn’t. I never really got into it much either. Once I got over the initial wow factor of seeing a computer-generated 3D world, I found I just didn’t enjoy it. I had a brief fling with a 3D FPS called Redneck Rampage. It used a recycled game engine, just replacing the original setting with a backwoods theme, replacing the characters with rednecks and aliens, and playing off every stereotype in the book. I enjoyed the game mostly because I thought it was funny. Once the jokes wore off, I quit playing.
Whether this genre has been worked over to death depends on whether you like this sort of thing, I guess. And maybe that’s where Dvorak is wrong. Neither he nor I see the originality, but people enjoy the games and keep buying them. I don’t see the originality in country music either–to me, the songs pretty much sound alike, and the words are all about pretty much the same thing–but the country music industry is huge and it ain’t exactly shrinkin’, y’all.
But maybe this is just a sign of a mature industry. One of my high school writing teachers was fond of pointing out that Shakespeare never wrote an original plot in his life. But the stories seemed new when he put new and compelling characters in new settings along with those tired old plots.
Some people will get bored with the FPS games and move on to another interest. Others will keep at it, no matter how bad or unoriginal the games get. The only question is whether the audience will grow or shrink as a whole over time, and if it shrinks, how profitable the genre will remain.
I think part of the problem for both Dvorak and me is that we’re both old enough to remember the early 1980s, when new games would come out and the new games really did seem new. All told, about 900 games were released for the Atari 2600, and of those, about 100 were really common. (Of the remainder, a large percentage were knockoffs or sequels, and some were so bad that they sold terribly, so nobody saw them.)
Most of us who lived through that time and were really into technology saw those 100 or so games and enjoyed them.
There’s another difference too. Those games were a lot simpler. That’s both good and bad. A really avid gameplayer will probably master the game too quickly and get bored with it. But a more casual gamer can pick it up and learn it and enjoy it.
A really good Civilization player will probably enjoy Civ3 more than the original because it’s more challenging. But I’ve come to prefer the first two, because I can still pick up the original and play it well. If I spent ten hours a week playing video games, it might be different.
The gaming industry hasn’t completely lost me. There are still a handful of games I enjoy: the Civilization series, the Railroad Tycoon series, and the Baseball Mogul series. I haven’t bought the new Pirates! yet, but I’m sure I will if and when the price comes down because I loved the original.
But I only pick up one or two of those games per year anymore, and I probably don’t play them for more than a few weeks when I do.
Since my fiancee enjoys racing games where the two of us can race, if I’m ever out somewhere and I see two copies of a cheap racing game that looks decent and offers network play, I’ll get it and a couple of USB steering wheels. I imagine she’ll want to play a lot at first, and then it’ll become something we do occasionally when we might otherwise go to the movies.
The gaming industry changed, and in doing so, it lost John Dvorak and it’s probably written people like me off too, because I only spend $50 every two or three years on games.
Dvorak seems to think the gaming industry needs people like him. And that’s the only point he makes that I’m not wholeheartedly ready to agree with. The gaming industry is very different now than it was when I was 15 and playing games a lot, but it’s also a lot bigger.
We needed an XP box at work for testing. Duty to do the dirty deed fell to me. So after ghosting the Windows 2000 station several of us share, I pulled out an XP CD. It installed surprisingly quickly–less than half an hour. The system is a P3-667 with 128 MB RAM and an IBM hard drive (I don’t know the model).
It found the network and had drivers for all the hardware in the box. That doesn’t happen very often with Microsoft OSs, so it was nice.
I booted into XP, to be greeted by a hillside that was just begging to be overrun by tanks, but instead of tanks, there was this humongo start menu. I right-clicked on the Start button, hit Properties, and picked Classic view. There. I had a Win95-like Start menu. While I was at it, I went back and picked small icons. I don’t like humongous Start menus.
I also don’t like training wheels and big, bubbly title bars. The system was dog slow, so I right-clicked on the desktop to see what I could find to turn off. I replaced the Windows XP theme with the Classic theme. Then I turned off that annoying fade effect.
Still, the system dragged. I went into Control Panel, System, Performance. Bingo. I could pick settings for best appearance (whose choices are certainly debatable–I guess they look good if you like bright colors and have a huge monitor) or best performance. Guess which I picked? Much better.
Next, I went into Networking. I saw some QoS thing. I did a search. It’s intended to improve the quality of your network, at the price of 20% of your bandwidth. Forget that. I killed it.
After I did all that stuff, XP was reasonably peppy. It logs on and off quickly. I installed Office 2000 and it worked fine. The apps loaded quickly–just a couple of seconds. That’s how it should be. If I went in and edited the shortcuts in the Start menu to turn off the splash screens, they’d load instantly.
WinXP brings up a bunch of popups that I don’t like. If I wanted unexpected popup windows, I’d run a Web browser. I couldn’t quickly figure out how to disable those.
I couldn’t run Windows Update. It froze every time I tried.
I found a Windows XP tuning guide at ExtremeTech. I suspect turning off the eye candy will help more than most of the suggestions in that article. I suspect if I dug around I’d find other things. We’ll see if I get some time.
XP isn’t as bad as I expected, I guess. But I’m still not going to buy it.
This, on the other hand, is worth a second look. And a third. You can now run MS Office on Linux. No need to wait for Lindows, no need to abandon your current fave distro (at least if your fave distro is Red Hat, SuSE, Mandrake, Debian, or Caldera).
It’s 55 bucks. It’s available today. It brings Office 97/2000 and Lotus Notes r5 to your Linux desktop. Other Windows apps work, but their functionality isn’t guaranteed.
You can get some screenshots at CodeWeavers. It even makes the apps look like native Linux apps.
Yes, Brian, baseball will soon return. I hate the things Major League Baseball does (Bob Costas once likened choosing sides between the players and the owners to choosing sides between Iran and Iraq), but we’ve chosen to stay together for the kids. I’m sure everyone who cares (and some who don’t) can guess what I think of Bud Selig, but I’ll tell you anyway, soon enough.
In the meantime, I look like ArsTechnica today. Oh well. I don’t do this very often.
Blogging. Wired News had its take on the phenomenon, and threw out some interesting stats.
In January alone, at least 41,000 people created new blogs using Blogger, and that number is always increasing, [Blogger founder Evan] Williams said. Some have put the total number of weblogs at more than 500,000.
Alongside the boom, however, there have recently been a few faint signs of backlash. As increasing hordes take on the task of trying to keep new sites looking nice, sounding original and free from banalities, more hordes just seem to fail.
Blog critic Dave Linabury offered a recipe for success:
“It really can take a lot of time,” he said. “I spend two hours a day on my weblog. Many people don’t realize this, they think it’s a quick way to get popular. And after awhile they get really discouraged and say, ‘he got 2,300 hits today, I got four.’ The bulk of people out there get less than two dozen hits.”
“I don’t want to be elitist,” Linabury added, “but all these people out there with popular weblogs, they’ve been doing it longer and they stick to their guns.”
I can attest to that. The people who get more traffic than I get almost all have been doing this longer. But I can tell you one thing: It’s never enough. Back when I was getting 80 visits a day I wanted 150. When I was getting 150 visits a day, I wanted 250. Now that I get about 500 visits a day, I’m awfully distressed to see people are getting 2,300. And by the time I reach 2,300, I’m sure there will be people getting 5,000 or even 10,000. (Note that visits are the number of unique visitors; hits are the number of files served up. Hit count is deceptive. I get 500 visits per day but closer to 1,000 or even 1,500 hits per day, due to people visiting, reading comments, and then often reading something from a previous week. And if they do a search, that’s at least two additional hits.)
Another feather in Internet Explorer’s cap. To my knowledge, no new security vulnerabilities have been reported in Internet Explorer this week, but the newest security patch, released last week, contains a bug that can cause a previously working VBscript directive to crash the browser.
Microsoft says Webmasters need to modify their pages not to use the directive.
That’s nice (I don’t use VBscript on this site) but there are embedded devices, such as HP’s JetDirect card, that use the directive. So early adopters of this patch may find themselves unable to do their jobs.
Better webmaster recommendation: Don’t use VBscript or ActiveX or other Microsoft-owned languages in your Web pages at all. Better end-user recommendation: Use Mozilla or a derivative instead of Internet Explorer.
Recompiling Debian for your hardware. This thread comes up every so often, and with the popularity of Linux From Scratch and Gentoo, the appeal of a compiled-from-scratch Debian is undeniable. But does the small speed improvement offset the increased difficulty and time in upgrading?
The consensus seems to be that recompiling gzip, bzip2, and gnupg with aggressive options makes sense, as does recompiling your kernel. Recompiling XFree86 may also make some sense. But expending time and energy on perfectly optimized versions of ls and more is foolhardy. (Especially seeing as speed demons can just get assembly language versions of them from www.linuxassembly.org.)
A Guide to Debian. This is a guide, still incomplete, that gives a number of tips for someone who’s just installed Debian. The tips are applicable to many other Linux (and even Unix) flavors as well.
Spam. A coworker walked into my cube today and asked me how he could keep web robots from harvesting e-mail addresses from his web site. I found myself referring once again to the definitive piece on the subject, from Brett Glass (who gets my nomination for the greatest computer columnist of all time, for what that’s worth).
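One standard defense of the era, for what it’s worth (an illustration of the general idea, not Glass’s specific advice): encode the address as HTML character entities. A browser renders it normally, but a naive harvester grepping pages for a literal @ sign comes up empty. The address below is a placeholder:

```shell
# Encode an e-mail address as decimal HTML entities.
# A naive robot looks for '@'; in the output it appears only as &#64;.
addr="user@example.com"
encoded=$(printf '%s' "$addr" | od -An -tu1 | tr ' ' '\n' \
  | grep -v '^$' | sed 's/^/\&#/; s/$/;/' | tr -d '\n')
echo "$encoded"
```

You’d paste the encoded string into your page’s mailto link in place of the plain address. It won’t stop a determined harvester that decodes entities, but it filtered out most of the dumb ones.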
The RULE project. A project has emerged to bring Red Hat Linux back to its roots, and allow it to run on older, less-powerful hardware.
From their site:
This install option is meant to benefit primarily two classes of users:
* GNU/Linux newbies who cannot afford modern computers, but still need, to get started more easily, an up to date, well documented distribution.
* System administrators and power users who have no interest in eye candy, and want to run updated software on whatever hardware is available, to minimize costs, or just because it feels like the right thing to do.
I love their FAQ. Check this out:
1.0 Hardware is so cheap today, why bother?
1. This is a very limited and egoistic attitude. Eighty per cent of the world population still has to work many months or years to afford a computer that can run decently the majority of modern, apparently “Free” software.
2. Many people who could afford a new computer every two years rightly prefer to buy something else, like vacations, for example…. Hardware should be changed only when it breaks, or when the user’s needs increase a lot (for example when one starts to do video editing). Not because “Free” Software requires more and more expensive hardware every year.
These guys have the right idea. I can only hope their work will influence other Linux distributions as well.
Linux uptime. (Sure, a little original content.) When I was rearranging things months ago, I unplugged the keyboard and monitor from my webserver, then I never got around to plugging them back in because I didn’t have to do anything with it.
The other day, I had occasion to plug a keyboard and monitor back into it. I went in, did what I wanted to do, then out of curiosity I typed the uptime command. 255 days, it told me. In other words, I haven’t rebooted since last May, which, as I recall, was about when I put the machine into production.
Photography. Tom sent me links to the pictures he took on the roof of Gentry’s Landing a couple of weeks ago. He’s got a shot of downtown, the dome, and the warehouse district, flanked by I-70 on the west and the Mississippi River on the east.
I’m tired. I spent yesterday fighting Mac OS X for a couple of hours. It still feels like beta software. I installed it on a new dual-processor G4/533 with 384 MB RAM, and it took four installation attempts to get one that worked right. Two attempts just flat-out failed, and the installation said so. A third attempt appeared successful, but it felt like Windows 95 on a 16-MHz 386SX with 4 megs of RAM. We’re talking a boot time measured in minutes here. The final attempt was successful and it booted in a reasonable time frame–not as fast as Windows 2000 on similar hardware and nowhere near the 22 seconds I can make Win9x boot in, but faster, I think, than OS 9.1 would boot on the same hardware–and the software ran, but it was sluggish. All the eye candy certainly wasn’t helping. Scrolling around was really fast, but window-resizing was really clunky, and the zooming windows and the menus that literally did drop down from somewhere really got on my nerves.
All told, I’m pretty sure my dual Celeron-500 running Linux would feel faster. Well, I know it’d be faster because I’d put a minimalist GUI on it and I’d run a lot of text apps. But I suspect even if I used a hog of a user interface like Enlightenment, it would still fare reasonably well in comparison.
I will grant that the onscreen display is gorgeous. I’m not talking the eye candy and transparency effects, I’m talking the fonts. They’re all exceptionally crisp, like you’d expect on paper. Windows, even with font smoothing, can’t match it. I haven’t seen Linux with font smoothing. But Linux’s font handling up until recently was hideous.
It’s promising, but definitely not ready for prime time. There are few enough native apps for it that it probably doesn’t matter much anyway.
Admittedly, I had low expectations. About a year ago, someone said something to me about OS X, half in jest, and I muttered back, “If anyone can ruin Unix, it’s Apple.” Well, “ruin” is an awfully harsh word, because it does work, but I suspect a lot of people won’t have the patience to stick with it long enough to get it working, and they may not be willing to take the extreme measures I ultimately took, which was to completely reformat the drive to give it a totally clean slate to work from.
OS X may prove yet to be worth the wait, but anyone who thinks the long wait is over is smoking crack.
Frankly, I don’t know why they didn’t just compile NeXTStep on PowerPC, slap in a Mac OS classic emulation layer, leave the user interface alone (what they have now is an odd hybrid of the NeXT and Mac interfaces that just feels really weird, even to someone like me who’s spent a fair amount of time using both), and release it three years ago.
But there are a lot of things I don’t know.
I spent the rest of the day fighting Linux boot disks. I wanted the Linux equivalent of a DOS boot disk with Ghost on it. Creating one from scratch proved almost impossible for me, so I opted instead to modify an existing one. The disks provided at partimage.org were adequate except they lacked sfdisk for dumping and recreating partition tables. (See Friday if you don’t have the foggiest idea what I’m talking about right about now, funk soul brother.)

I dumped the root filesystem to the HD by booting off the two-disk set, mounting the hard drive (mount -t ext2 /dev/hda1 /mnt) and copying each directory (cp -a [directory name] [destination]). Then I made modifications. But nothing would fit, until I discovered the -a switch. The vanilla cp command had been expanding out all the symlinks, bloating the filesystem to a wretched 10 megs. It should have been closer to 4 uncompressed, 1.4 megs compressed.

Finally I got what I needed in there and copied it to a ramdisk in preparation for dumping it to a floppy. (You’ve gotta compress it first and make sure it’ll fit.) I think the command was dd if=/dev/ram0 bs=1k | gzip -v9 > [temporary file]. The size was 1.41 MB. Excellent. Dump it to floppy: dd if=[same temporary file from before] of=/dev/fd0 bs=1k
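The cp -a lesson is easy to demonstrate in miniature (a contrived example with made-up paths, not the actual boot disk): dereferencing symlinks multiplies disk usage, while -a preserves them as links.

```shell
# Build a tree where two symlinks point at one 100 KB file.
rm -rf /tmp/bootfs_demo
mkdir -p /tmp/bootfs_demo/src
dd if=/dev/zero of=/tmp/bootfs_demo/src/bigfile bs=1024 count=100 2>/dev/null
ln -s bigfile /tmp/bootfs_demo/src/link1
ln -s bigfile /tmp/bootfs_demo/src/link2
# cp -rL dereferences: each link becomes its own 100 KB copy (~300 KB total)
cp -rL /tmp/bootfs_demo/src /tmp/bootfs_demo/deref
# cp -a (equivalent to -dpR) keeps the links, permissions, and timestamps
cp -a /tmp/bootfs_demo/src /tmp/bootfs_demo/kept
du -sk /tmp/bootfs_demo/deref /tmp/bootfs_demo/kept
```

On a 1.44 MB floppy budget, that factor-of-three bloat is exactly the difference between fitting and not fitting.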
And that’s why my mind feels fried right now. Hours of keeping weird commands like that straight will do it to you. I understand the principles, but the important thing is getting the specifics right.
Linux for the rest of us. I find the bloatware in current Linux distributions somewhat annoying. It’s nice to have tons and tons of free software right off the bat, but how much of that software is actually useful to the majority of people? Windows users complain about lack of software for Linux, to which Linux zealots usually retort “I have 9 gigs worth of software installed on my PC and didn’t have to pay a dime for any of it, and it’s all legal!”
It’s not really the quantity of software that Windows users are complaining about; it’s type and quality. Give a Windows user a fast and stable Web browser, an instant messaging client, a mail client/PIM, a fully-featured graphical newsreader, a word processor and a spreadsheet that can cleanly handle Word and Excel files, and a fully functional personal finance program, and that’s all they need to be happy. Most of that exists for Linux, or is in development. Fine. Linux is neck-and-neck with the Mac in the race to be #2 on the desktop. Fine.
To anyone who’s read Optimizing Windows, my biggest gripe with Linux ought to be obvious. I spent a good deal of time editing Windows INF files by hand trying to figure out how to get Windows 95 to install in 17 megabytes’ worth of disk space. I presented this, that, and another tweak to minimize Windows’ RAM and CPU usage so that it could be tolerable on a low-end Pentium or 486. Linux fans rightly point to Linux’s modest requirements. They’re very proud of those 2-meg 386SXs running Linux 1.0. But they’re in an arms race to see who can create the GUI with the most eye candy (and highest CPU/memory requirements). Wanna bring a former 550-MHz powerhouse to its knees? Run the Enlightenment window manager on it.
That’s easy enough to fix. Just install IceWM and make it your default window manager, then your 120 MHz Pentium feels OK again. But what of the minimum disk space requirements? Most current distros are difficult to install in less than 500 megs. That sounds awfully Microsoftian to me. True, you can rip a lot of it out, which you can’t always do with MS. But do you know what you can safely get rid of?
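For the record, swapping in IceWM was (and is) typically a one-liner if you start X with startx; display managers have their own session menus. The sketch below writes to a demo path rather than a real ~/.xinitrc, so it won’t clobber anything:

```shell
# Point startx at IceWM instead of the distro's default session.
# (Using a demo path; in practice this line goes in ~/.xinitrc.)
XINITRC=/tmp/demo_xinitrc
echo "exec icewm" > "$XINITRC"
cat "$XINITRC"
```

Using exec matters: it makes the window manager the session’s controlling process, so logging out of IceWM ends the X session cleanly.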
That’s what makes the likes of VectorLinux and Peanut Linux attractive. I’ve got a stack of 170-meg drives. I’ve got a 1-gig drive sitting in my 486 because I couldn’t make Red Hat 6.2 small enough to fit on one of the small drives. Five hundred megs for something whose primary job is to route packets is ridiculous. Vector or Peanut will fit. These won’t take forever to download either, because Vector’s less than 70 megs and Peanut’s about 60. I know a company that thinks that’s a reasonable size for a Web browser.
I’m pretty sure I’ll be experimenting with these distros sooner rather than later. I’d love to liberate that gig drive, for instance.
US vs. UK English. I’m trying to write my new Shopper UK article in UK English because I feel bad about the number of edits my UK editors are having to make. Here’s what I can tell, so far, about the differences:
Extra letters. color=colour, favorite=favourite, program=programme (though even British usage keeps “program” for computers), ton=tonne
Sparing use of the last letter of the alphabet. optimize=optimise
Pluralization, er, pluralisation: In US English, a group of people is referred to in the singular, unless that group is in disagreement. In explaining how old software can be better than new software, I drew a musical analogy: Just like Joy Division is better than ‘N Sync, old DOS games are better and certainly more original than many of the newer Windows games. That’s proper US English. Proper British English, from what I can tell, is “Just like Joy Division are better…” In the States, saying that implies that the members are in disagreement as to whether they’re better than ‘N Sync (the three surviving members would not disagree about that; they’d utter a number of profanities and then say, “Of course we were better than ‘N Sync!”).
But I can’t, and won’t try to, mimic the sentence structure of a British writer. I can’t pinpoint the differences, but when I read something written in English, I can almost always tell when the writer is from the British Isles. (Other English-speaking countries like South Africa, Australia, and New Zealand throw me–but I haven’t read much stuff from there. Canadian writers sound like U.S. writers but you’ll find hints of cultural differences.) You can’t escape what you are, and if I try to sound like anything but a Missourian, it’ll come across as insincere and fake. We definitely don’t want that.
Thoughts on the Pentium 4 launch. No big surprises: a massively complex new processor design, limited availability, and systems from all the usual suspects, at high prices of course. And, as widely reported previously, disappointing performance.
This isn’t the first time this has happened. The Pentium Pro was a pretty lackluster performer too–it ran 32-bit software great, but Win9x was still the dominant OS at the time and it still has a lot of 16-bit code in it. So a 200 MHz Pentium Pro cost considerably more than a 200 MHz Pentium and for most of the people buying it, was significantly slower. History repeats itself…
Intel revised the Pentium Pro to create the Pentium II, with tweaks to improve 16-bit performance, but of course massive clock speed ramps made that largely irrelevant. Goose the architecture to 600 MHz and you’re going to blow away a 200 MHz previous-generation chip.
That’s what you’re going to see here. Intel fully intends to scale this chip beyond 2 GHz next year, and that’s when you’ll see this chip come into its own. Not before. And by then Intel will probably have changed their socket (they intend to change it sometime next year), so buying a P4 today gives you no future-proofing anyway.
It never makes sense to be the first on the block with Intel’s newest chip. Never. Ever. Well, if you’re the only one on the block with a computer, then it’s OK. The P4 has issues. The P3 had issues (remember the serial number?) and was really just a warmed-over P2 anyway. The P2 was a warmed-over Pentium Pro. The Pentium Pro had serious performance issues. The Pentium had serious heat problems and it couldn’t do simple arithmetic (“Don’t divide, Intel inside!”). The last new Intel CPU whose only issue was high price was the 486, and that was in April 1989.
Unless you’re doing one of the few things the P4 really excels at (like encoding MP4 movies or high-end CAD), you’re much better off sticking with a P3 or an Athlon and sinking the extra money into more RAM or a faster hard drive. But chances are you already knew that.
Time to let the cat out of the bag. The top-secret project was to try to dual-boot WinME and Win98 (or some other earlier version) without special tools. But Win98’s DOS won’t run WinME, and WinME’s DOS seems to break Win98 (it loads, but Explorer GPFs on boot).
The best method I can come up with is to use the GPL boot manager XOSL. It just seems like more of an achievement to do it without third-party tools, but at least it’s a free third-party tool. You could also do it with LILO or with OS/2’s Boot Manager, but few people will have Boot Manager and LILO will require some serious hocus-pocus. Plus I imagine a lot of people will like XOSL’s eye candy and other gee-whiz features, though I really couldn’t care less, seeing as it’s a screen you look at for only a few seconds at boot time.
I’ll be back in a bit. With preliminary impressions of Netscape 6. My notes on it are at work, but I’ll give you the overall. I’m thinking C+. It worked OK for me and it was fast. There were things about it that annoyed me though. I very badly want to use a non-Microsoft product, because I detest Microsoft, but IE has a couple of features that save me a lot of keystrokes and I have to think of that.
Assuming it manages to install, chances are there’ll be things about it you like. The things that bother me most are features that Netscape used to have but now don’t. But for basic browsing it’s much better than its predecessors.
I’ll get the rest of the details up here within a few hours.
My notes on Netscape 6. This is pretty rough, but I don’t have time to pretty it up.
Speed: Good. Very comparable to IE in most regards and sometimes faster, though still not as fast when rendering nested tables. On a P2/350 it’s hard to tell a difference. Program loads very slowly however (20+ seconds on that P2/350).
Stability: So-so if you can manage to get it installed. Installation problems galore; seemed stable under NT4 once I got it running. Under heavy use it didn’t crash on me once. However, numerous attempts to get Java plug-in working failed. I never did get it to install on a Mac G3 running OS 8.6.
Features: Stop animations feature is gone and sorely missed. Makes me mouse more than IE does. IE-like backspace is there; ctrl-enter is not, and autocomplete is Netscape 4-like rather than IE-like, forcing more keystrokes. I wish they’d focus more on usability, speed and stability and less on eye candy. Text enlargement doesn’t trigger window scrollbar or margin resizing when needed, so if you enlarge the text, you’ll lose the edge of the screen.
The ctrl-l-accessible Open Location box doesn’t use any autocomplete at all.
What’s Related moves from the navigation bar to the sidebar, where it’s tempting to turn off to save screen space.
Built-in search tool turns the sidebar back on if you turned it off. Annoying–don’t throw out your bookmarks to Google and Altavista yet.
No longer any fast, easy way to toggle images on/off.
No longer forces you to install everything under the sun, which is very nice. Good to be able to get just a browser if you want.
Memory usage: disappointing. Used anywhere from 18-28 megs during initial testing. It’d be so nice to nuke the #$%& eye candy and get that memory usage down.
The verdict: I’m pretty happy with how the Gecko rendering engine turned out. But as soon as K-Meleon comes of age, chances are I’ll switch to that because it’s so much leaner and meaner. (Mozilla’s plagued by the same eye candy garbage, and until we all have 2-GHz processors and a gig of RAM and 15K RPM hard drives on our desktops, I’m mostly interested in having something that works fast. That means giving up some inessential whiz-bang stuff.)
And if you missed it… I posted an update late yesterday. It was too important to wait until this morning.
From: “bill cavanaugh” <email@example.com>
I just followed the Daynotes link to your site. I couldn’t help but notice:
“Farquhar’s Law. I should have some t-shirts made with this on it. Repeat after me. Cable connections are the last thing most people check. Make them the first thing you check.”
This has been one of (actually, I think the first) Pournelle’s Laws for a couple of decades.
Aw man, I thought I stole that fair and square from PC/Computing way back when it was still a magazine kind of worth reading.
Well, hopefully there’s some other stuff on the site useful to you that isn’t stolen from someone who stole it from Jerry Pournelle.
From: “Curtis Horn” <firstname.lastname@example.org>
Subject: Fwd: FIC VA-503+ and K6-III+
I read what Peter said, and you are right, I got the K6-III because my other option is a k6-2, and we all know that on chip cache is better than on board, even at 100Mhz. And it wasn’t that much more expensive than getting a k6-2.
I haven’t had the chance to upgrade the bios, but I did find it. The other issue is that the bios chip is soldered on so I have to do it right and back up the old bios. I’ll have some time this weekend, when I’m going to put the hard drive in.
This may sound weird but ever since I got a job that has me work on computer sometimes I feel less enthusiastic about doing it at home. Right now I have 3 computers that I have to put NT Images on, and one has to have a second network card (for a bnc connector). Thanks allot for the help.
By all means take all proper precautions. It’s always a shame to ruin a motherboard because of something as simple as a BIOS upgrade. (I’ve got a dead Abit IT5H under my desk. Great board. I have no idea what I did that killed it, and that’s a shame because I could drop a Cyrix MII in it along with all the 72-pin SIMMs I could scrounge up and a 7200 rpm hard drive and it’d still be a fantastic workaday machine.)
What you say about not wanting to work on PCs after you get home actually makes a lot of sense. I resemble that remark! My main station’s Antec 300W power supply blew over the summer. The PC sat there in pieces for a couple of months because I just didn’t feel like working on it after doing that kind of stuff all day at work. I finally got around to swapping in another power supply a couple of weeks ago. I messed up my Linux firewall around the same time that power supply blew. I didn’t get around to fixing it until this weekend. Writing is relaxing to me because I don’t do it all day. Back when I was paying for college by selling my soul working as a salesman in a consumer electronics store, I found working on PCs relaxing.
I’m glad I could help.