These guys have a clue

I read on Slashdot this morning that Phish is selling soundboard recordings of its live concerts online, in unrestricted format.
For $10, fans can download concerts in MP3 format, or for $13, they can download in lossless format.

Record industry and bands, take note: People are far more likely to have heard of whatever artist you happen to be listening to right now than they are to have heard of Phish. But chances are Phish makes more money than whoever it is you’re listening to right now. The Rolling Stones had trouble selling out venues on their last tour. Phish never has trouble filling the house.

I’m not a Phish fan. To my knowledge, I’ve only ever heard one Phish song, back in 1996 when they had a song in heavy rotation on the AOR-oriented station I listened to in college. They’re a quirky alternative band. I like quirky alternative music, but my favorite quirky alternative bands are quirky in different ways than Phish.

Phish’s absence from most radio stations tells you that a lot of people aren’t into their quirks. Yet a lot of other people are. Phish proves that narrowcasting, as opposed to broadcasting, can be profitable. You don’t have to be a manufactured sellout to make it in the industry. Phish was around long before the current crop of manufactured boy bands, and after all of this crop is just a memory like the Bay City Rollers and the New Kids on the Block, Phish will still be making records and touring.

So what’s the secret? Phish’s willingness to sell unprotected copies of its concerts online is a good clue. The band allows tape recorders and cameras at its concerts. And if you want to trade live tapes with friends or give them away, that’s fine too. So people can get introduced to the band very cheaply.

How often have you heard a new band, liked their stuff, and then run out and bought more of it? I know I’ve done it a lot. But if I only kind of like a band, I don’t run out and buy a $15 disc, so I never get the chance to become obsessed. When a friend is free to give me a copy of something I kind of like, I get more chances to acquire a taste for it. Obviously, not everybody who copies a Phish concert becomes a fan. But the economics show that some people who copy Phish concerts must end up buying records and concert tickets.

Still not convinced? The Dave Matthews Band has a similarly liberal policy toward taping shows, and you can download all the DMB concerts you want, for free, at archive.org. You’ve probably heard of them. I know you’ve heard them on the radio.

People are going to copy the Phish MP3s they download. Friends will split the cost of downloading one concert and then make copies for each other. I know that, and I’m sure that Phish knows that. They ask people not to do that. Some will anyway.

But the companies that sell dirty JPEGs online don’t protect their wares either. They’re smoking crack if they think their stuff isn’t getting passed around; a cursory glance at the headers tells you the whole alt.binaries.sex tree is one massive copyright violation. Yet the players you read about in the mainstream press in the mid-’90s are the same players you read about in the mainstream press today. Piracy isn’t putting those guys out of business. They get people hooked on their product, and those people come back and pay for it when they can’t get enough of it for free.

Doesn’t music pretty much work the same way?

I’m not saying this makes piracy right or ethical, but if someone pirates something and then ends up buying more stuff than they pirated in the first place, then the copyright holder isn’t exactly hurt by the action.

About six years ago, there was a Web site called The Cure MP3 Audio Archive. You could go there and download everything imaginable–basically everything that had ever been recorded by the band that didn’t appear on the albums you could buy in record stores–from demos Robert Smith, Porl Thompson and Lol Tolhurst recorded when they were in high school and called themselves Easy Cure to recordings from their most recent concerts. Eventually a band representative asked them to remove all of the studio recordings. They complied. Then a couple of months later, Elektra Records stepped in and shut the site down completely.

I’ve often wondered what would have happened if Elektra, or Robert Smith himself, for that matter, had simply bought out the archive and turned it into a pay site.

I think we’re about to get an idea.

I’m returning to the Web’s good old days

In the early days of the Web, there were only 12 pages on it.
Well, there appeared to be hundreds, even thousands, of pages on the Web, but only 12 of them actually had any real content. The rest were pages coded by college students, who were the only people with time to learn HTML (they made time by signing up only for classes that met in computer labs and working on their homepages during lecture). Their pages consisted entirely of a resume, a bunch of animated GIFs, links to however many of the 12 real pages they’d discovered, and links to all their friends.

Then the college students flunked out because they didn’t pay attention in class–the professors handed them finals, and they thought it was scrap paper meant to be used to sketch out the next week’s big design–and two years later, after the school’s bureaucracy figured out they were no longer students and kicked them out, they went and got jobs.

Somehow they convinced their new employers that if they went and spent thousands and thousands of dollars on equipment and put their companies online, they’d make lots of money. The result of that convincing was the dot-com boom. The biggest difference for the students was that now they got paid a fortune to sit in the back of a cubicle programming Web pages that contained a lot of animated GIFs (provided by advertisers, rather than stolen from another Web page), and, in a novel bit of creativity, these animated GIFs themselves linked to one of the 12 pages on the Web that contained real content.

Well, after a series of IPOs that would have created hundreds of thousands of new millionaires had they not been forbidden by law from selling their stock certificates, someone finally remembered how to read a balance sheet and found that the total amount of money generated by the dotcom boom was four-fifty. Rubles. Investors panicked and sold off all their stock. Companies got investigated for fraud and the college students got laid off. (You thought I was going to say something else, didn’t you?) Once again, they hung around for a couple of years until the bloated bureaucracy figured out they didn’t work there anymore and kicked them out.

The upside of all of this is that the Web isn’t as commercial now as it was a few years ago. The downside is that the commercials are way more annoying than ever.

Meanwhile, those college students are still working on their personal pages, most of which now live at addresses ending in .com or .net or .org, without the squiggly tildes in them anymore. Now they annoy the 12 Web sites that still produce original content by deep-linking their stories on blogs and adding their own comments.

Meet the new Internet: Same as the old Internet.

So in that grand tradition, since I haven’t had an original thought all day and have absolutely nothing meaningful to say tonight, I’ll provide a couple of links to stories I found and add some worthless commentary to them. And someone will think it’s great and spectacular and declare me a visionary, and I’ll start a new software company.

Or something.

Santa Claus reportedly considering Linux

BBSpot: “IIS couldn’t keep up when Slashdot posted a link to that web-interface I made for turning Rudolph’s new LED nose on and off. That was the last straw,” [Santa] Claus continued. “I’m entrusting the entire holiday of Christmas to a company that can’t even make a reliable web server?”
The story mentions lots of other reasons for Santa to switch from Windows. I guess that means Santa doesn’t believe that controversial IDC report from last month or whenever it was. Thanks to Karl Koenig for this link.

Fare thee well, goodnight, and goodbye to my friend, OS/2

The Register: IBM has finally brought the Great Rebellion [OS/2] to a close.
The Register was the only online obituary that mentioned eComStation, a third-party OS/2 derivative that everyone forgets about. Interestingly, eComStation’s product literature never mentions OS/2 by name; it brags only about technology licensed from IBM.

The Reg also talked about OS/2 version 3 being positioned as a gamer’s OS. Maybe that positioning was ironic, coming from the suits at IBM, but it wasn’t how I saw it–I switched from Windows 3.1 to OS/2 because, coming from an Amiga, I was used to being able to multitask freely without a lot of crashes. Windows 3.1 crashed pretty much every day when I tried to do that. OS/2 knocked that number down to about once a year, and usually those lockups happened when I was running Windows apps.

Even though I never really thought of it that way, OS/2 was great for games. Since it used virtual DOS machines, each environment had its own memory management, so you could fine-tune it and avoid shuffling around boot disks or struggling to master the DOS 6.0 boot menus. Pretty much no matter what you did, you got 600K or more of conventional memory to work with, and with some fine-tuning, you could bring that total much higher than you could usually attain with DOS. Since 600K was almost always adequate, most games just ran, no sweat.
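For anyone who never saw it, that fine-tuning lived in each program object’s DOS Settings notebook. The setting names below are from memory and the values are just examples of what I might have used–treat this as a sketch, not gospel:

```
DOS_HIGH=ON              load DOS into the high memory area
DOS_UMB=ON               make upper memory blocks available
EMS_MEMORY_LIMIT=2048    expanded memory for this game, in KB
XMS_MEMORY_LIMIT=1024    extended memory, in KB
DPMI_MEMORY_LIMIT=16     DPMI memory in MB, for protected-mode games
```

The beauty was that every game got its own settings, so tuning one never broke another–no boot disks required.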

The other thing I remember is the speed at which DOS games ran. Generally, running a game under OS/2 gained you a speed grade: a game running under OS/2 on a DX2/50 felt the way it would feel running under DOS on a DX2/66. An OS/2 wizard could usually squeeze still more performance out of it with some tweaking.

I have fond memories of playing Railroad Tycoon, Civilization, and Tony LaRussa Baseball 2 on my Compaq 486 running OS/2 v3.

And there was another really nice thing about OS/2. When I bought a shiny new Pentium-75 motherboard and CPU and a new case, I pulled the hard drive out of the Compaq and dropped it into the Pentium. It just worked. All I had to do was load drivers for my new video card, since it used a different chipset than my 486.

And the cult of OS/2 won’t go away just yet. The talk over at os2voice.org has me almost ready to install OS/2 again.

Roll your own news aggregator in PHP

M.Kelley: I’m also wondering how hard would it be to pull a PHP/MySQL (or .Net like BH uses) tool to scrape the syndicated feeds off of websites and put together a dynamic, constantly updated website.
It’s almost trivial. So simple that I hesitate to even call it “programming.” And there’s no need for MySQL at all–it can be done with a tiny bit of PHP. Since it’s so simple, and potentially so useful, it’s a great first project in PHP.

It’s also terribly addictive–I quickly found myself assembling my favorite news sources and creating my own online newspaper. To a former newspaper editor (hey, they were student papers, but one of them was at Mizzou, and in my book, if you can be sued for libel and anyone will care, it counts), it’s great fun.

All you need is a little web space and a writable directory. If you administer your own Linux webserver, you’re golden. If you have a shell account on a Unix system somewhere, you’re golden.

First, grab ShowRDF.php by Ian Monroe, a simple GPL-licensed PHP script that does all the work of grabbing and decoding an RDF or RSS file. There are tons of tutorials online that tell you how to code your own solution to do this, but I like this one because you can pass options to it to limit the number of entries, and the length of time to cache the feed. Many RDF decoders fetch the file every time you call them, and some feeds impose a once-an-hour limit and yell at you (or just flat ban you) if you go over. Using existing code is a good way to get started; you can write your own decoder that works the way you want at some later date.
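If you do eventually roll your own, the core of a quick-and-dirty decoder is nothing more than a regular expression over the raw feed. Here’s a minimal sketch–my code, not Ian Monroe’s; the function name is made up, and a real decoder should use a proper XML parser and cache its fetches, which is exactly what ShowRDF gives you for free:

```php
<?php
// extract_titles: a naive stand-in for a real RSS/RDF decoder.
// It pulls the <title> out of each <item> -- no caching, no error
// handling, and it will choke on CDATA sections and entities.
function extract_titles($xml, $max = 5) {
    preg_match_all('|<item>.*?<title>(.*?)</title>|s', $xml, $matches);
    return array_slice($matches[1], 0, $max);
}

// A tiny inline sample standing in for a fetched feed.
$sample = '<rss><channel><title>My feed</title>'
        . '<item><title>First headline</title></item>'
        . '<item><title>Second headline</title></item>'
        . '</channel></rss>';

foreach (extract_titles($sample) as $title) {
    echo "<li>$title</li>\n";
}
?>
```

Feed it the contents of a fetched RDF or RSS file and it hands back the headlines; everything ShowRDF layers on top of this–caching, entry limits, link markup–is why you’d use ShowRDF instead.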

ShowRDF includes a PHP function called InsertRDF that uses the following syntax:
InsertRDF("feed URL", "name of file to cache to", TRUE, number of entries to show, number of seconds to cache feed);

Given that, here’s a simple PHP page that grabs my newsfeed:


<html><body>

<?php include("showrdf.php"); ?>

<?php

// Gimme 5 entries and update once an hour (3600 seconds)

InsertRDF("https://dfarq.homeip.net/b2rss.xml", "~/farquhar.cache", TRUE, 5, 3600);

?>

</body></html>

And that’s literally all there is to it. That’ll give you a very simple HTML page with a bulleted list of my five most recent entries. Unfortunately it gives you the entries in their entirety, but that’s b2’s fault, and my fault for not modifying it. I’ll be doing that soon.

You can see the script in action by copying it to your Web server and loading the page. It’s not very impressive, but it wasn’t any effort, either.

You can pretty it up by making yourself a nice table, or you can grab a nice CSS layout from glish.com.

I can actually code tables without stealing even more code, so here’s an example of a fluid three-column layout using tables that’ll make a CSS advocate’s skin crawl. But this’ll get you started, even if that’s the only useful purpose it serves.


<html><body>

<?php include("showrdf.php"); ?>

<table width="99%" border="0" cellpadding="6">

<tr>

<td colspan="3" align="left">
<h1>My personal newspaper</h1>
</td>

</tr>

<tr>

<td width="25%">

<!-- This is the leftmost column's contents -->

<!-- Hey, how about a navigation bar? -->

<?php include("navigationbar.html"); ?>

</td>

<!-- Middle column -->

<td width="50%">

<h1>Dave Farquhar</h1>

<?php

// Gimme 5 entries and update once an hour (3600 seconds)

InsertRDF("https://dfarq.homeip.net/b2rss.xml", "~/farquhar.cache", TRUE, 5, 3600);

?>

</td>

<!-- Right sidebar column -->

<td width="25%">

<h2>Freshmeat</h2>

<?php

InsertRDF("http://www.freshmeat.net/backend/fm-releases-software.rdf", "~/fm.cache", TRUE, 10, 3600);

?>

<h2>Slashdot</h2>

<?php

InsertRDF("http://slashdot.org/developers.rdf", "~/slash.cache", TRUE, 10, 3600);

?>

</td>

</tr>

</table>

</body></html>

Pretty it up to suit your tastes by adding color elements to the <td> tags and using font tags. Better yet, use the knowledge you just gained to sprinkle PHP statements into a pleasing CSS layout you find somewhere.

Finding newsfeeds is easy. You can find everything you ever wanted and then some at Newsisfree.com.

Using something like this, you can create multiple pages, just like a newspaper, and put links to each of your files in a file called navigationbar.html. Every time you create a new page containing a set of feeds, link to it in navigationbar.html, and all of your other pages will reflect the change. This shows off another of PHP’s niceties: managing things like navigation bars is one of the worst chores of maintaining static HTML pages, and PHP’s include() makes it painless.
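In case it helps to see it, navigationbar.html is nothing special–just a fragment, not even a complete HTML document. The file names in these links are made up for illustration:

```html
<!-- navigationbar.html: pulled into every page via include() -->
<h2>Sections</h2>
<ul>
<li><a href="index.php">Front page</a></li>
<li><a href="tech.php">Tech news</a></li>
<li><a href="baseball.php">Baseball</a></li>
</ul>
```

Add a line here, and every page that includes it picks up the new link automatically.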

0wnz0r3d by an electrical storm

We had some more downtime yesterday as my DSL connection got 0wnz0r3d. Not by a script kiddie, but by an electrical storm–thankfully just a rainstorm and not thundersnow–and I fell off the ‘net.
I reset the DSL modem when I got home and all was well.

I’ll be back this evening with a (gasp!) programming piece. Well, pretty lame programming, really. But what good is having the GPL around if you don’t take advantage of it?

A b2 user looks longingly at Movable Type

This web site is in crisis mode.
I’ve been talking the past few days with a lot of people about blogging systems. I’ve worked with a lot of them. Since 1999, I’ve gone from static pages to Manila to Greymatter to b2, and now I’m thinking about another move, this time to Movable Type.

At the time I made each move, each of the solutions I chose made sense.

I really liked Manila’s calendar, and I liked having something take care of the content management for me. I moved from Manila to Greymatter after editthispage.com had one too many service outages. (I didn’t like its slow speed either. But for what I was paying for it, I couldn’t exactly complain.) Greymatter did everything Manila would do for me, and it usually did it faster and better.

Greymatter was abandoned right around the time I started using it. But at the time it was the market leader, as far as blogs you ran on your own servers went. I kept on using it for a good while because it was certainly good enough for what I wanted to do, and because it was super-easy to set up. I was too chicken at the time to try anything that would require PHP and MySQL, because at the time, setting up Apache, PHP and MySQL wasn’t exactly child’s play. (It’s still not quite child’s play but it’s a whole lot easier now than it used to be.)

Greymatter remained good enough until one of my posts here got a hundred or so responses. Posting comments to that post became unbearably slow.

So I switched to b2. Fundamentally, b2 was pretty good. Since it wasn’t serving up static pages it wasn’t as fast as Greymatter, but when it came to handling comments, it processed the 219th comment just as quickly as it processed the first. And having a database backend opened up all sorts of new possibilities, like the Top 10 lists on the sidebar (courtesy of Steve DeLassus). And b2 had all the basics right (and still does).

When I switched to b2, a handful of people were using a new package called Movable Type. But b2 had the ability to import a Greymatter site. And Movable Type was written in Perl, like Greymatter, and didn’t appear to use a database backend, so it didn’t appear to be a solution to my problem.

Today, Movable Type does use a MySQL backend. And Movable Type can do all sorts of cool stuff, like pingbacks, and referrer autolinks. Those are cool. If someone writes about something I write and they link to it, as soon as someone follows the link, the link appears at the bottom of my entry. Sure, comments accomplish much the same thing, but this builds community and it gives prolific blogs lots of Googlejuice.

And there’s a six-part series that tells how to use Movable Type to implement absolutely every good idea I’ve ever had about a Weblog but usually couldn’t figure out how to do. There are also some ideas there I never conceived of.

In some cases, b2 just doesn’t have the functionality. In some cases (like the linkbacks), it’s so easy to add to b2 even I can do it. In other cases, like assigning multiple categories to a post, it’s difficult. I don’t doubt b2 will eventually get most of this functionality. But when someone else has the momentum, what to do? Do I want to forever be playing catch-up?

And that’s my struggle. Changing tools is always at least a little bit painful, because links and bookmarks go dead. So I do it only when it’s overwhelmingly worthwhile.

Movable Type will allow you to put links to related entries automatically. Movable Type will help you build meaningful metatags so search engines know what to do with you (MSN had no idea what to do with me for the longest time–I re-coded my page design a couple of weeks ago just to accommodate them). MT will allow you to tell it how much to put into your RSS feed (which I’m sure will draw cheers from the poor folks who are currently pulling down the entire story all the time).

MT doesn’t have karma voting, like Greymatter did (and I had Steve add to b2). I like it but I can live without it. I can probably get the same functionality from page reads. Or I can just code up a “best of” page by hand, using page reads, feedback, and gut feeling as my criteria.

The skinny: I’m torn on whether I should migrate. I stand to gain an awful lot. The main reason I have to stay with what I have is Steve’s custom code, which he worked awfully hard to produce, and some of it gives functionality that MT doesn’t currently have. Then again, for all I know it might not be all that hard to adapt his code to work with MT.

I know Charlie thought long and hard about switching. He’s one of the people I’ve been talking with. And I suspected he would be the first to switch. The biggest surprise to me when he did was that it took him until past 3 p.m. today to do it.

And I can tell you this. If I were starting from scratch, I’d use Movable Type. I doubt I’d even look at anything else.

apt-get install aclue

My boss called a meeting mid-week last week, and if all goes well, there’ll be some changes at work. That’s a very good thing.
I deliberately don’t write about work very often, and only in vague terms when I do, because some things I wrote about work in the past came back to bite me.

I’ve thought blogs were a very useful tool for a long time. When I started my career in 1997, I found myself gravitating towards some embryonic blog-like sites that offered technical information. Eventually enough people egged me into starting one myself. I found myself posting the solutions to my technical problems there, since searching there was much easier than with any tools we had at work. It’s a good way to work in the public eye and solicit ideas and feedback.

Well, my boss took notice. I blog, and so does one of my coworkers (I hesitate to mention him by name, as it might give away my employer, which I’d still rather not do). He visits from time to time, though the only time he’s tried to post a comment, my DSL connection went down (he naturally asked what I was doing to sabotage IE).

At the meeting, where we were talking about new ways to do things, he asked me point-blank to “Set up a weblog like you and [the guy in the cube next to me] have.”

So this morning I asked my mentor in the cube next to me for a MySQL account on one of our Linux servers. Then I installed Movable Type, mostly because both of us have heard great things about it but neither of us (so far) has been willing to risk everything by switching to it. (I know it’s not free for commercial use; call this “evaluation.” For all I know we’ll end up using b2, which is under the GPL, because for internal, intranet purposes, I don’t know that MT offers anything that b2 doesn’t. But if the boss decides he wants us to go live with MT, we’ll fork over the $150.)

The idea is, we can all log onto the blog at the end of the day and write down any significant things we did. Along the way, hopefully we’ll all learn something. And, as far as I can tell, we won’t block our clients from seeing the blog either. That way they can catch a glimpse into what we do. They won’t understand it all (I know I won’t understand all the VMS stuff on there, and the VMS guys may not understand all the NT stuff) but they’ll see something.

We talked about the cluetrain philosophy a little bit. Essentially, both of us understand it as the idea of being completely open, or at least as open as possible, with the customer. Let them see the internal operations. Let them make suggestions. Let them participate in the design of the product or service.

And I think that’s good up to a point.

Robert Lutz, one of the executives who turned Chrysler around before Daimler-Benz bought the automaker and ran it into the ground, wrote a marketing book called Guts: The Seven Laws of Business That Made Chrysler the World’s Hottest Car Company. I’ve got a copy of it on my shelf at work. One of the chapters of the book is titled, “The Customer Isn’t Always Right.” He argued that customers will follow trends and not necessarily tell the truth. Put out a survey asking people if they’d like a heated cupholder in their car, and most of them will say, yes, they’d love a heated cupholder. Everybody knows that a heated cupholder is a useless gadget no one will use, it won’t work right, and it’ll increase the cost of the car without adding any value, but nobody wants to look cheap.

Lutz argued that experts should make decisions. Since cars are the love of Lutz’s life, Lutz knows how to make killer cars. Lutz observed that the redesigned Dodge Ram pickup elicited extreme reactions. People either loved it or hated it. 70% of respondents loved it; 30% of respondents said they’d never go near the thing. Lutz argued that their then-current design had roughly 30% marketshare, so if half the people who said they loved it bought one, they’d gain 5%. So they brought it to market, and gained marketshare.

I suspect the biggest reason the cluetrain philosophy works is that it helps make you an expert. See enough opinions, and you’ll learn how to recognize the good ones. When you’re clueless, the cluetrain people are right and you look like a genius. Eventually, you stop being clueless, and at that point, Lutz is right.

The main reason I’m excited about having a blog in place at work isn’t because blogs in IT are trendy and popular and glitzy. (I’d still be using an Amiga if I could get a 68060 accelerator and a Zorro II Ethernet board without spending a grand.) I’m excited about blogs because I think it’ll get us a clue.

My boss typed apt-get install aclue at work today. I don’t think that’ll get us anything. But if that blog doesn’t get us a clue, I don’t think anything will.

End of the road for CD burners?

I know it wasn’t more than a couple of months ago that I read that Taiwanese manufacturers of CD burners and media were leery of going above 48X. And now Asus has released a 52X burner. There’s a very favorable review here.
So now the fastest write speeds have reached parity with the fastest read speeds, which means burning a 650-meg disc (with this drive, at least) takes two and a half minutes. Rewrite speeds are at 24X, which doesn’t sound as impressive, but is very nice.
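Some back-of-the-envelope math (mine, not the review’s) shows where that figure comes from: 1X for a CD is 150KB per second, so 52X peaks at 7,800KB per second.

```php
<?php
// Best case: the whole 650MB disc burned at a constant 52X.
$peak_kb_sec = 52 * 150;                 // 7800 KB/sec at full speed
$seconds = (650 * 1024) / $peak_kb_sec;  // 650 MB expressed in KB
echo round($seconds) . " seconds\n";     // roughly 85 seconds
// The quoted 2.5 minutes is longer than that because these drives
// only reach 52X on the outer tracks; averaged over a whole burn,
// the effective speed is a good deal lower than the peak.
?>
```

In other words, the rating is a ceiling, not an average, which is also why the jump from 48X to 52X shaves less time off a burn than you might expect.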

Not everyone needs this drive. I burn CDs rarely enough that I’m perfectly happy with my 20X unit (in fact, I’ve still got a quarter-spindle of CDs that will only burn at 12X). Personally, I’m more interested in rewrite speeds than in write speeds these days, since most of the stuff I burn is stuff like Linux CDs with a shelf life measured in months. In two years I won’t give a rip about Debian 2.2 or 3.0, so it’s nice to be able to erase and reuse old discs rather than keeping them around, taking up space.

But people’s needs vary. I’m sure some people are very excited about this drive.

Since I keep drives until they either die or get too slow for me to use and keep my sanity (I have a Sony 2X unit and a Yamaha 20x10x40x unit, both in working order, which should tell you something), I’m definitely going to wait for a 52x52x52x unit. Maybe the industry will surprise us with a 56X write speed, but they’re not going to get much higher. At these speeds, the CDs are spinning at 27,500 RPM–nearly twice the speed of the very fastest hard drives on the market. I’ve read about the theoretical possibility of discs shattering at 50x+ speeds, though I’ve never actually seen it happen. I have seen discs crack, though, which is irritating–even more so if you don’t have a backup copy.

I think this market is about to stabilize.

My contributions to the Wikipedia

When I was checking up on some facts about Shoeless Joe Jackson, I found the free Wikipedia to be of use. In the very well-done account of the Black Sox scandal (to which I made some minor edits, replacing a couple of odd word choices and fixing some commas), I noticed a link to a non-existent biography of pitcher and ringleader Eddie Cicotte. So I whipped out my Baseball Encyclopedia, opened it to Cicotte’s statistics, did a couple of Web searches to grab some more detail and check my own memory, and based on those references, I wrote one.
Cicotte was a knuckleballer, but I found the Wikipedia didn’t have an article on the knuckleball either. So I wrote one of those too. Along the way to writing that, I found the Wikipedia had a biography of Hoyt Wilhelm, which I didn’t touch, but didn’t have one of Phil Niekro, the most notable pitcher from my lifetime to throw the pitch. I didn’t write that biography.

I also found there’s no biography of Jimmie Foxx. That wrong will have to be righted by yours truly very soon. As will the criminal exclusion of Mike Sweeney, and the embarrassingly sketchy history of the Kansas City Royals. (The George Brett biography was reasonably complete; I made a few minor additions.) I can see how this can get addictive fast.

I read some astronomical statistic a while back about the Wikipedia’s size–huge, but not yet as big as the Encyclopædia Britannica. I visited it, ready to contribute an article or two, but couldn’t think of anything. I figured I’d write about technology, but found all the articles in my areas of expertise were already very impressive.

So my contribution to this fount of knowledge is in the area of baseball instead.

Hey, it’s good that it’ll go somewhere.