I found an interesting editorial called If I had my own Linux Distro. He’s got some good ideas, but on some others I wish he’d known what he was talking about.
He says it should be based on FreeBSD because it boots faster than Linux. I thought everyone knew that Unix boot time has very little to do with the kernel? A kernel will boot more slowly if it’s trying to detect too much hardware, but the big factor in boot time is init, not the kernel. BSD’s init is much faster than SysV-style init. Linux distros that use BSD-style inits (Slackware, optionally Debian, and, as far as I understand, Gentoo) boot much faster than systems that use a traditional System V-style init. I recently converted a Debian box to use runit, and the decrease in boot time and increase in available memory at boot were noticeable. Unfortunately, the system no longer shuts down properly. But it proves the concept.
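For anyone curious what the runit conversion looks like: a runit service is just a directory containing a run script that starts the daemon in the foreground and lets the supervisor handle restarts. A minimal sketch (the path and daemon here are examples, not necessarily what my Debian box runs):

```sh
#!/bin/sh
# Hypothetical runit run script, e.g. /etc/sv/sshd/run.
# runsv supervises this process and restarts it if it dies,
# so the daemon must stay in the foreground (sshd's -D flag).
exec /usr/sbin/sshd -D
```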
He talks about installing every possible library to eliminate dependency problems. Better idea: Scrap RPM and use apt (like Debian and its derivatives) or a ports-style system like Gentoo. The only time I’ve seen dependency issues crop up in Debian was on a system that had an out-of-date glibc installed, in which case you solve the issue either by keeping the distribution up to date, or by updating glibc prior to installing the package that fails. These problems are exceedingly rare, by the way. In systems like Gentoo, they don’t happen because the installation script downloads and compiles everything necessary.
Debian’s and Gentoo’s solution is far more elegant than his proposal: Installing everything possible isn’t going to solve your issue when glibc is the problem. Blindly replacing glibc was a problem in the past. The problems that caused that are hopefully solved now, but they’re beyond the control of any single distribution, and given the choice between having a new install stomp on glibc and break something old or an error message, I’ll take the error message. Especially since I can clear the issue with an apt-get install glibc. (Then when an old application breaks, it’s my fault, not the operating system’s.)
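Underneath, what apt and Gentoo’s ports-style system both do is walk a dependency graph and install things in the right order. As a toy illustration of the idea (this is not how apt is implemented, and the package names are made up), the standard tsort utility can turn “this must be installed before that” pairs into an install order:

```sh
# Each input line means "left must be installed before right".
printf 'libz liby\nliby app\n' | tsort
# Prints a valid install order, one name per line: libz, liby, app.
```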
In all fairness, dependency issues crop up in Windows all the time: When people talk about DLL Hell, they’re talking about dependency problems. It’s a different name for the same problem. On Macintoshes, the equivalent problem was extension conflicts. For some reason, people don’t hold Linux to the same standard they hold Windows and Macs to. People complain, but when was the last time you heard someone say Windows or Mac OS wasn’t ready for the desktop, or the server room, or the enterprise, or your widowed great aunt?
He also talks about not worrying about bloat. I take issue with that. When it’s possible to make a graphical Linux distribution that fits on a handful of floppies, there’s no reason not to make a system smooth and fast. That means you do a lot of things. Compile for an advanced architecture and use the -O3 option. Use an advanced compiler like GCC 3.2 or Intel’s ICC 7.0 while you’re at it. Prelink the binaries. Use a fast-booting init and a high-performance system logger. Mount filesystems with the highest-performing options by default. Partition off /var and /tmp so those directories don’t fragment the rest of your filesystem. Linux can outperform other operating systems on like hardware, so it should.
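The mount-option and partitioning points might look something like this in /etc/fstab (a sketch only; the device names and filesystem choice are placeholders):

```
# Hypothetical /etc/fstab fragment: /var and /tmp on their own
# partitions, everything mounted noatime so reads don't cost
# an inode write on every access.
/dev/hda2   /      ext3   defaults,noatime   0 1
/dev/hda3   /var   ext3   defaults,noatime   0 2
/dev/hda4   /tmp   ext3   defaults,noatime   0 2
```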
But when you do those things, then it necessarily follows that people are going to want to run your distribution on marginal hardware, and you can’t count on marginal hardware having a 20-gig hard drive. It’s possible to give people the basic utilities, XFree86, a reasonably slick window manager or environment, and the apps everyone wants (word processing, e-mail, personal finance, a web browser, instant messaging, a media player, a graphics viewer, a few card games, and–I’ll say it–file sharing) in a few hundred megabytes. So why not give it to them?
I guess all of this brings up the nicest thing about Linux. All the source code to anything desirable and all the tools are out there, so a person with vision can take them and build the ultimate distribution with it.
Yes, the idea is tempting.
4 thoughts on “If I had my own Linux distribution”
I agree that people with older hardware should be catered for with supported OS.
I recently built a machine from some old parts I had lying around: a P133 with 96MB of memory and a 1.2GB hard drive (see, told you it was old) and installed a Linux distro called Vector Linux, which is based on Slackware. It installs in less than 400MB.
As a terminal it functioned brilliantly (which is what I needed it for), as a desktop less so.
The window manager(s) supplied for the older hardware were pretty slow and lacked some features, XFree86 had trouble getting the correct screen definitions, the mouse wouldn’t work, my modem needed configuring (thought it was a WinModem), etc., etc.
However, the community Vector has is excellent (friendly AND helpful) and all my issues were resolved with some advice and investigation.
Still can’t get Java to work with any browser (Firebird 0.5, the old Phoenix, Mozilla, or Opera), but it is only a matter of time. I have probably got a glibc issue 🙂
When I get my new box I will probably install the full updated version of Vector.
I love your articles on Linux and building PCs. I’d like to see someone pay you to write articles combining the two. We need more information on building PCs specifically for Linux.
I’m sure you will quickly be writing for paying magazines.
For what it is worth, I concur with Joseph. After Slashdot, I click over to you every single day. When you are down, I have physical pain. When you add a new article, I jump for joy. Literally on both, Dave!
A few issues to address:
GCC optimization flags need a second look. Note this article on how -O3 actually *hurts* performance.
The second one is better.
Concerning fragmentation – that’s really not so much an issue with non-FAT filesystems. I think maybe that yes, partitioning off /var and /tmp will increase performance, but only because those files aren’t being stored so close to the beginning of the disk (where seek times and data transfer rates are a bit faster).
And in my opinion, the biggest “dependency problems” for Linux are simply circular dependencies. libx needs liby to compile. Go to compile liby. liby requires libz to compile. Go to compile libz. libz, unfortunately, needs libx to compile. Go to compile screwthisimgoinghome.so. 😉
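That libx/liby/libz loop is a circular dependency, and it’s the kind of thing a tool can detect mechanically before you go home in disgust. A toy demonstration (not something any package manager actually runs): feed the pairs to tsort and it will refuse to find an order:

```sh
# libx needs liby, liby needs libz, libz needs libx: a cycle.
printf 'libx liby\nliby libz\nlibz libx\n' | tsort
# GNU tsort complains that the input contains a loop.
```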
I love GNU/Linux. I use it every day for web hosting where I work. I use it at home (helping squish bugs in Mandrake and Mozilla). I use it at play (Quake 3, UT, etc. run great). But I don’t think it’s ready for the average user. There are still horrible install problems. Hardware installation? Please! *I* can barely get a new driver installed without huge problems, and I work with this stuff all of the time!
The day Mary Jane User can sit in front of a PC running something like Mandrake, plug in her digital camera, have a window pop up and ask what she wants to do with those pictures, select a few, click “Print”, it goes to her photo printer, select a few more, click “Email”, it goes to her relatives, and select a few more, click “Delete”… well… you get the idea.