Yep. Usually I don't respond to y'all's comments. It's a waste of time, mostly. But they're fun to read, so please continue hating on each other. Do our community PROUD.
But I gotta respond to this one.
Jeff Bailey (from the LSB steering committee) had a few criticisms of my rant about standardizing binaries:
1) When RHEL releases every 2 years, and Ubuntu releases every 6 months, you'll never get the same base toolchains.
Please please please tell me why Ubuntu needs to ship a new toolchain every 6 months? Is there some luser grandma out there asking why she doesn't have gcc 4.1.2 instead of 4.1.1? What does this possibly accomplish except fragment everything out there? Microsoft ships a new compiler like every 3 years, and things work just fine. Even Apple (who isn't so great about backwards compatibility either) ships a new toolchain maybe once a year or so.
If you look at the people actually trying to ship binary software on Linux, this is what they do: they go through all the distros they want to support, look at each one's glibc, and ask: what's the MINIMUM glibc version I can depend on? What's the LOWEST COMMON DENOMINATOR across all these distros? Figuring this stuff out is really difficult, but people who ship software figure out some way to do it.
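If you've never seen it, here's roughly what that dance turns into in code. A minimal sketch, assuming x86-64 and glibc symbol versioning; memcpy and the GLIBC_2.2.5 / GLIBC_2.14 versions are just the usual textbook example of this trick, not anything a specific ISV told me:

#include <stdio.h>
#include <string.h>

/* Bind our memcpy references to the old symbol version instead of the
 * newer memcpy@GLIBC_2.14 a current build box would pick by default.
 * GLIBC_2.2.5 is the base version on x86-64; other arches differ. */
__asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

int main(void) {
    const char *src = "built on a new distro, runs on an old one";
    char dst[64];

    /* Compile with -fno-builtin-memcpy so gcc doesn't inline this call
     * and drop the symbol reference entirely. */
    memcpy(dst, src, strlen(src) + 1);
    puts(dst);
    return 0;
}

Run objdump -T on whatever you ship, and every GLIBC_x.y version in the output is a floor you just committed all of your users to. That's the game ISVs are forced to play today.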
The point is, everyone is looking for the MINIMUM. Not the BLEEDING EDGE. ISVs want compatibility over the newest gcc optimization flag. If there were standard core bits, then Ubuntu and every other distro would have a choice: go with the standard platform and add value on top, or build your own toolchain and nobody will give a fuck. LSB people need to talk to the distros to make them aware of this tradeoff. There will always be freetard distros that ignore you, but they don't matter anyways.
2) It's always possible that slightly different versions of underlying bits like glibc or the compiler can cause gross ABI changes.
Yes, that's why I said ship the EXACT SAME BITS. Figure out, exactly ONCE, what the ABI dependencies between gcc, glibc, and the rest are. Get a standard set, and standardize the compiled bits. No chance for different compile flags. No nothing.
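And just so nobody pretends this is rocket science, here's a sketch of the kind of build-time guard that could enforce one blessed toolchain/glibc pair. The pinned versions (gcc 4.1, glibc 2.3) are made up for illustration, not an actual LSB list:

/* Refuse to build against anything but the agreed-upon toolchain and
 * glibc. Versions below are illustrative, not a real mandate. */
#include <gnu/libc-version.h>
#include <stdio.h>

#if __GNUC__ != 4 || __GNUC_MINOR__ != 1
#error "Build with the standard gcc 4.1 toolchain, not whatever your distro shipped this week."
#endif

#if __GLIBC__ != 2 || __GLIBC_MINOR__ != 3
#error "Build against the standard glibc 2.3 headers."
#endif

int main(void) {
    /* The headers say one thing; make sure the runtime agrees too. */
    printf("compiled against glibc %d.%d, running on glibc %s\n",
           __GLIBC__, __GLIBC_MINOR__, gnu_get_libc_version());
    return 0;
}

The exact numbers don't matter. Picking them once, and making everyone build against them, is the whole point.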
3) Everyone compiling with the same toolchain and base libraries wouldn't guarantee backwards compatibility at all. Only a test suite can do that (and then, only within the limits of what's tested).
You're confusing two different test suites here.
One is to make sure that one LSB certified distro behaves like another. Since the bits aren't standardized, you have to go about making sure they act the same. If you ship the same bits, then you get the same behavior, so you don't need to do these tests anymore.
For backwards compatibility, of course you need to do tests. But if everyone shipped the same set of bits, then everyone could work together to test the same bits. You could rev the bits once every few years to introduce new features on a very conservative schedule. The point is that testing resources are scarce, so making everyone do their own tests individually just does not scale. Plus, under the current model, ISVs won't trust you anyways, because the bits on each system are different, so they'll still have to internally QA against all the distros regardless.
Anyways, guys, this is basic software engineering. The fact that I even have to write this down so explicitly makes me lose hope. The simple fact is that if you want to come up with a "standard" Linux, then your distros are going to have to make sacrifices. Until the distros realize that making these sacrifices will result in benefits for all of them, there's no point even debating the technical details.