Self-contained packaging
I am no longer much of a believer in central package repositories and traditional (RPM- and APT-like) package management systems. I also dislike the insistence on keeping libraries, binaries, and so on separated and scattered all over the system.
Today, most GNU/Linux distributions actually treat the whole software repository as an operating system. As the operating system moves, all 15,000 packages in the corresponding repository move with it. If you upgrade from Debian Stable to Debian Testing, for instance, or from Ubuntu Gutsy to Ubuntu Hardy, it will not be only the basic system that gets upgraded (kernel, GNU userland, desktop environment), but all applications as well. And that's the thing I'm trying to point out. On GNU/Linux there doesn't seem to be a practical separation between an "application" and the "OS" it runs on. Instead they're treated as if they were the same thing.
While this may seem like a good and convenient thing, I beg to differ. I suppose it's great that you get all your applications updated automatically, but why not have a choice in the matter (one that doesn't involve some command-line sorcery)? And what are the other disadvantages of this approach? What if you want two versions of the same application installed, one pretty old and another brand new? On GNU/Linux as it currently is, this doesn't seem like a recommended thing to do unless you're an advanced user, which is missing the point.
Why not do what successful, albeit proprietary, operating systems are doing? It seems to work quite well for them. Instead of "debianizing", "fedoranizing", "susenizing", etc. all software with distribution-specific packaging, which only ends up introducing new bugs and prolonging the time before a stable yet brand-new package lands in the system, why not go completely decentralized and build packages which are neither .deb nor .rpm, which *don't* have any dependency resolution, but come with all dependencies built right into the package? As far as I'm concerned the package could be a mere tar.gz containing the binaries, which you could drop anywhere you want and run the executable. A step up from there is making it akin to setup.exe on Windows: double-click, select a location, next, next, and you're done.
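The "drop anywhere and run" idea can be sketched in a few lines of shell. This is a minimal illustration with made-up names (myapp, myapp.real): the package is one directory holding the binary and its bundled libraries, plus a relocatable launcher script that points the dynamic linker at the bundled libs before executing the real binary; the whole thing is then just tarred up.

```shell
#!/bin/sh
# Sketch of a self-contained package: binary + bundled libs in one
# directory, shipped as a plain tarball. All names are hypothetical.
set -eu

pkg=myapp-1.0
mkdir -p "$pkg/bin" "$pkg/lib"

# Stand-in for the application's real binary.
printf '#!/bin/sh\necho "myapp running"\n' > "$pkg/bin/myapp.real"
chmod +x "$pkg/bin/myapp.real"

# The launcher resolves its own location, so the package can be
# dropped anywhere; bundled libs are found via LD_LIBRARY_PATH.
cat > "$pkg/myapp" <<'EOF'
#!/bin/sh
here="$(cd "$(dirname "$0")" && pwd)"
LD_LIBRARY_PATH="$here/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
    exec "$here/bin/myapp.real" "$@"
EOF
chmod +x "$pkg/myapp"

# The entire "package" is just this one archive.
tar -czf "$pkg.tar.gz" "$pkg"
```

This launcher trick is essentially what the official Firefox tarballs and many commercial Linux games already do, so it's not exotic at all.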
Some games are already distributed this way on GNU/Linux, through .run or .bin files, or just archives like the one I described above. On Windows the equivalent is setup.exe, and on Mac OS X I think it is .dmg, though I hear you can often just drag and drop things to install them. On PC-BSD they are .pbi packages.
Why is this not done more on GNU/Linux? Seriously, if people are worried about losing the advantages of what we have today, I think it's quite an irrational worry. You could still have repositories and programs like Synaptic which let you fetch whatever you want from one place and install it automatically to a preset location. It wouldn't even need to do any dependency checking, just download and decompress into something like a /programs directory.
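Such a frontend would be almost trivially simple. Here is a hedged sketch of what install and uninstall could look like under this scheme; the /programs location, the function names, and the one-directory-per-package layout are all my own assumptions, not any existing tool's behaviour:

```shell
#!/bin/sh
# Hypothetical installer for self-contained packages: no dependency
# resolution, just unpack each tarball into its own subdirectory of
# a /programs-style tree. PROGRAMS defaults to a per-user directory.
set -eu
PROGRAMS="${PROGRAMS:-$HOME/programs}"

install_pkg() {
    tarball="$1"
    name="$(basename "$tarball" .tar.gz)"
    mkdir -p "$PROGRAMS"
    # The tarball is assumed to contain a single top-level
    # directory named after the package (e.g. demo-2.0/).
    tar -xzf "$tarball" -C "$PROGRAMS"
    echo "installed $name into $PROGRAMS/$name"
}

# Uninstalling is nothing more than deleting one directory.
uninstall_pkg() {
    rm -rf "$PROGRAMS/$(basename "$1" .tar.gz)"
}
```

Note what falls out of this design for free: per-package removal is a single `rm -rf`, and two versions of the same application can coexist simply because they live in differently named directories.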
All you need is a stable core system that runs everything written for GNU/Linux, and a /programs directory for your software.
What do you think?