
CDE — Making Linux Portability Easy

timothy posted more than 3 years ago | from the namespace-collision dept.

Software 385

ihaque writes "A Stanford researcher, Philip Guo, has developed a tool called CDE to automatically package up a Linux program and all its dependencies (including system-level libraries, fonts, etc!) so that it can be run out of the box on another Linux machine without a lot of complicated work setting up libraries and program versions or dealing with dependency version hell. He's got binaries, source code, and a screencast up. Looks to be really useful for large cluster/cloud deployments as well as program sharing. Says Guo, 'CDE is a tool that automatically packages up the Code, Data, and Environment involved in running any Linux command so that it can execute identically on another computer without any installation or configuration. The only requirement is that the other computer have the same hardware architecture (e.g., x86) and major kernel version (e.g., 2.6.X) as yours. CDE allows you to easily run programs without the dependency hell that inevitably occurs when attempting to install software or libraries. You can use CDE to allow your colleagues to reproduce and build upon your computational experiments, to quickly deploy prototype software to a compute cluster, and to submit executable bug reports.'"
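The core move the summary describes — carrying every file a run touched, mirrored under a private root so the original absolute paths still resolve — can be sketched in a few lines of Python. This is a toy illustration, not Guo's code; `package_files` and the file names here are invented.

```python
import os
import shutil
import tempfile

def package_files(accessed_paths, package_root):
    """Mirror each accessed file under package_root, preserving its absolute path."""
    for path in accessed_paths:
        dest = os.path.join(package_root, path.lstrip(os.sep))
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.copy2(path, dest)

# Pretend a traced run accessed one config file...
src_dir = tempfile.mkdtemp()
dep = os.path.join(src_dir, "libexample.conf")
with open(dep, "w") as f:
    f.write("setting=1\n")

# ...then mirror it into the package, CDE-style.
pkg_root = tempfile.mkdtemp()
package_files([dep], pkg_root)

mirrored = os.path.join(pkg_root, dep.lstrip(os.sep))
print(os.path.isfile(mirrored))  # True: the dependency now lives at the mirrored path
```

On the target machine, a chroot or path-rewriting loader pointed at the package root would then see the same absolute paths the original run saw.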

385 comments

Isn't that three-letter acronym taken? (5, Informative)

Anonymous Coward | more than 3 years ago | (#34212442)

CDE will always mean Common Desktop Environment to me.

Re:Isn't that three-letter acronym taken? (2, Interesting)

Jeremiah Cornelius (137) | more than 3 years ago | (#34212472)

Me too.

Common to Sun and HP. :-)

I guess Ultrix, too.

Regarding this development - it's really what NeXT and later Mac OS X packages do. In the Windows world they have ThinApp and MS's App-V.

Re:Isn't that three-letter acronym taken? (1, Informative)

countertrolling (1585477) | more than 3 years ago | (#34212554)

The Mac has been doing this since almost the very beginning. The app was a single file, not even a package. The other systems, Windows and Linux, are madness. It's like splatter painting. Or, for your airplane analogy, it's like designing the instrument panel with a shotgun.

Re:Isn't that three-letter acronym taken? (1, Informative)

Anonymous Coward | more than 3 years ago | (#34212618)

Nope. Mac programs are still "packages". Even before OS X, you had the data fork and resource fork that held different parts of the app.

Right click any Mac app -- you'll get a context-menu item called "Show Package Contents".

Re:Isn't that three-letter acronym taken? (4, Insightful)

h4rr4r (612664) | more than 3 years ago | (#34212652)

That method guarantees security problems. Applications and their dependencies should be managed by a proper package management system.

Re:Isn't that three-letter acronym taken? (1)

aliquis (678370) | more than 3 years ago | (#34212684)

Depends on the code used though.

With open-source code and lots of borrowed code? Yes.

If you have written everything yourself from scratch anyway? No.

Re:Isn't that three-letter acronym taken? (1)

h4rr4r (612664) | more than 3 years ago | (#34212718)

So build a package; if the only thing in it is your code, that will make it very easy.

Practically no commercial software gets by without outside libs in some shape or form.

Re:Isn't that three-letter acronym taken? (4, Insightful)

countertrolling (1585477) | more than 3 years ago | (#34212766)

I prefer to avoid the disagreements over what is a "proper package management system". In fact, with each program in its own "sandbox", protected from the others, I see better security.

Re:Isn't that three-letter acronym taken? (1)

h4rr4r (612664) | more than 3 years ago | (#34212784)

So now you want everything in its own BSD-style jail?

That is sure to waste a lot of space.

Re:Isn't that three-letter acronym taken? (3, Insightful)

h4rr4r (612664) | more than 3 years ago | (#34212812)

It is also sure to piss off users who now have to have another Documents directory for each application.

Otherwise my bad application could edit a document used by another application that relies on an old, outdated, insecure library.

I am starting to think you are not thinking this through.

Re:Isn't that three-letter acronym taken? (1)

noidentity (188756) | more than 3 years ago | (#34212908)

How about only allowing a given app to edit files in Documents that have particular file extensions?

Re:Isn't that three-letter acronym taken? (1)

icebraining (1313345) | more than 3 years ago | (#34212690)

So when a library that 20 apps use needs to be updated, it gets downloaded 20 times?

Making a package isn't hard at all, and you can completely automate the process, making it effortless for subsequent versions of the app.

Also, there's no central updater for the system. Talk about madness.
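For what it's worth, the metadata at the heart of a binary .deb really is small. A hypothetical skeleton (every name invented), saved as yourapp/DEBIAN/control and built with `dpkg-deb --build yourapp`:

```text
Package: yourapp
Version: 1.0-1
Architecture: all
Maintainer: You <you@example.com>
Depends: libgtk2.0-0 (>= 2.20)
Description: Example application
 The library above stays a declared dependency, so the package manager
 upgrades it once for every app that uses it.
```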

Re:Isn't that three-letter acronym taken? (1)

Culture20 (968837) | more than 3 years ago | (#34212476)

CDE will always mean Common Desktop Environment to me.

Hear, hear! Of course, I was always an OpenWindows fan, since CDE rendered so slowly on our SPARC LXs.

Re:Isn't that three-letter acronym taken? (4, Insightful)

Haeleth (414428) | more than 3 years ago | (#34212670)

I am still waiting for Gnome or KDE to catch up with the efficiency and usability of these older environments.

KDE is getting closer now that it's possible for the desktop menu to present a list of applications rather than a handful of useless wallpaper-changing commands, but both major environments seem to be stuck on the stupid Windows 95-derived taskbar paradigm. Give me spatial management of running applications dammit! I want to develop muscle memory, not scan slowly across a list of tiny icons that are never in the same place twice.

Re:Isn't that three-letter acronym taken? (5, Funny)

Waffle Iron (339739) | more than 3 years ago | (#34212936)

CDE will always mean Common Desktop Environment to me.

I only used CDE briefly, but I remember that it was like a combination of the sheer visual elegance of Tk's widgets with the lush color scheme of a bordello.

It's About Time (1, Funny)

WrongSizeGlass (838941) | more than 3 years ago | (#34212456)

Making Linux easier for the masses is always a good thing.

+1 obvious, but it's as simple as this to make something more popular: the easier it is to use, the more people will use it.

Re:It's About Time (2)

h4rr4r (612664) | more than 3 years ago | (#34212506)

This does not make anything easier, it just makes it wrong.

The Ubuntu app center makes things easier without this sort of nasty kludge.

Re:It's About Time (4, Interesting)

couchslug (175151) | more than 3 years ago | (#34212544)

Making applications portable is handy for doing things like running them from a USB stick. It also makes backup much more convenient.

Copy the program and its data in one shot, carry it with you, and use it anywhere.

Windows apps are ahead of the game on this one:

http://portableapps.com/

Re:It's About Time (0)

Anonymous Coward | more than 3 years ago | (#34212576)

Isn't that the point? Making things more convenient, adding a feature that's already in the marketplace on other OS's, etc, etc. If the many flavors of Linux want to catch up on the desktop they need to make things easier for the other 99% of computer users.

Re:It's About Time (1, Interesting)

h4rr4r (612664) | more than 3 years ago | (#34212590)

If you need the same app everywhere it is easy enough to either make the data format portable or run the entire OS from the usb stick. Your method just lets you move outdated and almost guaranteed insecure software around. Windows only has this because it still lacks proper package management.

Bringing that sort of braindead thinking to Linux is a curse, not a blessing.

Re:It's About Time (1)

CannonballHead (842625) | more than 3 years ago | (#34212704)

Ubuntu package management requires online access, unless you keep an updated DVD of the latest normal dependencies.

That means, if you have a Linux distro sitting on a non-internet-ized computer, it can be difficult to run a random program because of the dependencies and libraries you need.

Granted, that's not going to happen in too many random users' setups, but it could happen.

Re:It's About Time (1)

h4rr4r (612664) | more than 3 years ago | (#34212742)

So you sneakernet the debs over. How else are you planning on installing/running something anyway?

Unless you somehow write software yourself that depends on libs you do not have installed. Which is a pretty silly thing to do.

Re:It's About Time (2, Insightful)

CannonballHead (842625) | more than 3 years ago | (#34212794)

... so isn't this basically just a way to gather all the appropriate dependencies and put them all into one spot?

Hm. I guess this is a way to do it without installing. Reading comprehension fail...

Still, I can see how it could be useful in some situations, just like having certain programs that don't require installation on Windows can be helpful.

Re:It's About Time (1)

h4rr4r (612664) | more than 3 years ago | (#34212822)

Those are only useful because Windows is broken. If you could install apps on Windows as easily as you can on Linux, from one central source, no one would bother with that.

They might make a launcher that calls firefox with their portable preferences folder, but no reason to go carting firefox around.

Re:It's About Time (-1, Flamebait)

icebraining (1313345) | more than 3 years ago | (#34212708)

Of course, there are so few GNU/Linux installations out there that it's almost irrelevant.

Re:It's About Time (-1, Redundant)

h4rr4r (612664) | more than 3 years ago | (#34212748)

Pathetic troll is pathetic. Ever been in a server room?

You think all those nice websites you like are hosted on Macs?

Re:It's About Time (0, Redundant)

icebraining (1313345) | more than 3 years ago | (#34212840)

I'm not a troll. I use GNU/Linux exclusively. I just never see them anywhere besides my university.

And no, I hadn't thought about server rooms - parent talked about portableapps.com, which is exclusively for desktop apps (or at least, I wouldn't run Gimp or AssaultCube on a server).

CDE 2 (2, Informative)

ukpyr (53793) | more than 3 years ago | (#34212460)

I'm just pointing out that a major application - one that's not so major anymore - Common Desktop Environment, uses this acronym :)
Does sound like a neat tool though!

dependency hell is not inevitable (0)

Anonymous Coward | more than 3 years ago | (#34212486)

I don't understand why some people insist that dependency hell is inevitable. I have been using Debian for 8 years and Ubuntu for 4 years, for a wide variety of purposes, and I have never experienced anything that I could describe as "dependency hell"... (unless you count trying to cross-compile my Linux code for Win32)

That being said, I think this CDE thing is cool and a good idea. I just don't think it is a good idea for the same reasons that the original poster specifies.

Party like it's 1988 (1)

countertrolling (1585477) | more than 3 years ago | (#34212488)

Wow, portable apps, just like an old Mac. Did anybody, for even a second think it's kinda weird for a program to splatter its parts all over the disk and into every directory it can find?

Re:Party like it's 1988 (5, Insightful)

h4rr4r (612664) | more than 3 years ago | (#34212516)

Wow, static linking, did anybody for even a second think it is kinda weird to have the same lib on the machine over and over and in every old exploitable version you can find?

Re:Party like it's 1988 (2, Insightful)

countertrolling (1585477) | more than 3 years ago | (#34212598)

That would all be nice if not for each program replacing shared libraries with its own and breaking the other programs. A program should not disturb the system or mess with its libraries. I prefer it remains as isolated as possible, where it can do the least damage.

Re:Party like it's 1988 (1, Informative)

h4rr4r (612664) | more than 3 years ago | (#34212640)

No, the program should just use the libraries the system has. This is what package management is all about. What you prefer is only the product of brain-dead OSes that lack proper package management.

Re:Party like it's 1988 (1)

aliquis (678370) | more than 3 years ago | (#34212714)

Except it doesn't.

Poorly written libraries or whatever may, but the package maintainer and package manager will make sure that won't happen.

Re:Party like it's 1988 (1)

icebraining (1313345) | more than 3 years ago | (#34212720)

That's why dpkg prevents installation if the package wants to overwrite others' files.

You don't need static linking to do damage prevention, you need a decent package manager.

Re:Party like it's 1988 (1)

lowen (10529) | more than 3 years ago | (#34212756)

That's also why RPM will not let a package overwrite another package's files. Wow, what a concept.

Re:Party like it's 1988 (0)

h4rr4r (612664) | more than 3 years ago | (#34212830)

To Windows users it really is. Either you know the better way, or you start to think about hacks like putting everything you need for an application in one file/directory.

Re:Party like it's 1988 (4, Insightful)

Haeleth (414428) | more than 3 years ago | (#34212722)

That would all be nice if not for each program replacing shared libraries with its own and breaking the other programs.

Do, please, show me just one widely-used program that does this on a recent UNIX or Unix-like platform.

A program should not disturb the system or mess with its libraries.

Right. That's why you should put programs you install under /usr/local, not straight under /usr. Or of course many programs like to be installed in their own self-contained directories under /opt, which is, er, basically exactly what you're asking for and has been common practice for decades.

Re:Party like it's 1988 (3, Insightful)

countertrolling (1585477) | more than 3 years ago | (#34212856)

That's why you should put programs you install under /usr/local, not straight under /usr.

Not the issue. That's a given. It's when I suddenly find out I don't have some bizarre version of GTK or ncurses (great name, because that's what I'm doing when I find it missing), and I'm suddenly without internet, that it gets a bit tense. I prefer the portability over raw efficiency. It is far and away one of the best things about a Mac. I can take something as bloated as MS Office or Photoshop straight from one machine to the next.

Re:Party like it's 1988 (2, Informative)

avaik6 (1927978) | more than 3 years ago | (#34212616)

I would LOVE to have mod points and vote this up... I always loved the fact that applications use the existing blocks on the system to do their magic, and by sharing the blocks you can have more apps in less space than on other platform$. But with this... if you install every f*cking package this way, you will end up having 12 different glibc versions (hey, maybe 12 copies of the same glibc version, one for each app that requires it), and 81725 copies of each library. I think it's a _GREAT_ tool to perform a fast, no-brains-available deployment of some in-development app, or to debug some weird shit. But please, do not pretend to install all my software like this... I don't want gedit to weigh 100 MB.

Re:Party like it's 1988 (4, Insightful)

Kjella (173770) | more than 3 years ago | (#34212842)

For packages provided by the distro, it makes sense to have them all use their complex dependency tree. For installing some other version side by side, this sounds like a great tool. The problem with dependencies is that often a pebble turns into an avalanche by the time you're done. If you want the new version of *one* KDE app, it can drag pretty much the whole of KDE, and every library they in turn depend on, with it in an upgrade. I've had that happen and ended up at 450 MB to download and install, and that would have pulled almost all packages out of LTS support.

From the user's point of view it's completely illogical to upgrade the whole system just because you want a new feature in Amarok 2.4 while your distro only packages 2.3; you expect one application to install or upgrade independently of any other application. That does not happen with Linux. It is not just about new library versions; via dependencies you pull in unwanted version upgrades. As for security, I'd rather have one potentially insecure package on my system than pull most packages out of support; it probably opens up more vulnerabilities than it prevents.

I wouldn't want to run a dozen applications like that. But if it's one or two? I got no problem taking the extra overhead of a bit more memory use. And honestly, a lot of software I use isn't in contact with the "outside world" as such. Even if there is an exploit in a library, I'd never open any file crafted to exploit it. Obviously it is good in general to patch stuff, but it's not always that critical...

Re:Party like it's 1988 (0)

Anonymous Coward | more than 3 years ago | (#34212892)

Wow, static linking, did anybody for even a second think it is kinda weird to have the same lib on the machine over and over and in every old exploitable version you can find?

If only there were some sort of global network over which applications could update themselves on a regular basis.

Re:Party like it's 1988 (1)

aliquis (678370) | more than 3 years ago | (#34212664)

Some Mac applications do too. It's just that there's nothing tracking it :D

The obvious disadvantages to this are that 1) it takes up more space, and more importantly, 2) libraries or whatever included in the applications may be long outdated, but nothing is tracking them or notifying you. You have to rely on noticing that the application is out of date and on the developer having upgraded anything vulnerable.

Personally I still want my Amiga :/

Copy libs/ fonts/ env/ if you want to. Don't if you don't want to =P

Copying them obviously makes removal somewhat harder, but I never saw a purpose in removing any library files anyway. The advantage of copying them was that Directory Opus could tell you whether the version you tried to replace it with was newer or not, and hence they would eventually get upgraded at the same time.

Package managers are good. With one package format for all platforms / large enough selection of packages (see Debian, FreeBSD, Ubuntu, Gentoo, ..) there sorta wouldn't be any need whatsoever for any of these portable or "single installation package" solutions.

A solution for a problem which isn't there, and which makes new ones.

Re:Party like it's 1988 (1)

rgmoore (133276) | more than 3 years ago | (#34212712)

Did anybody, for even a second think it's kinda weird for a program to splatter its parts all over the disk and into every directory it can find?

Yeah, that would be weird. It would be much more sensible for programs to behave the way they do under sane Linux package management, and put their parts into places defined by the filesystem hierarchy standard. That way it's possible to have just one copy of important files and libraries, so each program package can be much smaller.

Re:Party like it's 1988 (1)

Haeleth (414428) | more than 3 years ago | (#34212732)

Did anybody, for even a second think it's kinda weird for a program to splatter its parts all over the disk and into every directory it can find?

It's no weirder than the opposite, which is for your documentation to be scattered all over the disk in a whole bunch of different directories, and your commands to be scattered all over the disk in a whole bunch of different directories, and then when you want to fix a major security hole in a common library you have to completely reinstall 50 programs from scratch instead of just updating one single file.

Open your mind. Just because something is different from what Apple does, doesn't mean it's necessarily a stupid thing to do. Sometimes there is more than one mutually-exclusive right answer, and Apple can only choose one of them.

GPL Compliance warning! (2, Insightful)

Anonymous Coward | more than 3 years ago | (#34212490)

If those libraries are GPL or LGPL, then when you deliver the binary of the library, you must also deliver the source or an offer to deliver the source, and you must also deliver a copy of the (L)GPL, as part of the CDE. Is this done?

Re:GPL Compliance warning! (1)

noidentity (188756) | more than 3 years ago | (#34212926)

Correction: if the libraries are just about anything besides modified BSD/MIT/zlib-style licensing, you probably need to include the license and author information, at the very least. Not specific to GPL by any means.

Bad idea or worst idea ever? (4, Insightful)

h4rr4r (612664) | more than 3 years ago | (#34212494)

Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason; use them. Do not bring the errors of Windows to us.

Re:Bad idea or worst idea ever? (2, Informative)

99BottlesOfBeerInMyF (813746) | more than 3 years ago | (#34212624)

Great, now we can have outdated exploitable libs and every other kind of BS that comes with this. Might as well just statically link everything. Package managers exist for a reason; use them. Do not bring the errors of Windows to us.

Or you could just use OpenStep, get dynamic libraries and portable apps. This is a long solved problem.

Re:Bad idea or worst idea ever? (1)

DigitalContradiction (1189907) | more than 3 years ago | (#34212878)

It seems to me it's a good idea for the precise things mentioned in the article: quick prototyping, using software in an environment you don't fully control (like running some scientific simulation software on a cluster), and bug reports (although I'm more skeptical about that one). Not for managing and deploying software in general. So why so much scorn? It's just a tool for an end. CDE won't kill good software development and deployment practices, any more than Wine killed GNU/Linux FOSS software.

Year of linux (0)

Anonymous Coward | more than 3 years ago | (#34212496)

I think this breakthrough may just be the start of the year of linux (on the desktop).

-F

copying proprietary software (2, Insightful)

tommeke100 (755660) | more than 3 years ago | (#34212504)

This sounds like an easy way to copy installed proprietary software?

Re:copying proprietary software (1)

aliquis (678370) | more than 3 years ago | (#34212738)

As if it were ever hard to copy any data you can get hold of?

Spreading files all over the place for sure won't make it "impossible" or even hard to pirate the application.

Other things will try their best to prevent that, and fail.

Re:copying proprietary software (0)

Anonymous Coward | more than 3 years ago | (#34212750)

Nuke it from orbit! It's the only way to be sure.

Limitations (1)

Bryan Ischo (893) | more than 3 years ago | (#34212510)

Knowing what files a program will open without running it is impossible, and since a program can dynamically change what files it opens from run to run, it would be impossible to predict every file that a program would require in all situations. The best that a tool like this could do would be to record the files that were used during a given run of the program and, assuming that the program would use the same files when run later with the same inputs, support running the program with exactly those inputs.

I don't think you could reliably package up many programs in this way for general use, but maybe you could package up a program for a specific task, like producing a reproducible bug report. But that is of little utility, as most distributions that you would run a program on and need to submit a bug report for are fairly easily reproduced anyway, and the developer would probably have an easier time just reproducing the bug on their own system without fussing with some packaged-up binary that the bug submitter sent in.

This honestly seems like a wrapper around "strace | grep ^open". If this is what it takes to be a researcher at Stanford these days then sign me up!
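The strace quip captures the essence: the whole trick is recording which paths a given run opens. That idea can be sketched in pure Python by interposing on `open` — a hypothetical illustration only; the real CDE reportedly intercepts system calls via ptrace, so it catches every process and dynamically loaded library, not just Python-level opens.

```python
import builtins
import contextlib
import tempfile

@contextlib.contextmanager
def record_opens(log):
    """Temporarily wrap builtins.open so every opened path lands in `log`."""
    real_open = builtins.open
    def tracing_open(file, *args, **kwargs):
        log.append(str(file))
        return real_open(file, *args, **kwargs)
    builtins.open = tracing_open
    try:
        yield
    finally:
        builtins.open = real_open  # always restore the real open

# A stand-in for some file the "program" will touch.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"setting=1\n")
tmp.close()

accessed = []
with record_opens(accessed):
    with open(tmp.name) as f:  # everything opened inside the block gets logged
        f.read()

print(accessed)  # the one path this run opened
```

A packager would then copy each logged path into the bundle; the limitation Bryan raises stands, since a different run with different inputs may open different files.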

Re:Limitations (1)

martin-boundary (547041) | more than 3 years ago | (#34212700)

That's exactly what the tool does. Quoting from the CDE homepage:

CDE is easy to use: Simply prepend any Linux command with cde, and CDE will execute that command, monitor its actions, and automatically copy all files it accesses (e.g., executables, dynamically linked/loaded libraries, plug-ins, scripts, configuration/data files) into a package within your current working directory. Now you can transfer the package to another computer and run that same command without installing anything. In short, if you can run a Linux command on your computer, then CDE enables others to run it on theirs.

So there are no guarantees, and it's certainly no real solution to portability. However, it might be an interesting security monitoring tool: e.g., after recording "typical" program behaviour, you could clone the package for running in a locked-down environment, and if the program ever starts to behave strangely, it would likely crash because the files/paths it expects were never cloned.

Re:Limitations (2, Insightful)

h4rr4r (612664) | more than 3 years ago | (#34212764)

You could just use SElinux, which would already let you do that.

This looks like a solution looking for a problem.

Re:Limitations (1)

aliquis (678370) | more than 3 years ago | (#34212754)

I haven't RTFA and have no idea if they track the needed files. But the obvious alternative would of course be to compile everything needed to some other location than / and then pack it all together.

No need to track anything. Either everything needed is there or it's not.

Maybe is the fatigue (1)

Saija (1114681) | more than 3 years ago | (#34212522)

but I can't find which license this software adheres to*?
*just in case anyone wants to take a look and change some source code...

already done, already proven a bad idea (1)

KiloByte (825081) | more than 3 years ago | (#34212542)

Uhm, but the packaging system already present can do the same with less waste, better support for distribution, and with existing tools. File-based dependencies like in RPM may be slightly deficient here, but with package-based dependencies like in .DEB you have full control over what you want.

There were already multiple such systems working against the packaging system, like autopackage, and they turned out to be a disaster. I fail to see how this one is any different.

Re:already done, already proven a bad idea (1)

Burdell (228580) | more than 3 years ago | (#34212644)

Not sure what you mean by "File-based dependencies like in RPM may be slightly deficient here". RPM only depends on specific files if you need them; e.g. if you have package A with a config file /etc/a.conf that is also required by package B, package B will depend on the file /etc/a.conf (commonly found with scripts and dependencies on /bin/sh, /usr/bin/perl, etc.). If package B just needs package A to work, it'll depend on "A".

I really like where this is going. (1)

AnonymousClown (1788472) | more than 3 years ago | (#34212604)

It looks very promising, and hopefully it'll get to the point where installing software on Linux will be as easy as on Windows and OS X.

I wonder how far you can take this. For example, for a program that requires certain versions of Mono, glib, gphoto, and other libraries, would it be able to grab all of those?

Then, if you have something really complicated, you're going to eat up a huge amount of disk space.

Now, I'm going to ask myself, 'this is fucking awesome and wtf am I picking on it so much when I couldn't do any better?'

Time for my anti-dork pills....

Re:I really like where this is going. (1)

h4rr4r (612664) | more than 3 years ago | (#34212626)

Installing software on Linux is easier than on Windows or OS X.

Either
apt-get install foobar
or find foobar in the GUI package manager/app store/software center.

This just brings us the hell that is software on Windows: every app with its own outdated versions of every other damn lib, never mind the fact that it wastes space and provides a ton of security bugs.

Re:I really like where this is going. (1)

Lunix Nutcase (1092239) | more than 3 years ago | (#34212752)

Yes, if you ignore the case when the version of the package you want doesn't exist in the repos and it also requires dependencies of a higher version than are in the repos. And this isn't an uncommon case.

Re:I really like where this is going. (2, Interesting)

Haeleth (414428) | more than 3 years ago | (#34212786)

Installing software on linux is easier than on windows or osx.

For software that is in the package manager, yes.

Where something like this "CDE" might be handy is for software that is not in the package manager. Suppose you've written a program that is only of interest to a handful of users. There's no way it's going to find package maintainers for every major distro, and your users might not be happy building from source code.

There are also classes of software that are not allowed in the main repositories for some major distros like Fedora and Debian. For example, the authors of indie games might want to let Linux users play without making the whole game open source. Even if they open-sourced the engine, some distros will not permit it in the repos if its only use is to access non-free data.

Basically, "package once, run everywhere" is very appealing to a certain class of software distributor.

What I don't see is what CDE offers over any of the dozens of existing autonomous packagers. Or why they chose to confuse people by using the same name as the standard UNIX desktop environment.

Re:I really like where this is going. (4, Insightful)

bmo (77928) | more than 3 years ago | (#34212946)

Where something like this "CDE" might be handy is for software that is not in the package manager. Suppose you've written a program that is only of interest to a handful of users. There's no way it's going to find package maintainers for every major distro, and your users might not be happy building from source code

So do the packaging yourself. It's not hard. And when you're done, you have something sitting in the RPM or DEB database with all the others so you can keep track of it.

There are also classes of software that are not allowed in the main repositories for some major distros like Fedora and Debian. For example, the authors of indie games might want to let Linux users play without making the whole game open source. Even if they open-sourced the engine, some distros will not permit it in the repos if its only use is to access non-free data.

So set up your own PPA (or RPM equivalent) repository. Your customers can add the repository to their list and then keep track of the package. You seem to be under the impression that repositories are only for "approved" software, or that package managers can only handle a small number of entries. I have over 150 entries in /etc/apt/sources.list. Adding another one is no big deal. You also seem to think that licensing issues affect what you can put in a repository. They don't matter if you have your own repo. You could put commercial software in there, like Sun/Oracle do with VirtualBox.
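Concretely, pointing APT at a third-party repo is one line of configuration (the URL and components below are invented for illustration):

```text
# /etc/apt/sources.list.d/yourapp.list (hypothetical)
deb http://repo.example.com/apt stable main
```

After an apt-get update, packages from that repo install and upgrade through exactly the same tooling as the distro's own.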

Package management and repositories as they exist in the Linux world are a better way of distributing software, both free and commercial, than anything else I've seen on any platform.

This "CDE" doesn't solve any problems, but introduces its own "DLL hell".

--
BMO

Re:I really like where this is going. (1)

Homburg (213427) | more than 3 years ago | (#34212658)

hopefully it'll get to the point where installing software on Linux will be as easy as on Windows and OS X.

Why would we want that? I prefer the current situation, where installing software on Linux is much easier than installing software on Windows and OS X.

Re:I really like where this is going. (1)

Lunix Nutcase (1092239) | more than 3 years ago | (#34212780)

I prefer the current situation, where installing software on Linux is much easier than installing software on Windows and OS X.

Like when you have to upgrade your entire Ubuntu install just to get the latest version of Firefox? Whereas I can install Firefox 4 beta on a 10-year-old version of Windows?

Re:I really like where this is going. (1)

oiron (697563) | more than 3 years ago | (#34212824)

And have all the existing, open bugs of a 10-year-old version of Windows, along with all the bugs of a beta version of your browser?

Why not just upgrade when things are fixed?

err whut (0)

Anonymous Coward | more than 3 years ago | (#34212880)

I am far from a guru, but I seem to have no problem running FF 3.6.x pre dailies and also 4.x beta FF on my Ubuntu 10.04 LTS install. No need to upgrade to do it either; just enable the correct PPAs in the repository list.
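One way (assuming an Ubuntu box with the PPAs enabled) to see exactly which repository would supply each available Firefox version:

```shell
# Shows the install candidate and every version each enabled repository
# offers; the base distro's packages stay untouched either way.
apt-cache policy firefox
```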

Re:I really like where this is going. (5, Insightful)

Junta (36770) | more than 3 years ago | (#34212846)

Dear god no.

I do not want to execute installshield or any similar crap/wizard for every little thing I install.

I do not want to have a system tray/task manager full of two dozen vendors' update-checker processes, each individually bugging me about how I'm running WidgetFoo 1.8.1.20.1.3, and it is critically important that I execute WidgetFoo's custom one-off graphical update wizard with 3 or 4 pages to click through to get to 1.8.1.20.1.4. Then rinse and repeat once per app instead of knocking them out in one shot/dialog/icon/process.

I do not want each application to bundle its own ancient DirectX library or ancient Visual Studio runtime or any other similar crap.

Windows installs were not historically 'easy' due to any effort on MS's part (installshield and friends made an entire business out of covering for MS' lack of help, even as MSI matured into a usable solution). Linux (specifically Debian) really got this right first. Apple recognized that model and made it a great success on the iPhone, setting the tone for all of modern mobile devices. Debian did it right first and never gets the credit.

Re:I really like where this is going. (1)

houghi (78078) | more than 3 years ago | (#34212962)

It looks very promising, and hopefully it'll get to the point where installing software on Linux will be as easy as on Windows and OS X.

What now? Installing software is a two-step process.
1) Locate the software. The major distributions have a program and/or website that will do just that and find the majority of software, e.g. http://software.opensuse.org/ [opensuse.org]

2) Installing will most likely be done with the same program. Do you have openSUSE 11.3 and want to install, e.g., lbreakout2? Just click here [opensuse.org].

Yeah, sometimes you will need to compile the software yourself. That is, however, not an OS problem but a developer problem: they decided not to package it. And packaging can be done for many distros by using e.g. https://build.opensuse.org/ [opensuse.org], where you can build against openSUSE, SLE, Debian, Fedora, Red Hat, CentOS, Mandriva and Ubuntu (23 versions in total) all in one go.
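For reference, the command-line equivalent of that one-click install (assuming an openSUSE system) is just:

```shell
sudo zypper refresh              # update repository metadata
sudo zypper install lbreakout2   # resolve dependencies and install
```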

The last time I needed to compile something was about 3 years ago. I just look for something similar and use that instead.

This can all be avoided, (0)

Anonymous Coward | more than 3 years ago | (#34212606)

if you write proper packages for your distro. That is, after all, the whole point of the package manager...

Re:This can all be avoided, (1)

Haeleth (414428) | more than 3 years ago | (#34212798)

if you write proper packages for your distro.

Which distro is "your" distro? And what are people who use a different distro supposed to do? Write their own packages?

cross distribution compatibility (0)

Anonymous Coward | more than 3 years ago | (#34212630)

doesn't this make cross-distribution distribution of applications much easier?

meaning, as an example, if Adobe wanted to release Photoshop for Linux, it could without having to maintain 50,000 different package formats for every different version of every different distribution?

if so, this could be very awesome.

Re:cross distribution compatibility (1)

h4rr4r (612664) | more than 3 years ago | (#34212678)

You seem to be pretty bad at counting.

2 versions would be fine. RPM and Deb.

Re:cross distribution compatibility (0)

Anonymous Coward | more than 3 years ago | (#34212924)

so i can distribute my closed source app that will run on debian, ubuntu 6.06 and 8.04 with 1 deb and then fedora 12, opensuse 11.4, mandriva, and for good measure an older version of redhat enterprise linux and maybe even a mandrake?

all with 2 packages, a deb and a rpm?

really? sweet.

Re:cross distribution compatibility (4, Insightful)

icebraining (1313345) | more than 3 years ago | (#34212808)

Firstly, you don't need 5000; you need 4 or 5 for the most-used distros: Ubuntu, Fedora, openSUSE, Debian and Red Hat. Let the others figure it out from a tar file.
And if a company like Skype can produce those packages, so can e.g. Adobe.

Secondly, that already exists [wikipedia.org].

Good idea, Not enough. (1)

goruka (1721094) | more than 3 years ago | (#34212632)

This is great for application developers that put out new versions faster than distros can keep up (example: Qt Creator, which I always download as a binary from Nokia's site). The problem is when you want to try an app that is either part of something larger (KDE, GNOME) or an actual desktop environment; dependencies make it impossible.

Cloud computing?! (0)

Anonymous Coward | more than 3 years ago | (#34212758)

Um, if you are managing a cloud and cannot figure out how to package up your own code and install it as part of a kickstart then you need to be hit and have your sysadmin privileges revoked ASAP, not necessarily in that order.

Not meant for distribution (0)

Anonymous Coward | more than 3 years ago | (#34212774)

This is obviously not meant for distributing programs (and the guy doesn't claim it is either; he's using it for repeatable research). It takes a snapshot of the entire filesystem on the system on which it's run. This not only results in huge packages for anything nontrivial, but also simply won't work for things that are system dependent, like, say, OpenGL, or which sound system to use. It also restricts file system access to the CDE package.
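For context, the workflow in question (as the author demonstrates it; the script name here is hypothetical, and exact paths may differ) is roughly:

```shell
# On the original machine: run the command once under CDE. It traces the
# run and copies every file touched (binary, libraries, data) into a
# self-contained cde-package/ directory.
cde python myscript.py input.dat

# Copy cde-package/ to the target machine (same architecture and major
# kernel version), then re-run via the bundled loader:
cde-package/cde-exec python myscript.py input.dat
```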

Portable, standalone packages can be done the right way by compiling against an older version of GLIBC, including required libraries in the package (GLIBC is backwards compatible, so don't include that, and don't include the entirety of X11, or OpenGL, or sound drivers, etc), and then using either RPATH on the executable or a script that sets LD_LIBRARY_PATH so that the executable finds and uses your included libraries. This is how it's done by many commercial Linux games, it works fine.

Interesting, but worth caution. (0)

Anonymous Coward | more than 3 years ago | (#34212796)

Many apps aren't so simple. For example, say you have a GPGPU app that links against a hypothetical library dynamically. That ABI represents a cross-vendor interface so that, say, nVidia's or AMD's library could satisfy the link, depending on which hardware is installed. Pulling in that link could be very counter-productive. I don't think GPGPU works like that, but more esoteric things do. The other case is where a dynamic load or file operation is highly conditional. I presume he uses ldd and strace, following the syscalls to pick up the things ldd cannot. If the code doesn't happen to traverse the path, it won't get hit.
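The static/dynamic gap described above can be seen directly; a sketch, with a hypothetical ./myapp:

```shell
# Static view: only the libraries named in the ELF headers at link time.
ldd ./myapp

# Dynamic view: everything actually opened at runtime -- dlopen'd plugins,
# config files, data -- which is what a tracing packager would capture.
# Code paths not exercised during the traced run are still missed.
strace -f -e trace=open,openat ./myapp 2>&1 | grep -v ENOENT
```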

When positioned as a 'cloud' tool, it lends itself to exacerbating I/O load and memory load. It's a philosophy that undoes a lot of the efficiency that dynamic libraries were intended to provide. It may be fashionable to scream 'cloud' about every little thing nowadays, but shared libraries and common infrastructure are a good thing. Even if the apps will be separated by a harder virtualization layer anyway, degrading efficiency, memory-dedupe strategies can at least claw some of it back if common platforms are used; but if library fragmentation is encouraged, that too will be lost.

Now, in a high-performance computing setting, this is an interesting way to build a 'just enough' OS to run an application. That case is a tad different: each server has no more than one real application in flight at a time, and therefore need not worry about the memory inefficiency of running many apps on the same box without the full benefit of shared libraries. Deliver that small package on every reboot and you give job submitters ultimate flexibility over their environment without drastic speed penalties. There is no need for local disk to host the platform, and little to no memory waste, as the OS image contains only files that would be memory-resident anyway.

My extremely similar tool: dynpk (2, Interesting)

xgoat (629400) | more than 3 years ago | (#34212800)

I very recently published a tool [xgoat.com] that performs a similar task. dynpk (my tool) bundles programs up with packages from your system, then wraps them with some other stuff to create a bundle that essentially allows you to run a Fedora program on a RHEL machine (and probably Ubuntu or Debian, but this is outside my needs...).

Recompiling loads of libs for RHEL isn't fun or particularly maintainable. Therefore, use the ones from Fedora!

Sharing nethack "bones" file has never been easier (1)

cmdr_tofu (826352) | more than 3 years ago | (#34212820)

Seriously though, this could end up making collaborating on software a lot easier among trusted people.

Commercial equivalent? (1)

Skaven04 (449705) | more than 3 years ago | (#34212868)

If you're looking for a commercial product that can do just this -- but in a supportable way, and with lots of nifty features that enable building appliances and deploying them to the cloud -- take a look at rPath: http://www.rpath.com/ [rpath.com]. We're using it at our company on thousands of systems and love it!

CDE - good idea, try cpos-recovery on Sourceforge (1)

amazingcs (1157687) | more than 3 years ago | (#34212942)

A good idea, but there are easier ways to do the same task. The project cpos-recovery on Sourceforge does essentially the same task, but reliably and quickly. There are several ways this kind of quick copy can be used:
1) Back up to an alternate partition as a backup boot partition if you want your system to be bombproof.
2) Make an installable tar file that can be quickly applied to many new systems.
3) Create another partition of a different type and copy your system to it (think ext4 to btrfs), with the assurance that the system will boot and run properly (given that GRUB supports that filesystem).
4) Create an offsite backup of your working system in case of problems.
These are all easily done with cpos-recovery, so check it out.
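The installable-tar-file use case can be sketched with plain tar; the paths are hypothetical, and this is not the cpos-recovery tool itself:

```shell
# Archive the root filesystem, preserving permissions and skipping
# pseudo-filesystems and the backup destination itself.
sudo tar --one-file-system -cpzf /mnt/backup/system.tgz \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/tmp --exclude=/mnt /

# On a freshly partitioned target: extract, then reinstall the bootloader.
# sudo tar -xpzf system.tgz -C /mnt/target
# sudo grub-install --root-directory=/mnt/target /dev/sda
```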

Concurrent Versions (1)

mrawhimskell (1794156) | more than 3 years ago | (#34212948)

When I can comfortably run nightly, beta, and stable versions of Firefox in Linux, then I'll know this has been a hit. Looking forward to that.

Works, Out of The BOX !!! (0)

Anonymous Coward | more than 3 years ago | (#34212966)

About damned time. I said that this is what was needed back in the 20th century. NOW we can see Linux adoption begin to take off. Who amongst the great unwashed users can begin to figure it all out? NONE. Everyone just wants to click the download link and HAVE IT WORK OUT OF THE BOX !!!
