
Linux Descending into DLL Hell?

Roblimo posted more than 13 years ago | from the complex-software-causes-complex-problems dept.


meldroc writes "Or should I call it "Shared Library Hell"? The top story of LWN.net this week states that the newest version of GnuCash requires sixty libraries(!), many of which are new versions that aren't available out of the box with most Linux distributions. Managing shared libraries, with all their interdependencies and compatibility problems, is a major hassle. How should we go about dealing with the multitudes of shared libraries without driving ourselves mad or descending into the DLL Hell that makes Windows machines so unreliable?" Well, GnuCash 1.4.1 works fine for me, and I feel no immediate need to update to 1.6, the version that needs 60 libraries. But still a good point here.


Oh, come on, wussies (1)

Anonymous Coward | more than 13 years ago | (#149389)

Just because you have applications running with multiple versions of the same library from one desktop environment running under the other desktop environment with a window manager on top of X on top of an OS, what's the problem?

That's why I use gnofin. (1)

Anonymous Coward | more than 13 years ago | (#149390)

GnuCash has always been dependency hell. Try gnofin, the leaner alternative. http://gnofin.sourceforge.net/

Re:uh oh (1)

Anonymous Coward | more than 13 years ago | (#149391)

True dat. Frankly, it's the first thing that occurred to me when I learned about open source coding. The answer is having centralized and directed development, but that's not a GNU idea! I'd mod you up, but registering is far too much care for a news recycling site. --- Of course, I run OS9, Win95 and Win98SE so I don't really know anything about computers. I even run Internet Explorer. Even on my iMac!! I'm sure Guy Kawasaki and Linus will pee all over my grave. And I'm cool with that. I'll shut up now. Goodbye

Well (1)

Anonymous Coward | more than 13 years ago | (#149392)

I have noticed this too; there are just too many DLLs to deal with. A possible solution, although a bit far out I would admit, would be to have all the DLLs updated online, and when your computer needs them, it can access them in advance and have them ready for the program to run. When they are no longer needed, they can be discarded or saved, depending on how often they will be needed in the future, or on user preferences.

Star Office! (1)

Anonymous Coward | more than 13 years ago | (#149393)

What short memories! When StarOffice was statically linked, everyone bitched because it was freaking huge and slow. Now people are saying, "let's go to static linking!!".

Fools who ignore history are doomed to repeat it.

Re:Is there a real solution to this? (2)

Anonymous Coward | more than 13 years ago | (#149398)

http://www.linuxbase.org is trying to accomplish this. They have good backing, and if all goes well we should have a bit more of a standard library group.

Theoretically there should be no problem (2)

Anonymous Coward | more than 13 years ago | (#149399)

To get into serious trouble, simply disobey one of these rules:
  1. Perform proper versioning on your shared libraries. The simplest rule is: if your new version breaks binary compatibility (on any platform!), change the soname. Normally you would bump the major version number (see the sketch below). The versioning abstraction layer provided by GNU libtool is far from perfect, but it does its job.
  2. Make your shared libraries fully self-contained. Many (if not most) platforms provide everything needed to accomplish this. At runtime (and at compile time, for executables or shared libraries that your library is linked into) the system must be able to resolve all of your library's needs. The Executable and Linkable Format (ELF) provides the dynamic tags NEEDED and RPATH for this. The most prominent example of ignoring this rule is libpng: you think you need to say -lpng -lz to link against libpng? You are dead wrong! If libpng were self-contained, the linker would know not only that libpng needs libz, but also where to look for libz (in case it's not in /usr). So there would be no need to say -lz as long as you don't use libz in your own code.
But there are problems. First of all, GNU ld needs to be fixed: it fails to use the RPATH of shared libraries when looking for NEEDED libraries at link time. I sent patches, but nobody seemed interested.
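
For what it's worth, rule 1 in practice looks something like this; a minimal sketch using a hypothetical libfoo (names and version numbers made up):

  gcc -fPIC -c foo.c
  gcc -shared -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0.0 foo.o
  ln -s libfoo.so.1.0.0 libfoo.so.1   # runtime link; matches the soname
  ln -s libfoo.so.1 libfoo.so         # development link used by "gcc -lfoo"
  # A release that breaks binary compatibility gets a new soname:
  gcc -shared -Wl,-soname,libfoo.so.2 -o libfoo.so.2.0.0 foo.o
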
So long, happy linking.

Application directory anyone?? (1)

mAIsE (548) | more than 13 years ago | (#149404)

How about
/usr/local/gnucash/lib
/usr/local/gnucash/bin/gnucashinit.ksh

You could set your entire environment to point to an application-specific lib directory and you're done.
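
A minimal sketch of what such an init script might do, assuming the layout above (the paths are illustrative, not from any actual package):

  #!/bin/sh
  # Point the dynamic linker at the application's private lib directory,
  # then hand off to the real binary.
  GNUCASH_HOME=/usr/local/gnucash
  LD_LIBRARY_PATH=$GNUCASH_HOME/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
  export LD_LIBRARY_PATH
  exec "$GNUCASH_HOME/bin/gnucash" "$@"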

Don't approach a UNIX problem in a m$ mindset.

Re:It's not DLL hell that makes Windows unreliable (1)

Trepidity (597) | more than 13 years ago | (#149405)

But then again I use Opera v5.11 on Win98, and it doesn't go down in flames, so the OS remains stable.

Re:Libraries: Harken to the Bad Old Days (3)

Ian Bicking (980) | more than 13 years ago | (#149406)

When's the last time you saw a "disk full" error?
What, efficiency isn't worth anything these days?

But, really, this breaks down under certain usage patterns. On a system like Debian, where package installation is trivial compared to Windows, there are a ton of packages. I currently have 694 packages installed, though a significant number of them are libraries.

Consider another pattern -- extended environments. Gnome is an instance, as is KDE. I have 12 Python packages installed, and Python by itself doesn't even do anything. I won't speculate on how large Gnome or KDE are.

I have 41 gnome packages installed (or, at least packages named gnome*). What would happen if I had 41 copies of the Gnome libraries for these applications? What if packages had even greater granularity? What if I get to choose which applets I want installed? What GTK engines I want? Hell, I don't even know how engines could work with 41 copies of the libraries.

Symlinking identical libraries to save space wouldn't do any good, because that offers nothing better than what I have now (which works just fine, actually), where most libraries end up in /usr/lib. In a funny way, I suppose hard links would almost work right.

On Windows, per-application DLLs kind of make sense. On Windows, people don't have that many applications installed. Each application tends to do a lot on its own. This is partially the influence of commercial tendencies (people don't want to pay for something too small), and partially because small applications quickly become unmanageable on Windows. But Debian doesn't have this problem, and RPM-based systems have, well, not too much of this problem. Why "fix" what isn't broken?

Next you'll have us putting application files in directories named after domain names or company names, to avoid naming conflicts. /usr/applications/org/gnu/gnucash. Uh huh.

Re:Welcome to Windows? (1)

jedidiah (1196) | more than 13 years ago | (#149409)

...or barring that: just installing all of the packages related to a particular desktop. "The latest GNOME" satisfies most of the dependencies in question.

This is not what "DLL hell" is/was under WinDOS. This is actually the polar opposite: end user fright induced by being made aware of what exactly is going on and being the one managing it.

If the 5 required meta-packages have some conflicts with some older software, then this would be actual DLL-hell and cause for alarm.

Re:Windows, Gnome - same thing (1)

jedidiah (1196) | more than 13 years ago | (#149410)

It can be thought of as one huge monster package that is organized as a single unit. It's not as if you have 60 completely independent subcomponents that could potentially all be competing with each other.

What is under discussion here is not really a list of 60 components, but 5.

60 Libraries indeed! (2)

jedidiah (1196) | more than 13 years ago | (#149411)

According to the gnucash site, the requirement is for GNOME 1.4 plus 4 other libraries. This is hardly as complicated as the situation is being made out to be in the editorial.

Re:Yea but (2)

jedidiah (1196) | more than 13 years ago | (#149412)

...Another fine example of slashdot respondents not bothering to inform themselves on the issue before spouting off.

Most of the GNUcash dependencies are satisfied merely by "obtaining latest version of associated desktop". That hardly rates as "including every single doo-dad" or "spans 10 DVD's".

This is more a matter of a developer using "bleeding edge" libraries rather than creating some perverse octopus.

Re:Relax. (3)

Sludge (1234) | more than 13 years ago | (#149413)

I agree that code reuse can cause bloated software, in that libraries often have to handle a very general case of the problem, which takes much more than just coding your specific case yourself.

However, I can prove that code reuse isn't always bloat: the ANSI C library on your system. If the ANSI C library were statically linked, none of its code would be shared between your processes. When you run 'top' and a process takes up a few more megabytes than you thought it would, be sure to check the shared column.
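
You can check this yourself on any glibc system (the binary is just an example):

  # Every .so listed here is mapped into the process, not copied into it:
  ldd /bin/ls
  # libc shows up at the same inode in every process's map, so its pages
  # exist in RAM once no matter how many programs use them:
  grep libc /proc/self/maps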

Saying that code reuse causes bloat is not the whole story. Code reuse serves both sides of the bloat war.

\\\ SLUDGE

Relax. (4)

Sludge (1234) | more than 13 years ago | (#149414)

The software requirements [gnucash.org] list "60 libraries" because "The majority of the GNUCash 1.6.0 dependencies are satisfied by Gnome 1.4".

If major distros don't yet support the libraries of recent software releases, that's fine with me. The push for newer versions should come from bleeding edge software.

Aside from that, I personally commend the code reuse of GNUCash. Functionality needs to be reused as much as possible: We're working alongside giants. Let's stand on each other's shoulders.

\\\ SLUDGE

Don't Confuse DLL Hell with the Linux Situation (5)

Mihg (2381) | more than 13 years ago | (#149421)

Linux isn't experiencing anything remotely similar to DLL Hell.

DLL Hell is when Foo DLL 1.0 and Foo DLL 6.0 are both stored in the file foo.dll (unlike libfoo.so.1.0 and libfoo.so.6.0), and brain-damaged installer programs blindly replace foo.dll version 6.0 with foo.dll version 1.0, thus breaking every single program that depends on the newer version of foo.dll.
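
The versioned filenames are the whole trick: both generations coexist and nothing overwrites anything. A sketch, with hypothetical library names:

  $ ls -l /usr/lib/libfoo*
  libfoo.so.1 -> libfoo.so.1.0
  libfoo.so.1.0
  libfoo.so.6 -> libfoo.so.6.0
  libfoo.so.6.0
  # Old binaries keep loading libfoo.so.1, new ones load libfoo.so.6;
  # installing one never clobbers the other.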

Because so many of these crappy programs exist, Microsoft has made an attempt at fixing the problem by introducing the Windows File Protection mechanism, in which the operating system itself monitors file creations and modifications, watching for these stupid installers as they attempt to replace the new versions of the libraries with the old ones. The attempted changes silently fail, and the installer runs along its merry course without breaking things too badly.

Let the package maintainers take care of it (2)

Bruce Perens (3872) | more than 13 years ago | (#149424)

Once this software becomes part of a Linux distribution, the package dependencies will take care of those 60 libraries for you. At least that's the case with Debian; if Red Hat doesn't put this in their official distribution, it might be a bit harder.

Thanks

Bruce

Re:Application directory anyone?? (1)

buysse (5473) | more than 13 years ago | (#149436)

Duplication of data is not a valid mindset. Shared library versioning is the answer, young grasshopper.

Also, imagine if every app did this. And you people bitch about Redhat using up 2 gigs of disk now... :)


Re:Libraries: Harken to the Bad Old Days (1)

buysse (5473) | more than 13 years ago | (#149437)

applications will keep their own .DLL's in their own application directories

Dammit, I know that disk is cheap, but this makes me start wondering if the conspiracy theorists aren't bloody right! (Intel/Microsoft -- sell the OS and apps to sell new hardware, which of course comes with an OS... How many people would buy a bigger drive, vs just saying that their computer can't handle it and buy a new one?)

I mean, is this really the right answer? Do I need 20 copies of the same damned .NET DLL on my disk, one for each application? I think not. I do not consider this an intelligent move at all.

Re:Libraries: Harken to the Bad Old Days (1)

buysse (5473) | more than 13 years ago | (#149438)

I'll be honest -- from a technical standpoint, it's the best way to make a broken system (Windows) work without fixing the technical problem -- that you don't link against a version of a shared library, you link against a filename.

Honestly, the concept of having to keep N copies of a given file just offends me on some level... it just seems inefficient and wrong.

Really, how is it that my uber-l33t 1.2 GHz Athlon feels slower than a 2.8 MHz Apple II running a program with the same basic functionality -- a word processor? Just needed to get that off my chest. </rant>


Re:static goddamnit! (2)

buysse (5473) | more than 13 years ago | (#149439)

Um, you don't run a multiuser system, do you... the problem is that if this program is statically linked against 60 libraries, it's probably going to use up a fuck of a lot of memory. And, if you have 10 users running statically linked programs, you hurt. Badly. Especially since it's usually not just one program -- imagine Gnome statically linked. There are several resident programs, all linked against X and libc at a minimum (call it an extremely -- almost psychotically -- optimistic 5 megs each for library code, plus the app.) That's a lot of memory for 10 people running a desktop... and they might want to do work too. Shared libraries exist for a reason.

Difference from Windows... (5)

buysse (5473) | more than 13 years ago | (#149440)

The major reason we refer to it as dll hell on Windows is very simple -- there's no concept of a version. App A uses v6 of foo.dll, app B uses v8. It's still named foo.dll. Oops -- the API changed. Hell.

Unix systems have the concept of a version -- you change the API, you rev the major version of the library. The old one's still there if you need it, but apps will get (dynamically) linked against the version of the library they were originally linked against.
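
You can see that recorded in any ELF binary: the linker stores the soname, not the filename, in the NEEDED entries (the path and output here are illustrative):

  $ objdump -p /usr/bin/gnucash | grep NEEDED
    NEEDED      libgtk-1.2.so.0
    NEEDED      libc.so.6
  # At run time, ld.so resolves each NEEDED soname to whatever file
  # currently carries it, so a new major version can't hijack old apps.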

Yeah, it's a bitch (and a half) to compile all that shit -- I've compiled Gnome (and all dependencies of it) on a Solaris box from sources. It's a pain in the ass. But, as Bruce Perens said in another post, that's the job of the packager -- Ximian, RedHat, the Debian volunteers (thanks.)


Why doesn't Linux adopt a Mac OS X type scheme? (5)

VanL (7521) | more than 13 years ago | (#149447)

One of the most interesting features I read about in Mac OS X was the way in which they distributed binaries. They have a "package" that appears as one binary object, but it acts (for the loader and linker, I presume) like a directory. Inside are stored versioned libraries.

Couldn't something like this be done to reduce the clutter? Have "gnome-libs.pkg" which is actually a tar or tgz file. When an application needed to use a library, it would involve an extra step -- extracting it from the tarfile -- but that would only be on first load (after that it would be cached in swap) and the cost to retrieve the file would be minimal.

On the other hand, the possible gains would be enormous. Packaging would become simple. For most applications, install/uninstall could simply be putting the binary in bin and the libpkg in /usr/lib.

I guess what I'm thinking of is like GNU stow, just taken further. Why not make those directories into tarfiles?

Want to make $$$$ really quick? It's easy:
1. Hold down the Shift key.

Re:Let the package maintainers take care of it (2)

Goonie (8651) | more than 13 years ago | (#149450)

Indeed. GnuCash 1.6.0 is already in Debian unstable, in fact (thanks to the great work of the Debian maintainer).

Go you big red fire engine!

Re:Relax. (2)

Goonie (8651) | more than 13 years ago | (#149451)

Come on. I'm sure they could have done better. Installing gnucash on my system, not counting the libraries I already have installed, would take 35332k! (yes I actually checked).

Sure. You can then install Abiword, Gnumeric, Nautilus, Evolution, and Dia in another few megabytes. That's the point of shared libraries.

Oh, and by the way, have you checked just how much screenshot-heavy documentation gnucash provides?

Go you big red fire engine!

Re:Welcome to Windows? (4)

PD (9577) | more than 13 years ago | (#149457)

You forget the other part of the equation.

On Windows, the libs are called Dynamic Linked Libraries. On UNIX, they are called SHARED libraries. Of course, we all know they are the same thing, but apparently on Windows many people don't understand their purpose.

Part of the DLL hell is the vast amount of them that are unnecessarily created by people who don't understand when static linking will work just fine. I still hear people claim that DLL's magically keep the executable size small. DUH! All it does is unnecessarily chunk up your program, increase file count, and increase loading time.

So far under Linux I have hardly seen any abuses of this. Shared libraries are generally reserved for genuinely sharable code, and the rest is statically linked the way it should be.

It sounds like GnuCash is using shared libs correctly, and once the distros catch up we'll wonder what the fuss was about.

Use debian (1)

Ex Machina (10710) | more than 13 years ago | (#149460)

This is why systems like Debian's apt-get, which automagically satisfy dependencies, are good. Granted, 60 is a lot!

Re:Yea but (2)

jsled (11433) | more than 13 years ago | (#149463)

Indeed. In fact, the development of GnuCash helped push a couple of those packages to add new features [there was a lot of back-and-forth with respect to Guppi, for instance, that benefited both projects]. A few things that GnuCash had been including itself made it into the guile distro, IIRC.

The deps for GnuCash are at the forefront of the state of the art for the GNU desktop, but c'mon: this ain't no half-hour hack; it's one of the more serious desktop applications for Linux, and thus requires some recent libraries. Or, you could just dual-boot and run Micros~1 Money or Quicken...

...jsled

Re:Let the package maintainers take care of it (3)

MikeFM (12491) | more than 13 years ago | (#149470)

Tools like apt, Red Carpet, and the Apache Toolbox are doing a good job of handling this problem for me. I constantly run the newest versions of applications without any extra legwork.

Since the majority of Linux software comes in packages, we'll never enter DLL Hell the way Windows has, because you can always see where each file comes from and how the files relate, just by bothering to do some legwork.

Upgrading is no big hassle for the most part. Use something like one of the above-mentioned tools to stay updated, or, if you must resort to a program that is non-standard, try something like wget (or in many cases just plain old ftp) to pull in all the needed files.

Ever used Red Carpet? (1)

nft (12680) | more than 13 years ago | (#149471)

I don't really worry about library hell, cause I use Red Carpet from Ximian. It's free, it does security updates and such, and it resolves dependencies pretty well. I think Linux really needs more auto-updating software to make it less work to run. Red Carpet is just like Winders Update, cept there's a BUNCH of software available and I don't worry about anything breaking.

I like Ximian so much, I even bought a shirt from Think Geek. Hold me down...

-=nft=-

Re:Is it really necessary.. (5)

ethereal (13958) | more than 13 years ago | (#149480)

It's never necessary, unless of course you want one of the cool new features in that library. It doesn't take too many active developers before you're pulling in a lot of new technology and thus a lot of new library dependencies.

For most people this won't be a problem since their distro will figure it out, and for everybody else that's trying to install RPMs or build from source, take a look at the mailing list archives on gnucash.org [gnucash.org] . If gnucash didn't use brand-new tools like Guppi and Gnome 1.4 libs, much of the cool new stuff in gnucash 1.6 wouldn't be there.

It's not really DLL hell, since you can have multiple copies of the same library installed for use by different apps. DLL hell would be if gnucash blew away the libs that gnumeric needs, and then reinstalling gnumeric screwed up the libs that nautilus wants, etc.

Caution: contents may be quarrelsome and meticulous!

Re:Difference from Windows... (2)

flink (18449) | more than 13 years ago | (#149492)

Windows DLLs do have versions. That's what the ClassID is for. When you break binary compatibility, you generate a new ClassID.

Take, as an example, msado.dll. You want to use the Recordset object. We'll assume there are two versions. Your registry will look something like this:
(This is from memory, I don't have a win machine in front of me)

...
HKEY_LOCAL_MACHINE\Classes_Root\ADODB.Recordset\ClassID = {aaaaa-11111-blahblah}
HKEY_LOCAL_MACHINE\Classes_Root\ADODB.Recordset\ClassID = {bbbbb-22222-fooey}
...
HKEY_LOCAL_MACHINE\Classes_Root\{aaaaa-11111-blahblah}\InProcServer32 = C:\Program Files\Common\ODBC\msado15.dll
...
HKEY_LOCAL_MACHINE\Classes_Root\{bbbbb-22222-fooey}\InProcServer32 = C:\WINNT\System32\msado20.dll

The upshot of this is that if you use static binding, i.e. you point your compiler at the DLL, the ClassID of the dependency is compiled into your program, and you're guaranteed to get that version or later loaded at runtime.

If, on the other hand, you use late binding (CreateObject() in VB), the ClassID is determined at run time by checking the registry. If multiple copies exist, I think you get the one with the newest version string.

www.modules.org (1)

LL (20038) | more than 13 years ago | (#149495)

I'm surprised more people don't use it ... it's really useful for managing multiple versions of libraries and the different applications that depend on them. Unfortunately, when you start hitting C++ class interdependencies or zillions of versions it starts getting tricky ... which is where namespace management and scoping rules become more important.

LL

Solution: Mix Dynamic and Static (5)

GroundBounce (20126) | more than 13 years ago | (#149497)

One of the original reasons behind DLLs in the first place was to save redundant disk space and memory. This is still true, but when DLLs were first popularized on PCs by Windows 2.x (or whatever it was), most machines had a 20-30Mb hard drive and 1Mb of RAM.

Things have changed. While the larger, most common libraries (GTK, QT, glibc, X libs, gnome and kde libs) should remain dynamic, it would be helpful for binary packagers to statically link the smaller and more obscure libraries, especially if they are using a bleeding edge version that is not even in the most current distributions.

With a combination of static and dynamic linking, you'll achieve the majority of the benefit of shared libs because your largest and most common libs will be dynamic, but you'll be able to avoid much of the DLL hell and upgrade hell that accompanies packages that use bleeding edge libraries.
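
At link time the mix looks something like this; a sketch where "libobscure" stands in for the small bleeding-edge library (-Wl,-Bstatic and -Wl,-Bdynamic toggle how the -l options that follow them are resolved):

  # Statically link the obscure library, keep the big common ones shared.
  # The trailing -Bdynamic matters: it keeps libc and friends dynamic.
  gcc -o myapp myapp.o \
      -Wl,-Bstatic -lobscure \
      -Wl,-Bdynamic -lgtk -lglib -lX11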

It's not DLL hell that makes Windows unreliable (4)

Katravax (21568) | more than 13 years ago | (#149502)

How should we go about dealing with the multitudes of shared libraries without driving ourselves mad or descending into the DLL Hell that makes Windows machines so unreliable?

DLL hell is a small part of the problems Windows faces, but most of the better programmers started putting their libraries either in their app directory or static linking... Most every app on my Windows machine can be uninstalled by deleting a directory. But don't blame Windows instability on DLL hell. DLL hell is just another symptom of the same thing that causes the instability.

What makes Windows boxes unstable, plain and simple, is faulty drivers and applications. Out of the box, the NT series has been rock-solid ever since 1.0 (version 3.1). The Windows 9x series has also been way more reliable than its reputation suggests. Drivers provided directly by Microsoft have traditionally been very stable, even if not very feature-rich. The drivers provided by third parties, however, tend to suck overall. I would estimate 50% of the instability problems I see are related to VIDEO drivers.

The big thing people forget when they compare stability of Windows and other OSes is that in monolithic kernels, the drivers are provided by the guys that know the kernel, thus are typically more stable. I cannot say the same for many Windows drivers. In addition, something like a FreeBSD web server is hardly comparable to an end-user Windows machine, yet this is always the example held up by the anti-Windows crowd. Add MP3 software, graphics apps, kids' games, a crappy sound and video card, and all the other stuff people put on user machines and then see how stable the other OS is.

I'm not blind. I know that on the whole Windows boxes are not so stable. I'm a professional Windows developer. I can say from first-hand knowledge that the bulk of the problems with Windows are due to lazy, unknowledgeable, or sometimes hurried and overtaxed programmers. It's a real problem. I also keep a FreeBSD boot around and I'm very pro-GNU/Linux and especially pro-FreeBSD. But I program Windows as my work, and know that the instability blamed on Windows itself rarely has anything to do with code written by Microsoft.

Re:Why doesn't Linux adopt a Mac OS X type scheme? (3)

gutter (27465) | more than 13 years ago | (#149509)

I agree that the way Mac OS X does things is pretty cool, but you're slightly off. Applications are distributed as folders, but the GUI treats these folders as single objects (although there is a command to open the package). From the command line, they still look like folders. You launch them like 'open OmniWeb.app'.

Shared libraries are treated much the same way - they go in folders called 'OpenGL.framework' or whatever. They contain the library, the headers, other resources, and can contain multiple versions of the framework, which are versioned with a major/minor scheme - minor upgrades don't contain API changes, while major versions do - this way, programs can still use the right version of a framework if an incompatible one is installed, but can still benefit from framework upgrades.

I really do wish Linux and other UNIXes would move to this scheme - it's really nice.

Re:You wanna talk hell... (not NeXT style) (5)

gutter (27465) | more than 13 years ago | (#149510)

Actually, NeXT didn't necessarily bundle all the packages inside the app, although some applications did. I never owned a NeXT box, but most of this works the same way in Mac OS X, which I do use.

Most shared libraries on NeXT/Mac OS X are installed as frameworks. These frameworks are basically directories with .framework suffix, like OpenGL.framework.

Each framework contains a folder called Versions - this folder contains all the installed versions of a framework, which includes shared libraries, the headers necessary to use them, and any other necessary resources. They are versioned with a major/minor numbering system - minor versions get bumped for compatible upgrades, and major versions get bumped for API changes and the like. Programs know what version they were linked against, so if you install a new, incompatible version, the program can still use the old version. This pretty much eliminates DLL hell - you can install new versions without breaking old stuff.
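
Pieced together from that description, a framework bundle is laid out roughly like this (an illustrative subset, not a complete listing):

  OpenGL.framework/
    OpenGL -> Versions/Current/OpenGL
    Headers -> Versions/Current/Headers
    Versions/
      A/                # major version; bumped on incompatible changes
        OpenGL          # the shared library itself
        Headers/
        Resources/
      Current -> A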

Apple's frameworks go in /System/Library/Frameworks, local frameworks go in /Library/Frameworks/, user-installed frameworks go in ~/Library/Frameworks, and application frameworks can go in the application bundle. I'm not really sure of the precedence though.

It's a really cool system, and makes a lot of sense. Unfortunately, there seems to be a trend in Apple to install many unix shared libraries the regular way instead of as frameworks, to increase compatibility - makes many more things 'just compile'. I'd be much happier if more unix systems went to frameworks instead.

Re:Welcome to Windows? (1)

spectecjr (31235) | more than 13 years ago | (#149517)

Part of the DLL hell is the vast amount of them that are unnecessarily created by people who don't understand when static linking will work just fine. I still hear people claim that DLL's magically keep the executable size small. DUH! All it does is unnecessarily chunk up your program, increase file count, and increase loading time.

Hmmm... increase loading time? Imperceptibly if you use Rebase and Bind to tidy up your DLL use.

Unnecessarily chunk up your program? Not if it's done along functional divisions; one big advantage is that you can replace a faulty DLL in situ by replacing ONLY that DLL - not the entire program (which software designed to patch files might not necessarily be able to do).

Besides... who's worried? I've been told since 1998 by all KINDS of people that Linux has never and NEVER WOULD have ANY problems of this nature, because it's not fundamentally flawed like Windows. Surely they weren't all wrong?

Simon

Re:Difference from Windows... (1)

spectecjr (31235) | more than 13 years ago | (#149518)

Unix systems have the concept of a version -- you change the API, you rev the major version of the library. The old one's still there if you need it, but apps will get (dynamically) linked against the version of the library they were originally linked against.

As does Windows - it's not an 'exclusively Unix' kind of thing. The problem only arises when developers DO NOT follow the convention that you change the filename when the version changes.

Simon

there are solutions ... (1)

madmaxx (32372) | more than 13 years ago | (#149522)

a few systems have good solutions for dll/so madness ... though good design really reduces the problem. for example, nextstep and apple's os-x use resource maps/app packages, so the shared objects are mapped properly in the application package and can be stored there or in the system packages. tru64 uses a filesystem/kernel .so versioning scheme ... that allows multiple versions of libraries to exist and be served somewhat sensibly. and i am sure there are others. luckily, linux has the potential to improve on library hell ...

It seems to me that.... (2)

Foxman98 (37487) | more than 13 years ago | (#149529)

It seems to me that DLLs are both a blessing and a curse for Windows. I've always found that installing software on Windows tends to be much, much simpler than on Linux. For example, say you want to install program x on a Windows machine. Typically all that's required is running some type of setup.exe file, and the rest is taken care of. On the other hand, in Linux (Redhat for this example) one may download the rpm, only to find that many other rpms are required. Upon trying to install those rpms, one finds out that each new rpm requires others... It gets freakin annoying as hell. Is the doze approach better? Probably not. I mean, I hate having my win\system folder filled with crap that I no longer need after uninstalling a program. But at the same time, I know that installing a new program isn't going to take more than 5 mins or so. Keep in mind that I'm a redhat/mandrake kinda guy, so apt-get isn't too helpful for me. Surely there must be a more elegant solution?

Yea but (2)

brianvan (42539) | more than 13 years ago | (#149537)

...then you have a Linux distro that spans 10 DVDs just so you can include every single doo-dad program (and its libraries) on the planet. Next thing you know, MS starts advertising themselves as the quick and easy OS that needs only one CD to install. That, my friends, is hell.

Re:DLL hell? (1)

bludragoon (42763) | more than 13 years ago | (#149541)

Just say apt-get install and there's no hassle.

amen to that!
but remember, most people have no idea how powerful that can actually be.

Re:Application directory anyone?? (1)

Yoo Chung (43695) | more than 13 years ago | (#149542)

If you use application specific directories for shared libraries, you may as well statically link the whole thing in the first place.

shared library madness... (2)

_Quinn (44979) | more than 13 years ago | (#149543)

... is the primary disease which package managers propose to remedy. I would suggest creating `autopkg', a tool like autoconf and automake, used to automate the production of packages (starting with .rpm, .deb, and .tgz, perhaps). This way maintainers could just 'gmake distribution' when they announce a release, and the packages would be built automagically. More importantly, autopkg would be able to fetch shared libraries required during the build and recursively make them. Finally, if incompatible libraries are found, the installation process should wrap its binaries in scripts which set LD_LIBRARY_PATH to the necessary compatibility libraries (/usr/lib/compat) -- and those should be linked to _specifically by version_, so that different versions of compatibility libraries don't fight with each other.

-_Quinn

Windows, Gnome - same thing (1)

alehmann (50545) | more than 13 years ago | (#149544)

Windows has DLL hell. Well, so does Gnome. Gnome tries hard to emulate Windows in all ways.

Look at the motif version of gnucash - how many dependencies does that have?

Me, I write software with minimal dependencies. Often just the C library, but sometimes libraries like GTK or SDL.

Re:Relax. (1)

alehmann (50545) | more than 13 years ago | (#149545)

Come on. I'm sure they could have done better. Installing gnucash on my system, not counting the libraries I already have installed, would take 35332k! (yes I actually checked).

This isn't a troll. Code reuse is sometimes a good thing, but often it results in awfully bloated software.

Re:Windows, Gnome - same thing (1)

alehmann (50545) | more than 13 years ago | (#149546)

I didn't say that I liked it. My applications link against X, libc, glib, and gtk. No berkeley DB, sorry.

I think SDL is pretty great. Just because whatever messed-up version you got from your distribution was bad, well... SDL itself can use any of aalib, xlib, svgalib, or even GGI, IIRC, for video. It supports esd, oss, and/or alsa for audio. Every single one of these is optional, although it helps to include at least one :).

Re:That's why I use gnofin. (1)

alehmann (50545) | more than 13 years ago | (#149547)

I think that's a pretty funny thing to say about a Gnome program.

Re:Relax. (2)

alehmann (50545) | more than 13 years ago | (#149548)

Okay, but what if I don't _want_ any of those applications? Well, I have AbiWord installed but it has no Gnome dependency (as long as I'm a developer...)

The package maintainer for Debian should consider splitting gnucash into itself and a -doc package. That way people will have more flexibility. I like to install all documentation on my fileserver but keep my workstations light.

It's only DLL Heck (2)

hamjudo (64140) | more than 13 years ago | (#149561)

I thought we called them shared libraries...

Shared library versioning can be made to work, so it's only annoying to have to have multiple versions of a library installed. Some other OS's still haven't figured that out.

Another useful feature is that shared libraries don't have to be system-wide. The system doesn't share libraries based on the name; it's really based on the inode and mount point. So two different libraries can both be named libfoo.so.4.2 as long as they're in different directories. The user may get confused, but the system won't.

As long as everything has a reasonable license, we are allowed to fix it ourselves.

I assume the Debian package is already done, or will be done shortly. Type apt-get install gnu-cash and let it sort it out.

Re:Libraries: Harken to the Bad Old Days (5)

dbarclay10 (70443) | more than 13 years ago | (#149573)

then the next will really floor you -- applications will keep their own .DLL's in their own application directories. Just like in the DOS days, you will be able to blow an app completely off your machine by deleting its directory, and version differences will become irrelevant.

Well, not a totally bad idea. It'd help with DLL-hell-type problems, but let's raise a few points:

1) Firstly, decent packaging makes DLL hell much less likely. I use the experimental variant of Debian, and even then, there are rarely library dependency problems. The problems that do arise are usually easily fixable, as opposed to most of the DLL-hell problems that Windows has (i.e., two applications requiring incompatible versions of the same library).
2) Secondly, Linux (as a *nix variant) allows one to have multiple library versions installed at the same time, without trouble. That was one of the design considerations.
3) Security. The main reason for shared libraries isn't space-saving (as you imply), but rather security and stability. The FooBar application relies on library Wingbat. So does app YooHoo. Now, the Wingbat library has a security hole that was just found. Oops! Well, a patch is released, packages are made and sent out. You upgrade (or your computer does it automatically), and poof - all of a sudden, YooHoo, FooBar, and all the other apps that use the Wingbat library are more secure. Ditto with stability. The Wingbat library has a nasty bug which causes crashes. Okay, a fix is made, packages are installed, and now neither the YooHoo nor the FooBar apps are susceptible to that particular bug.

Anyways, I just wanted to say that the main reason for shared libraries isn't really the space issue. Nor is it a performance thing. It's a quality thing.



Barclay family motto:
Aut agere aut mori.
(Either action or death.)

Hehe (5)

Jailbrekr (73837) | more than 13 years ago | (#149576)

I'll bet the MS developers are rubbing their hands and giggling gleefully right now. "Oh yes, now they understand, now they UNDERSTAND!"

Windows isn't "DLL Hell" anymore. (1)

-=[ SYRiNX ]=- (79568) | more than 13 years ago | (#149581)

The introduction of MSI (Microsoft Installer, aka Darwin) eliminates DLL hell on Windows with applications that use it. It is a system-level, centralized installer, and as long as applications use that to install themselves, then the system can keep proper records on installed components and their versions.

Before MSI, every application had to use its own installer, and each installer might or might not be smart about checking the versioning/existence of DLLs before forcibly overwriting newer versions with older versions of the same library, or blowing away a more recent registry key.

In Linux, you don't run into as many installation nightmares because you don't HAVE installers to take care of installing shared libs for you--you're expected to be smart enough to install those yourself along with the app, and you're expected to handle versioning of libs appropriately via symlinks and so forth. But Linux, like Windows before MSI, lacks any centralized database of versioning information, so if you're not a diligent and careful system-admin type, you can get into the same scenario through human error.

GNU/Linux desperately needs some central, standard repositories of data and methods of acting upon them within the system, accessible to all applications, and application authors should be motivated to go through those central repositories/methods rather than bypass them and write their own ways. GNU/Linux is currently just as disorganized as the old MS-DOS era, when every application author wrote their own video and sound driver code from scratch only for certain supported hardware, or relied on such libraries as VESA for some lame level of standardized support. Those MS-DOS apps each had their own config files, sometimes in human readable form, sometimes in binary form, sometimes in script form, just as linux apps today all have to dump differently formatted config files into /etc. Linux needs the same concept of standardization and centralization that Windows brought to the MS-DOS PC with mechanisms such as DirectX and the registry.

Yea but what? (3)

knife_in_winter (85888) | more than 13 years ago | (#149590)

Never happen.

I just installed a Debian GNU/Linux system with 3 floppies (not CDs, not DVDs, but 3 1.44 FLOPPIES) and a network connection.

Once the system is up, I have access to, what is it now, over 6000 packages?

I hate to say this, but the network really *is* the computer, if you take advantage of it.

You wanna talk hell... (2)

Greyfox (87712) | more than 13 years ago | (#149593)

Try upgrading XFree86 to the latest thing on the XFree86 web site on a package-managed system. You can't remove X because everything depends on it. If you overwrite it, the next package update you do will likely clobber your changes. And tweaking everything to work right will be a major pain in the ass.

Speaking of package manager hell, there's been a lot of griping on various mailing lists lately about RPM database corruption. Apparently something breaks if you switch from db1 to db3, and it's VERY easy for your RPM database to be corrupted. Try doing any upgrades on an RPM based system when the database is corrupted.

The DLL hell thing's been around for a long time, too, as anyone who's ever made the mistake of trying to mix and match package-managed software knows (try compiling some gnome app that doesn't have RPMs sometime and you'll see what I mean). You have to go all or none with package-managed systems. Or try burning the Ximian-branded gnome off your hard drive sometime. That's good for a few hours' worth of fun.

So yeah, the whole set-up could probably be a lot more robust. Ideally it could be done without bundling all the requisite libraries in with the package, as I understand they used to do with the NeXT.

Re:static goddamnit! (3)

blakestah (91866) | more than 13 years ago | (#149598)

RAM usage.

If you use shared libraries, each gets loaded into memory ONCE. This is particularly good for something like the GNOME or KDE desktop. Do you really want 13 copies of the KDE libraries loaded into RAM because you used statically linked binaries??

Anyway, I wouldn't sweat it. This will not become Windows. Open source apps do not install new versions of existing libraries, and libraries increment version numbers when they break compatibility.

I haven't even thought about this in a long time since apt began taking care of it for me.

Its good to use lots of libraries (2)

bug1 (96678) | more than 13 years ago | (#149602)

The whole point of a library is that it's shared code; it's good that a project shares lots of code with other projects.

Imagine the bloat hell we would be in if everyone linked their projects statically to libraries.

Re:Why doesn't Linux adopt a Mac OS X type scheme? (1)

forgoil (104808) | more than 13 years ago | (#149610)

This is because they were actually thinking when they made NeXTStep, which grew into OpenStep and then MacOS X (iStep? MacStep? NextMac? ^_^). To the user, one icon == one application. Can it be any simpler? I love this, it's great.
What I don't know is how they fix the libraries that are shared... that is still a problem, a big one, and nobody seems to want to fix it. Some try to manage it, but that is seldom a way forward.

Debs (1)

vivarin (106778) | more than 13 years ago | (#149611)

Seems to me that apt-get install works quite well, even when the dependencies are hairy. Assuming of course that the packager knows what's really necessary...

Re:Libraries: Harken to the Bad Old Days (2)

John Miles (108215) | more than 13 years ago | (#149612)

Do I need 20 copies of the same damned .NET DLL on my disk, one for each application? I think not. I do not consider this an intelligent move at all.

When's the last time you saw a "disk full" error?

Personally, I'm in danger of forgetting what that means. When I retire a machine these days, there's always at least a few hundred MB, more likely a gig or two, of disk space left.

Along with the inevitable rise of COM, keeping DLLs out of c:\windows\system has gone a long way toward remedying Windows' version of DLL hell. "Intelligent" or not, there are some very good, real-world-type reasons why each app should maintain its own DLLs locally. Everything just works better that way.

Why use shared libraries? (2)

Animats (122034) | more than 13 years ago | (#149629)

The idea of shared libraries was supposed to be that multiple applications would share the same library. On a single-user desktop machine, the odds are against this except for the most common libraries.

Worse, when you bring in a shared library, you bring in the whole thing, not just what you need. With static linking, only the referenced stuff (ignoring the C++ vtable problem) comes in. Yes, some of it may get paged out later, but you pay for the I/O during program startup.

Now if you're using shared libraries which have shared state, that's something else. And it's something else that ought to be done via an object broker, like CORBA, not by abusing the shared library or DLL concept.

Sometimes running separate executables in subprocesses would be more appropriate. Things like file import filters ought to be run in separate processes, for example. That, after all, is the "UNIX way".

Guidelines (1)

spac (125766) | more than 13 years ago | (#149634)

Well, you've brought up a good point about a lot of the development that's going on under Linux right now. IMHO the best way to ensure that we don't damn ourselves to DLL hell is to set up a set of requirements that all library authors have to follow.

First of all, the interface to all functions and classes in a shared library MUST be backwards compatible with previous versions.

Secondly, there should be some sort of package manager that can connect to an online repository containing every major shared library (something like CPAN), so that libraries can be acquired, upgraded, and repaired easily without reinstalling a product.

A third solution to the DLL problem would be for an application to store all the libraries it uses within its own tree. Although this is redundant since several programs may use the same shared library, it ensures that an application's libraries will never be affected by the installation or removal of other applications on the same system.

Basically, to all you developers: if you're going to use shared libraries in your products, be responsible with them and never assume your app is the only one on the system. -Mike

Is it really necessary.. (1)

revengance (132255) | more than 13 years ago | (#149639)

.. to use so many libraries?

Re:Let the package maintainers take care of it (1)

revengance (132255) | more than 13 years ago | (#149640)

If I wanted to let the package maintainers take care of it, I would be better off using Windows. I use Linux because it allows me to fiddle with it, and if doing anything is going to break some dependencies, it is gonna be a sad day for Linux.

This is why 'Stable' distributions exist (5)

big.ears (136789) | more than 13 years ago | (#149642)

From what I understand, one of the primary reasons for having two or three levels of distribution is to avoid this dependency hell. In Debian, e.g., everything in Stable (should) play nice together, but the newer stuff is more likely to break other stuff. This means that a lot of the stable stuff is a year or more old, and because of Linux's nascent status, if you want something that is usable, you have to deal with conflicts.

I think this is one of the primary hurdles facing Linux's wider adoption. Nobody wants to mess with upgrading/downgrading libraries, and you rarely have to do that stuff in Windows. For example, I have never been able to get the newest versions of galeon, mozilla, and nautilus installed and working at the same time. And, perhaps unrelated, gnumeric and netscape 4.7 no longer work. Of course, it's not impossible to fix, and I'm not trying to sound like I'm whining, but I don't know too many of my friends who are Windows power users & programmers who would put up with this stuff.

Hopefully things will improve when libraries become more stable, and apps move into versions 1, 2, 3, and higher. Then, release cycles should get longer and less drastic, and everything should be easier to use together.

Stop Whining (2)

FnH (137981) | more than 13 years ago | (#149644)

This is the first time I've encountered FUD in a Slashdot article, and honestly, I expected that many readers would've already pointed out what I'm about to.

Unix and Linux don't have a DLL Hell like MS had. Unices have a quite simple and elegant solution for this problem.

Have you ever looked in your /usr/lib/ directory? Then you'll see that most (if not all) .so files have a version number appended. In contrast with Windows, this allows you to have two different APIs for the same library. The dynamic linker decides at run time which version is the right one, and hop ... you're happily on your way.

One more time: Linux doesn't have a DLL HELL.

Now stop whining. Unix has its flaws. This isn't one of them. You're making people nervous for no good reason whatsoever. Thank you.

Versioned libraries (2)

Alomex (148003) | more than 13 years ago | (#149655)

Automated replacing of shared libraries is the road to hell. Windows has proven that.

Each library must be versioned, and each package must explicitly request a version of each library. If the desired version is not available, the installer should deploy it.

Now that would be worthy of an advanced OS. Anything else is simply an open source implementation of a good but outdated OS (aka Unix).

Re:Versioned libraries (2)

Alomex (148003) | more than 13 years ago | (#149656)

If I had every version of a libary on my computer then my 4GB /usr directory would very quickly become bigger than my 40GB capacity

Versioned libraries do not imply having every version in existence. In all likelihood applications would support a range of versions, and as you update to newer versions of each application, older libraries can safely be removed.

Well it'll be a lot safer to only use the libary which the programme was developed for but the development cycles are way too short.

I think this is a peculiarity of the immature state of Linux. As it ages, changes will not occur as often, and improvements will be so minor that most people will hold off on upgrading until a full 1.0 change (only fresh installs would use the latest code).

What linux needs is just better scripts for creating packages and better error handling for when installing packages doesn't go right.

Those who do not learn from the past are condemned to repeat it. Automated replacement of shared libraries is the road to hell.

My opinion (2)

danpbrowning (149453) | more than 13 years ago | (#149657)

First, remember that unices can have as many versions of the same library installed at once, each having the possibility of being just a symlink, or being symlinked to.

I think that if it is too complicated for you to figure out the shared libraries, then you should stick with the standard distro packages.

I.e., wait until the gnucash 1.6 rpm (or deb) comes out for your distro. That way you are leaving the problem in the hands of the distro engineers.

Another solution is to lookup all missing libraries on rpmfind.net and install them as they are needed (yes, that's not the whole problem, but...).

IMHO.

static goddamnit! (1)

rtscts (156396) | more than 13 years ago | (#149660)

WTF is wrong with statically linked binaries? The authors get to play with dependencies, and the user gets a binary that works. If you wanna fuck about with your own library combos, compile from source, since you're obviously an expert.

Re:static goddamnit! (1)

rtscts (156396) | more than 13 years ago | (#149661)

To all who said "RAM usage":

No, I don't run a server for a huge number of users, but if I did then I'd probably not be on the bleeding edge anyway, and would take the time to fsck about with dependencies.

Besides, go to pricewatch.com and check the price of memory.

video drivers or gfx drivers? (2)

green pizza (159161) | more than 13 years ago | (#149664)

I've had very good luck with the drivers for my various standalone PCI video capture cards on my PCs. It's been the graphics card drivers that have given me nothing but trouble (especially early Matrox and NVIDIA drivers).

Re:static goddamnit! (3)

peccary (161168) | more than 13 years ago | (#149665)

Open source apps do not install new versions of existing libraries, and libraries increment version numbers when they break compatibility.

then how come cdparanoia (or any of a dozen other applications) insists on a different version of libc.so.6 than the one I have installed? Silly me, in twenty years of Unix experience I had naively expected that "requires libc.so.6" meant just (and only) that.
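
(Part of the answer is that glibc uses versioned symbols: two files can both carry the soname libc.so.6 yet export different symbol-version sets, so a binary built against a newer glibc really will refuse to run on an older one. You can see what a binary demands with something like the following; the output shown is illustrative:)

  $ objdump -p cdparanoia | grep -A3 'Version References'
  Version References:
    required from libc.so.6:
      GLIBC_2.1.3
      GLIBC_2.0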

Re:Is there a real solution to this? (1)

kz45 (175825) | more than 13 years ago | (#149678)

the idea of "STANDARD" libraries on a linux system, although needed, is against the ideals of the slashdot community, because linux is "freedom" and a standard set of libraries would mean "conformity".

Re:Application directory anyone?? (1)

gatesh8r (182908) | more than 13 years ago | (#149684)

This I have to agree with. It makes rational sense. Loki does this for their games, and has front-end bash scripts to set up the variables, etc., that are needed for the dynamic loading of the program. I am not a Loki developer, but try running a hex editor on what you think is a binary, just to see the file begin with #!/bin/bash :-)

The other thing I have to comment on is the universal question of upgrades: 1) is it too soon, and 2) is it worth it? Dependency hell sucks. Resolving all the dependencies only to see all sorts of other apps shatter sucks too. I agree. Sometimes upgrading isn't worth it, you know.

This next phrase sounds redundant, and it most likely is: let someone who gets paid to do so, or is willing to volunteer, weed out the dependencies. Hell, I'd wait for GNOME 1.4 and my other programs to resolve their dependencies if I wanted to do a whole system upgrade; by that time the next release of my favourite Linux distro would be out!

Otherwise, yeah, I agree with the parent post here. We should have a -separate- package for those who want to keep various other shared libs around, especially if they are core libs!

Re:It's not DLL hell that makes Windows unreliable (1)

jonnystiph (192687) | more than 13 years ago | (#149695)

Add MP3 software, graphics apps, kids' games, a crappy sound and video card, and all the other stuff people put on user machines and then see how stable the other OS is.

Alright, I did just that on a Slackware Linux box with a GUI, not just a shell, and, well, it's a hell of a lot more stable than any Windows machine I have owned.

What I am saying more than anything is that the reason I switched to Linux was Windows instability (well, among other things), but I didn't switch machines and I ran much of the same genre of software. So perhaps you are right, but in my experience that statement was incorrect.

Re:Difference from Windows... (1)

mkcmkc (197982) | more than 13 years ago | (#149699)

The major reason we refer to it as dll hell on Windows is very simple -- there's no concept of a version. App A uses v6 of foo.dll, app B uses v8. It's still named foo.dll. Oops -- the API changed.

Right. And along with that, the custom in the Windows community seems to be for each application installer to overwrite whichever DLLs they like with whatever version they like, potentially breaking your other applications in arbitrary and mystifying ways.

We're past that problem in Linux, since multiple versions can exist peacefully side by side.

--Mike

Re:Application directory anyone?? (1)

friday2k (205692) | more than 13 years ago | (#149710)

Maybe I am just plain stupid here, but is it not about _shared_ libraries? I'd rather put another abstraction layer in between, agree on standards for call interfaces, and call it done. Require a minimum version and do not kill the rest when you upgrade. But this requires stringent programming in a distributed environment. And agreements on standards. Welcome to committee-work hell (and is it so much better than DLL hell? Both are hot ...).

Welcome to Windows? (1)

Moridineas (213502) | more than 13 years ago | (#149719)

Kinda scary. DLL Hell has _long_ been a regular part of dealing with Windows, going back to at least 3.1 (my first experience with it). Today it doesn't seem like as big a deal, I'd guess primarily because available disk space and general system resources are so much higher than in those days. Still, the number of random DLL files in Windows is shocking.

I hope some sort of solution is worked out for Linux. One of the things I've always liked about Linux is the option of running a very streamlined system without tons of junk files lying around.

My experience with newer versions of Red Hat (and stuff like GNOME in particular) doesn't really bear that out anymore. It seems like just about every app beyond a certain core set requires a million dependencies now.

Maybe there is a better way of keeping track of, and storing, all these libraries?

Scott

DLL hell is a function of change (1)

KarmaBlackballed (222917) | more than 13 years ago | (#149730)

While shared libraries are evolving, there is a greater likelihood that folks will mix versions. True for Linux, true for Windows. There is no idiot-proof fix.


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~ the real world is much simpler ~~

Central Authority (1)

FreeMath (230584) | more than 13 years ago | (#149743)

Would it be easier if there were some sort of central authority keeping track of libs (tracking versions, checking for conflicts and overlap)? I am also led to believe that not all of this code needs to be dynamically linked. I prefer one big binary over a small one with 12 libs.
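
For what it's worth, trading the libs for one big binary is a one-flag change at build time; a sketch, with hello.c standing in for any program:

    #!/bin/sh
    gcc -o hello-dynamic hello.c           # the usual shared-library build
    gcc -static -o hello-static hello.c    # everything folded into the binary

    ldd hello-dynamic    # lists libc.so.6 and friends
    ldd hello-static     # reports "not a dynamic executable"

    ls -l hello-*        # the static binary is, of course, much larger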

Re:It's not DLL hell that makes Windows unreliable (1)

FreeMath (230584) | more than 13 years ago | (#149744)

I agree that bad drivers and shitty hardware can bring down the system. But Netscape or a crappy MP3 player crashing shouldn't bring down the whole OS. I use Netscape 4.7 (Motif) on Linux, and when it goes down in flames the OS remains stable.

Re:Difference from Windows... (2)

Twylite (234238) | more than 13 years ago | (#149749)

This, unfortunately, is not good enough. Replacing an old DLL with a new one can break things.

Too often the new DLL corrects old bugs for which a program had to use a workaround, which causes the workaround to fail. Alternatively, the new DLL can change functionality subtly or introduce new bugs, causing the same problems.

The ONLY option for true compatibility is to have multiple, separate DLLs and dynamically link against the correct one.

This does, however, cripple the OO capabilities inherent in the OS: if you update a DLL like MSVCRT, you can push look-and-feel changes to every app at once, keeping the system updated and consistent. BUT you do this at the risk of introducing the problems given above.
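
On Linux, the "multiple, separate libraries" approach is roughly what the soname convention provides. A sketch with a hypothetical libbar (bar1.c and bar2.c are stand-in sources): an ABI break gets a new soname instead of overwriting the old file, so existing binaries keep loading the version they were built against.

    #!/bin/sh
    # Version 1 of the library:
    gcc -shared -fPIC -Wl,-soname,libbar.so.1 -o libbar.so.1.0.0 bar1.c
    # Version 2 breaks the ABI, so it gets a new soname rather than
    # replacing version 1:
    gcc -shared -fPIC -Wl,-soname,libbar.so.2 -o libbar.so.2.0.0 bar2.c

    # ldconfig creates/updates the soname symlinks in this directory:
    ldconfig -n .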

Is there a real solution to this? (1)

samrolken (246301) | more than 13 years ago | (#149757)

How can everyone agree to make libs compatible, backwards compatible, and standard? Isn't there some way to accomplish this?

Re:Let the package maintainers take care of it (2)

jsse (254124) | more than 13 years ago | (#149766)

the package dependencies will take care of those 60 libraries for you. At least that's the case with Debian,

A package that relies on such a huge set of dependencies will cause many dependency problems when you apt-get upgrade once in a while. Slight mismatches cause packages to be held back, or failures like "...package xxxx is needed but yyyy is installed instead."

Remind me to remove gnucash tonight.

P.S. This applies to the Debian unstable/sid distro only. The stable version shouldn't have that much of a problem, because it's well tested and upgrades are less frequent.
/. /    |\/| |\/| |\/| / Run, Bill!

Shared code, DLLs &c (1)

os2fan (254461) | more than 13 years ago | (#149773)

So OK, I'm not a programmer, I'm an end user.

The whole idea of external libraries was that programs could share code or minimise system load. This is as true whether you think of DOS overlays, OS/2 DLLs, or whatever. The notion is already present in C (with header files). For example, under DOS there is no reason why, say, XTREE and NC could not be merged to provide different file management interfaces with shared file viewers. This could be extended to a generic command-line utility that simply views the named file.

When you set up an API for third-party use, you are inviting other people to use your routines. Since they will call your library and use it with certain expectations, it is up to you to meet your end of the deal.

You do not need to update the DLL every time you update the product -- look at VBRUN?00.DLL for proof of this. But if you have a DLL like, say, MFC42.DLL, where some programs use version 4 and others use version 6, then there is cause for concern.

This is probably a bigger concern for open source vendors than closed source vendors, purely because the former have a much faster release rate. For example, the EMX package for OS/2 is at 0.9e or something. Some programs have a minimal requirement, like 0.9c. But it is not beyond some future version to become incompatible with an earlier program, and then one has to run either the earlier or the later version.

An alternative would be to design interfaces that let users plug different components together. This sort of thing is used by File Commander/2, ZTBold, and Object Desktop, which load external utilities seamlessly, so you do not need separate libraries to open ZIP or RAR or CAB files in these utilities. What you do is have a dedicated ZIP or RAR or CAB utility, which programs can open, display, and use seamlessly. Adding a new archive format should be as easy as adding its INI header name to the ARCHIVES list.

One can use libraries intelligently, to minimise the amount of loaded and stored code. The power of the Unix pipe is its ability to thread diverse pieces together into something meaningful. The open software market has room to create intelligent libraries that may be used by diverse programs. Exploit this window of opportunity.
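
That pipe-style composition needs no shared library at all; a throwaway example (file names are placeholders):

    #!/bin/sh
    # A "generic viewer" assembled from small dedicated tools: unzip
    # extracts one archive member to stdout, less displays it.
    unzip -p photos.zip readme.txt | less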

Libraries: Harken to the Bad Old Days (2)

localroger (258128) | more than 13 years ago | (#149776)

Once upon a time there was this guy named Linus who had a Better Idea, which was to take this well tested OS built for mainframes but generally thought too complex for the desktop and put it on the desktop. And lo, Linus succeeded (with a lot of help from a lot of his friends), and we have Linux.

We also have desktop machines that are much more powerful than the mainframes on which Unix was first run. Some of the space-saving strategies that seemed like a Good Idea (tm) then seem like really Bad Ideas (tm) now. Kinda like what happened to a certain closed-source software company we could mention.

I would also mention that an associate of mine who gets advance word on what That Closed Source Software Company (tm) is doing has told me that their .NET strategy includes a number of actually intelligent moves. One is the removal of the Variant type from the Visual Studio suite. And if that doesn't seem like a good enough idea to you, the next will really floor you -- applications will keep their own .DLLs in their own application directories. Just like in the DOS days, you will be able to blow an app completely off your machine by deleting its directory, and version differences will become irrelevant.

Sounds like a good idea to me. Maybe youse linux guys should like thinka implementing something like this before the Redmond Juggernaut gets up another head o' steam.
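
Nothing stops a Linux app from doing this today. A sketch, with a hypothetical someapp bundle laid out as launcher + real binary + private lib directory:

    #!/bin/sh
    # Everything -- launcher, real binary, private libraries -- lives
    # under one directory:
    tar xzf someapp-1.0.tar.gz -C /opt     # "install"
    /opt/someapp/bin/someapp               # run (launcher sets LD_LIBRARY_PATH)
    rm -rf /opt/someapp                    # "uninstall": nothing left behind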

Re:Difference from Windows... (2)

Ayende Rahien (309542) | more than 13 years ago | (#149792)

Actually, MS's guidelines say that an installer has to make certain it doesn't replace newer DLLs with older ones. There is also a good mechanism to track DLL usage, so you know when you can safely delete a DLL.

Unfortunately, a lot of people ignored both recommendations, which caused the problem.

Although, I must say, I've not encountered such a problem on Windows in a long time. Has something changed?

--
Two witches watch two watches.

Re:It's not DLL hell that makes Windows unreliable (2)

Ayende Rahien (309542) | more than 13 years ago | (#149793)

I'm going to ignore most of the above; I can't agree with you, but I won't argue.

Most applications *can't* be uninstalled by deleting their directory.
You have registry keys, DLLs in Common Files, etc.

--
Two witches watch two watches.

Re:It's not DLL hell that makes Windows unreliable (3)

Ayende Rahien (309542) | more than 13 years ago | (#149803)

I agree, but you have forgotten two things:
A> The OS can only protect itself from applications whose permissions are limited. (I.e., as root, I can hose a Linux system, no matter how stable it is supposed to be.)
B> Windows 9x is a single-user OS, meaning that everything runs as root. We could spend a long time discussing how bad that is, but it was necessary for backward compatibility.
Win9x's instability is no surprise -- it allows direct access to hardware, for crying out loud. What you *could* blame is the Win9x design, which prevents the system from being stable.
MS did a wonderful job making it semi-stable, though. I mean, I can sit at a 9x machine and not get a BSOD for an hour at a time, sometimes. :-D



--
Two witches watch two watches.

Microsoft has already fixed this. (2)

slashdot.org (321932) | more than 13 years ago | (#149810)

<sound of fanfare>
Look at this page: http://www.microsoft.com/windowsxp/pro/guide/featurecomp.asp [microsoft.com].

Under 'Side-by-Side DLL Support' you will see that this has already been fixed in Microsoft's latest state-of-the-art operating systems.
</sound of fanfare>

Effectively nuking the whole idea behind DLLs, of course, but then again, they started off by putting the least-used functions in DLLs rather than the common stuff in the first place. So who cares...

The from the complex-software-causes-complex-problems dept. makes me choke, though. Puh-lease...

I just want to know that they do (1)

m_evanchik (398143) | more than 13 years ago | (#149814)

What's really important is that a library's functionality (in Win or Lin) is well defined. When you have no idea what a library is supposed to do, that's when you can't keep your setup clean. Libraries, used right, free up memory and ease development. Done wrong, they cause a big mess. There should never be anything on your system whose purpose you don't know.

Re:Application directory anyone?? (1)

Thurn und Taxis (411165) | more than 13 years ago | (#149821)

Um... what's the point of a shared library if it's not shared?

You ARE the Missing Link. Goodbye!

Re:Installing Free Software (2)

Tachys (445363) | more than 13 years ago | (#149828)

When did I say anything happens behind the scenes?

The point of my proposal is that nothing happens behind the scenes. It just works.

I can install apps like this in Mac OS and Mac OS X.

Also, in the Mac OS a "clean install" involves just replacing my system folder. You should have seen the look on my face when I found out what a "clean install" on Windows involved.

Installing Free Software (4)

Tachys (445363) | more than 13 years ago | (#149830)

It always seems to me that Free Software could get an edge with ease of installation.

Installing proprietary software requires putting in a serial number and then installing some updates.

But with Free Software I can always get the latest version, with no need for serial numbers.

Yet installing apps on Linux is more difficult than on Windows.

When I want to install something, what should happen is:

I download its tar file.

I double-click on it to uncompress it.

This shows a Gnucash folder.

It's installed!

I can go into the folder and open Gnucash.

I can then move that folder wherever I want.

Why can't Linux do this?
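
It can, if the app is packaged as a self-contained folder. The same flow from a shell, where gnucash-folder.tar.gz is hypothetical -- a build with the needed libraries bundled inside the folder:

    #!/bin/sh
    tar xzf gnucash-folder.tar.gz    # "uncompress": a gnucash folder appears
    ./gnucash/gnucash                # open it; the launcher finds its own libs
    mv gnucash ~/apps/               # move the folder wherever you want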

One wrong fact (1)

jdavidb (449077) | more than 13 years ago | (#149839)

No, UNIX has been around longer. Since ca. 1970, in fact. (Was Gates even born then? I wasn't.)

You do know Linux is basically UNIX, don't you?

UNIX was about as near to open source as possible when it first came out. You bought the source code on tape and figured out how to port it to your hardware yourself.

What do you mean "sell"? (1)

jdavidb (449077) | more than 13 years ago | (#149840)

Now, when you say, "sell," do you mean I get the intellectual property rights?

Oh, yeah, this is software. Never mind. Selling means the seller still owns it and the buyer gets a warranty for the physical media.

Good ideas, but I don't like LD_LIBRARY_PATH (3)

jdavidb (449077) | more than 13 years ago | (#149841)

if incompatible libraries are found, the installation process should wrap its binaries in scripts which set LD_LIBRARY_PATH to the necessary compatibility libraries (/usr/lib/compat) -- and they should be linked to _specifically by version_, so that different versions of compability libraries don't fight with each other.

Excellent plan. Just so everyone knows, though, LD_LIBRARY_PATH is rarely needed. In this case, it is only needed because the binaries are precompiled. If you ever have to set LD_LIBRARY_PATH, the software should be recompiled correctly!

Neat eye-opening information about LD_LIBRARY_PATH can be found at Why LD_LIBRARY_PATH is Bad [visi.com]

I don't think we're going to see anything analogous to the DLL problem because most shared libraries use explicit versions. But I would love to get rid of the madness of being told to set LD_LIBRARY_PATH to run software I just compiled! All you have to do is set LD_RUN_PATH during compilation. (See that link!) One notes that Perl's MakeMaker system always sets LD_RUN_PATH appropriately when compiling an extension module.
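
A quick sketch of the difference (application, paths, and library name all hypothetical):

    #!/bin/sh
    # Bad: every user must export LD_LIBRARY_PATH before running the app.
    # Good: bake the search path into the binary at link time instead.
    LD_RUN_PATH=/usr/local/myapp/lib \
        gcc -o myapp myapp.c -L/usr/local/myapp/lib -lfoo

    # Equivalent, passing the rpath to the linker explicitly:
    gcc -o myapp myapp.c -L/usr/local/myapp/lib -lfoo \
        -Wl,-rpath,/usr/local/myapp/lib

    # The binary now finds libfoo.so with no environment setup:
    ldd ./myapp | grep libfoo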
