Microsoft to End DLL Confusion 687
MankyD writes "ZDNet is reporting that Microsoft is attempting to do away with DLL version conflicts in its next version of Windows with a technology it calls 'Strong Binding'. When new programs attempt to overwrite old versions of DLLs, Strong Binding will index the DLLs and allow programs to reference them by a unique ID rather than by file name. Hopefully it will prevent a new program from breaking an old one. I would think this might add to DLL clutter, however."
Auto-DLL Management? (Score:5, Interesting)
Re:Auto-DLL Management? (Score:3, Interesting)
Those of us who use MS products, and rely on modified versions of DLLs for proper functionality (Macrovision removal in PowerDVD, for instance), will be screwed.
Re:Auto-DLL Management? (Score:5, Interesting)
The versioning problem could have been handled with an external naming approach (name-version.major.minor.dll) and then a resolver program. They are electing to use an internal ID, which implies that DLLs will not be in one place or will have different names. This will make it hard to swap out a DLL or know which ones to transport.
Actually, just thinking about it, I've heard that Longhorn is supposed to have a SQL-based file system. It would be easy to fold this into it and use the ID as a way to find the DLL. You will simply need to know it.
Re:Auto-DLL Management? (Score:5, Insightful)
If an app installer is so badly written that it messes up your installation then the software can't be much good either.
Re:Auto-DLL Management? (Score:5, Insightful)
1 - force all non-longhorn users to upgrade
2 - force all software vendors to code to the new .NET APIs
3 - integrate SQL Server into the OS
Also, imagine what a nice kick in the teeth to Java (which I'm sure is a bigger radar blip than WINE) this will be. I think this will backfire on them; lack of full backwards compatibility is *one* of the reasons why XP never took off. This one lacks any backwards compatibility, so you can just extrapolate the barriers to adoption.
Re:Auto-DLL Management? (Score:5, Insightful)
Gimme a break. DLL Hell has been a problem for a long, long time, so when they actually try to fix it, they are now only trying to fix it to break WINE? That's a stretch and I'm sure you know it. Yes, you can go back to a restore point, but that does not solve the problem: now one app works but the other doesn't, because they are both calling the same DLL expecting different versions. Restore points are a temporary solution to a legit problem.
If an app installer is so badly written that it messes up your installation then the software can't be much good either.
So using the current system, how do you propose that app developers deal with this? Say I compile against version 1 of a DLL and it works fine. Three years go by and the DLL has gone through 2 new versions, released with the latest service pack. Someone installs my application and it copies the old DLL over the new one; the system is now screwed. My other option would be to link dynamically, and since the new version is on the machine, my application simply wouldn't work. BUT using the new system, both my application and other applications would work fine, each using its appropriate DLL. So how is this a bad thing again?
Can anyone propose a better solution? If so I'm all ears.
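One way to picture the fix this commenter is defending: instead of one file slot per DLL name (where the last installer to run wins), the store is keyed by (name, version) and each app binds to the version it was built against. A minimal sketch in Python, with made-up names; this is an illustration of side-by-side versioning, not Microsoft's actual implementation:

```python
# Sketch: a side-by-side DLL store keyed by (name, version), so installing
# an old version never clobbers a newer one. All names are hypothetical.

store = {}  # (dll_name, version) -> file contents

def install(dll_name, version, contents):
    # Each (name, version) pair gets its own slot, so old and new coexist
    # instead of overwriting each other.
    store.setdefault((dll_name, version), contents)

def bind(dll_name, version):
    # Each application asks for the version it was compiled against.
    return store[(dll_name, version)]

# The scenario above: a service pack ships foo.dll v3, then an old
# installer copies its bundled v1 onto the machine.
install("foo.dll", 3, "new code")   # service pack
install("foo.dll", 1, "old code")   # old app's installer

print(bind("foo.dll", 1))  # the old app still gets "old code"
print(bind("foo.dll", 3))  # newer apps still get "new code"
```

Neither app breaks the other, at the cost of keeping both copies around.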
Re:Auto-DLL Management? (Score:5, Informative)
libfoo.so -> libfoo.so.4
libfoo.so.4 -> libfoo.so.4.3
libfoo.so.4.3 -> libfoo.so.4.3.2
libfoo.so.4.3.2 (this is an actual file)
libfoo.so.4.3 -> libfoo.so.4.3.5
libfoo.so.4.3.5 (4.3.2 file deleted)
This system has been around for decades in the Unix world.
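The symlink chain above can be reproduced literally; a small sketch (library name and versions taken from the listing above) that builds the chain in a temporary directory and resolves it the way the dynamic linker would:

```python
# Sketch: the Unix shared-library symlink chain, built with real symlinks.
import os
import tempfile

d = tempfile.mkdtemp()
with open(os.path.join(d, "libfoo.so.4.3.2"), "w") as f:
    f.write("actual library code")                       # the real file
os.symlink("libfoo.so.4.3.2", os.path.join(d, "libfoo.so.4.3"))
os.symlink("libfoo.so.4.3", os.path.join(d, "libfoo.so.4"))
os.symlink("libfoo.so.4", os.path.join(d, "libfoo.so"))

# An app linked against the major version "libfoo.so.4" follows the
# chain down to the concrete file:
print(os.path.basename(os.path.realpath(os.path.join(d, "libfoo.so.4"))))
# -> libfoo.so.4.3.2
```

Upgrading to 4.3.5 then only means repointing the `libfoo.so.4.3` link; apps that asked for `libfoo.so.4` follow along automatically.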
Re:Auto-DLL Management? (Score:5, Insightful)
Here's one of the main situations that Microsoft is trying to address: Microsoft ships FOO.DLL with Windows, or as part of some SDK (like DirectX). Company 1 develops an app, ships it, and on the app CD, it has a copy of FOO.DLL. Company 2 does the same thing.
Now. A bug is discovered in FOO.DLL, and it must be fixed. Unfortunately, fixing it one way causes app 1 to crash, and fixing it the other way causes app 2 to crash. And both apps link to the same version.
So what do you do? In the past, you just crossed your fingers and looked the other way, and tried to write code that behaved as best as possible in whatever circumstances you could think of. But inevitably bug fixes cause other bugs (regressions).
So, Microsoft is trying to solve this, by changing DLL binding, in two important ways:
1) DLL binding will use much more than just a name. Microsoft has developed a very powerful and flexible way to do DLL binding. This is what the article calls "Strong Binding."
2) You'll have total control over redirecting DLLs, on a per-app basis. Some docs here. [microsoft.com] This means that you can override DLL binding -- if app 1 MUST have version 4.3.5.1.34 of FOO.DLL, and app 2 MUST have version 4.3.5.1.35b, then you can write a simple XML file for each one, that controls exactly which version they get.
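For reference, the "simple XML file" described here corresponds to the .NET application configuration file mechanism; a rough sketch, with a made-up assembly name, public key token, and version numbers (note that real .NET version numbers have four parts, not five):

```xml
<!-- foo_app.exe.config: hypothetical per-application binding redirect -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="FOO" publicKeyToken="32ab4ba45e0a69a1" />
        <!-- this one app is pinned to the version it was tested with -->
        <bindingRedirect oldVersion="4.3.5.0" newVersion="4.3.5.1" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

The file sits next to the app's executable, so two apps on the same machine can be redirected to different versions of the same library.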
Anyone who reads some kind of evil into this is just plain stupid, and has never done any serious development. Any programmer worth their salt knows that DLL binding is ridiculously crude -- and that goes for every modern platform. Microsoft suffers from this more than most, and has therefore decided to do something about it, and has designed a pretty good system. If you have half a brain, you should check it out, and try to keep that knee-jerk reaction under control. (I'm not directing that comment at Grishnakh, but at all of the slobbering idiots who just flame Microsoft whenever they see the name in a headline.)
Re:Auto-DLL Management? (Score:3, Insightful)
I don't think that's so much the issue. One thing I *hate* is a seemingly simple installation that then requires a reboot.
But, as a bit of a Windows programmer myself, I understand it. If a program happens to require a newer version of some DLL, that is currently in use, it can't replace it. And with Windows' current system, you can't just use a local copy of it either.
When a program loads up a DLL, you can't specify a full path, just the filename. Windows has a specific search order, and the first place it checks is memory. "someapp.dll" is already loaded, so it uses that code -- even if you have a newer version in your own program directory.
I've always wondered why the hell they went with this approach. You have to watch for name conflicts between private DLLs (my program may happen to have "mp3.dll", which is completely unrelated to some other program's "mp3.dll"). And of course if an application uses a newer version of a system DLL (common controls library is a commonly-upgraded one), replacing the system-wide DLL is required, and naturally a reboot is required. And there's always some chance this upgrade may break an older application...
You can forget about renaming a copy for private use, too; many of the system DLLs reference each other by name. It works if you hex-edit them to reference the new names (I did this only as a proof of concept with the VB runtimes)...
In my opinion, the best/easiest solution for developers would be to change the search order to start in the application directory -- or better yet, only do this if some registry flag is set, so older apps can keep the current default behavior...
However, anything to alleviate this would be nice...
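The search-order problem this commenter describes can be sketched in a few lines. This is a toy model of the behavior as described in the comment (already-loaded modules win), not of the full documented Windows search order, and all the DLL names are hypothetical:

```python
# Toy model of DLL name resolution. Each "location" is a dict of
# module name -> code; first match wins.

already_loaded = {"mp3.dll": "other app's codec"}   # loaded by another app
system_dir     = {"mp3.dll": "system codec"}
app_dir        = {"mp3.dll": "my private codec"}    # my program's own copy

def load_default(name):
    # Behavior described above: an already-loaded module with the same
    # name shadows the private copy sitting in my own directory.
    for place in (already_loaded, system_dir, app_dir):
        if name in place:
            return place[name]

def load_app_first(name):
    # The commenter's proposed fix: check the application directory first.
    for place in (app_dir, already_loaded, system_dir):
        if name in place:
            return place[name]

print(load_default("mp3.dll"))    # other app's codec -- name collision
print(load_app_first("mp3.dll"))  # my private codec
```

With app-dir-first resolution, two unrelated programs can each ship their own `mp3.dll` without colliding.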
Re:Auto-DLL Management? (Score:3, Funny)
So, too, the system registry.
We now return you to your shared delusions.
Re:Auto-DLL Management? (Score:4, Informative)
There is a massive community of XP styles, all centered around a hacked uxtheme.dll. If Explorer looks for a specific version of uxtheme.dll instead of just the filename, they're SOL.
Re:Auto-DLL Management? (Score:3, Interesting)
Re:Auto-DLL Management? (Score:5, Insightful)
Sounds like a great idea... each and every program will have its "own" DLL
Call me old-fashioned, but I thought the point of dynamic libraries was to reduce program size and duplication of effort by allowing multiple programs to load common functions out of libraries.
So, it's a great idea, insofar as it completely negates the advantage of having DLLs in the first place. Why don't they just compile statically instead?
Re:Auto-DLL Management? (Score:4, Insightful)
Re:Auto-DLL Management? (Score:3, Informative)
I think this simply works by allowing different versions of the same DLL.
Right - given N programs using foo.dll, how many different versions of foo.dll do you expect will be needed? Based on my experience with Windows, I would expect somewhere between N and 2*N (allowing for "upgrades" of N[i] which won't remove the "old" DLL version).
Yes, theoretically, you'll have a smaller number of DLLs than you do programs. Realistically, I don't see it happening. The various versions of Microsoft's XML DLL (which comes to mind because of the security patch which requires you to figure out which versions you have installed, which is always more than 1) illustrate this perfectly.
hilarious... (Score:3, Informative)
Re:Auto-DLL Management? (Score:3, Interesting)
Get rid of it already! This patchwork approach is, well, I guess it's putting a roof over my head. But it is also needlessly complicating an already excessively complicated system.
It really is getting to be time that Microsoft put Windows in an emulator and rewrote the functionality of the OS. It will never happen, but it is time.
-C
Re:Auto-DLL Management? (Score:3, Interesting)
No, I get the point completely. It isn't a complex design.
My point is about what the end result will be after programmers use this: cruft. Major cruft. Cruft beyond all cruftiness of cruftension. So it has always been with apps that get picky about library versions, and so it will always be. I have seen this happen to Unix applications, I have seen it happen to Windows applications, I have seen it happen to COM/DCOM, and I recognize the beast when presented as .NET.
Assemblies in .NET are quite powerful and useful.
No doubt. In the programming world, however, power is rarely wielded wisely or for good.
Re:Auto-DLL Management? (Score:5, Interesting)
You see, if you just wanted to split your executable code over several files for one reason or another, you could always include DLLs with your program (in its own directory). Those wouldn't ever be changed by Windows, but this has nothing to do with the registry. The whole idea behind registered DLLs that reside in a centralised place is shared libraries: code used by several programs is stored only once, in one place where you can do easy updates.
However, now that there are so many versions of Windows out there, Microsoft is experiencing compatibility issues with DLLs and they're doing something about it. I'm not familiar with the details of their solution and don't want to say it's a bad one at all. But your ideas are a little too extreme; saving one copy of a DLL per program is just absurd.
Re:Auto-DLL Management? (Score:3, Insightful)
Like Unix but with more Cruft (TM) (Score:3, Flamebait)
And here comes M$, taking the same idea and adding a point of failure in the form of some binary index of DLLs. Jeez, this is just another thing I'm gonna have to fix when my Windows friends start having trouble with their computers. Really unnecessary. Couldn't they have just outright copied the Unix method? At least then they would have done it right.
DLL vs static libs (Score:5, Insightful)
In the old days, when disk space and memory were precious goods, it made sense to share code. But today, what's the burden of 4MB extra app size compared to the DLL misery?
Except for plugins, I see no reason why developers would need DLLs. Can anyone shed some light here ?
Re:DLL vs static libs (Score:2, Insightful)
Plus, if you're going to support them for plug-ins, then the system is in place anyway.
Plus, if you know what you're doing, there *is* no DLL hell - install them in the directory where you install the app, rather than in a shared directory, give them a sensible name, and you're sorted.
Re:DLL vs static libs (Score:5, Informative)
When 1GB of memory becomes standard, then maybe. In any case, it is downright inefficient to have the same code in 3 or 4 places in memory, even if you have loads to spare.
Re:DLL vs static libs (Score:4, Interesting)
Dropping the fancy XP theme frees up about 5 megs of RAM, but if your system only has 128 or 256 megs of RAM you don't have a lot of room to load apps.
Re:DLL vs static libs (Score:5, Informative)
Re:DLL vs static libs (Score:5, Interesting)
Actually, I think the reason XP takes up so much RAM is that it is loading a ton of USELESS DLLs and COM objects into RAM so that IE will load faster and Word XYZ will open immediately -- the .NET Framework, the VB runtime, and the list goes on. A whole bunch of junk that really should not reside in RAM, just so that MS's stuff loads faster than competitors' products. DLLs are a great idea, but Microsoft has found that they have too many DLLs and they are disorganized.
The idea that an operating system has a standard core set of DLLs that all programs can use to add functionality is a WONDERFUL IDEA! Microsoft is only finding out that there is such a thing as too much of a wonderful thing. They have been writing DLLs and COM components for Windows for almost ten years, and they need to keep backward compatibility with all of them, no matter how ugly or badly implemented an original DLL was. This method will allow them to stop having to keep backward compatibility (but leads to other problems, such as bloat). They will still have too many DLLs that are useful to only one or two programs (and this solution won't make that better) and really should be statically linked or put in a private directory of the XYZ program (and uninstalled when the program is), but this is a good step in the future management of DLL cruft, because they can at least start the process of breaking backward compatibility of badly designed libraries.
In theory; you're right (Score:3, Insightful)
Unfortunately, even when 1GB is standard, the problem is that people will be running Windows KAE-T (Kick Ass Experience - Trusted) which requires about 927MB of memory without themes.
Re:DLL vs static libs (Score:5, Interesting)
It also preserves ram by sharing common code between different applications.
It also makes upgrades and bugfixes easier (think openssl, for example).
Re:DLL vs static libs (Score:5, Insightful)
Plus, if two running processes are sharing a shared library, then a task swap doesn't completely blow away the cache.
Finally, I think with Windoze there is a mode in which DLLs can have global data, shared among all programs using them. Not sure though; it's years since I did any Windoze programming.
Re:DLL vs static libs (Score:3, Interesting)
I currently have 6 programs running, I think that's pretty typical.
How many processes? 35.
Most running programs are less than 50% library code.
Hm, I'll read that as "running processes", since that's what's important.
Let's take one at random: winampa.exe
(winamp's "agent", the little toolbar dealie)
Total image is ~8MB
This includes:
- ntdll.dll: 492kB
- kernel32.dll: 724kB
- user32.dll: 400kB
- gdi32.dll: 240kB
- advapi32.dll: 368kB
- rpcrt4.dll: 448kB
- shell32.dll: 2300kB
- shlwapi.dll: 400kB
- msvcrt.dll: 280kB
- comctl32.dll: 552kB
- imm32.dll: 104kB
- shw95dll.dll: 124kB
- wow32.dll: 256kB
- ntvdm.exe: 644kB
- comdlg32.dll: 248kB
- version.dll: 28kB
- lz32.dll: 24kB
- winampa.exe: 36kB
I don't know about rpcrt4 or some of these, but shell32.dll alone is a huge fraction of its total size. Well over 50% of the memory use of this process is shared.
Trivial apps aren't, but how many trivial apps do you run at the same time?
Is this a trivial process? That's about 7MB of shared code. If this is a typical amount (I believe it is), apps have to use >14MB to be less than 50% shared code.
How many do? Only 5 of the 35.
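The arithmetic above checks out; summing the shared modules in the listing against the ~8MB total image (sizes as given in the comment, treating everything except winampa.exe itself as shareable):

```python
# Sum the shared modules listed for winampa.exe and compare against
# the ~8 MB total image size quoted in the comment.
shared_kb = sum([
    492, 724, 400, 240, 368, 448,    # ntdll, kernel32, user32, gdi32, advapi32, rpcrt4
    2300, 400, 280, 552, 104, 124,   # shell32, shlwapi, msvcrt, comctl32, imm32, shw95dll
    256, 644, 248, 28, 24,           # wow32, ntvdm, comdlg32, version, lz32
])
total_kb = 8 * 1024

print(shared_kb)                         # about 7.5 MB of shareable code
print(round(shared_kb / total_kb, 2))    # well over 50% of the image
```

So the process really is ~93% shared code by these numbers, which supports the "well over 50%" claim.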
Re:DLL vs static libs (Score:5, Insightful)
Bugs.
If there is a bug or a security problem in a DLL you only have to replace that DLL instead of all statically linked programs.
Remember the problems with zlib [gzip.org] a year ago? Would you like to replace 500 applications [gzip.org] or one library?
Re:DLL vs static libs (Score:3, Insightful)
This new fix seems to break that. You'd need 10-20 different versions of the DLL to overwrite the "unique IDs", if it is possible at all. Another "fix" that treats the symptoms and introduces new "features" instead of addressing the real problem (lack of code-proofing and immature languages).
This can be quite entertaining in the future. I sure hope my company steers clear of Longhorn.
Re:DLL vs static libs (Score:5, Insightful)
It's a classic case of the elimination of duplication in computer code. By duplicating (statically linking) the library code into every app, you only increase the burden when you want to update the library.
Furthermore, on Debian GNU/Linux I never have "DLL misery" because my operating system is not brain-dead. Multiple versions of shared libraries can coexist, there is a consistent versioning scheme, and a competent team of people who check to make sure that apps use the correct versions of the shared libraries.
Re:DLL vs static libs (Score:3, Insightful)
Think about it: if there is a problem with, say, the resolving library which means your machine is insecure, with static linking you have to rebuild everything -- and you also have to work out what is statically linked with it...
The key thing, however, is having a rock-solid library interface with no "special" features which require version XX of YY. The incompatibilities only start happening when the interface subtly changes. Some programs get bitten and not others, depending on which parts of the interface they use...
James
Re:DLL vs static libs (Score:5, Insightful)
Each static library you use in your code adds to the overall program size, as you've stated. Thus increasing load time when you run the app. A 4MB exe loads a lot faster than a 400MB exe. (At least when the bulk of the difference is actual code/runtime-data, rather than generic data tacked on to the end of the file.) When you have DLLs, the OS will reuse already loaded DLLs for your app, and will keep newly loaded DLLs in memory for a while even after the app shuts down. This tremendously speeds up application loading.
Plus it allows for version upgrades in the DLLs without requiring recompiles of the applications. Of course, this feature really isn't used in Windows: since MS can't seem to make an application run according to specs, most developers work around and sometimes even depend on the bugs that are in MS's code, thus creating DLL-version-dependent apps. If MS goes back and fixes the broken DLLs, they will break a bunch of third-party apps in the process. Probably some of their own as well.
Plus there is security. Applications do not have the same level of access as DLLs. In fact, if you want to do some low-level hardware access in Windows, you MUST use a DLL as a thunking layer, since you can't access the hardware from code within a regular process.
There are other reasons, but I'm getting lazy, so I'll let someone else finish if they feel like it is needed.
In short, there are plenty of reasons to use dynamically loaded libraries. But the generally poor coding practices that occur on the Windows platform pretty much ruins almost all of them. It's actually pretty sad. Windows had a lot of potential to become a great OS. Too many fuck ups and shortcuts have wrecked that chance. Fixing it would require an entirely new code base. One that doesn't have native ABI support for old apps. Which, of course, MS will never do. At least there's open source software to do things right.
Re:DLL vs static libs (Score:3)
Say you're releasing a suite of programs. These programs are not necessarily bundled - some people might only need 1 program - others might need all 3 of them. All these programs need to use the $foo protocol. So you write a DLL for implementing the $foo protocol, and then all 3 of your apps use it. This is a good thing - say a security flaw is discovered - patch the DLL and send it out there. You don't have to patch each application.
The real issue is the poor use of DLLs and lack of coordination between developers and Microsoft. How many copies of the Visual Basic Runtime DLLs do you have on your system? (VBRUN*.DLL) What about the MS Visual C Runtime? (MSVCRT*.DLL). Or the Microsoft Foundation Classes (MFC*.DLL)? Granted there is some version skew in these DLLs, but I found three identical copies of MSVCRT.DLL - one from MGI PhotoSuite, one from WinVNC, and one from Adaptec Easy CD Creator. And there are lots more identical ones too. I think what Microsoft is doing is a good idea, although the whole problem could have been avoided with better developers and smarter installer programs...
Re:DLL vs static libs (Score:3, Informative)
It also helps reduce application start-up times on MS platforms. In the case of Windows, they don't generally use load-on-demand as one knows it from the Unix world. Instead, they load all references into the page file and then demand-load from the page file as needed. This means shorter start times for applications which share DLLs, as they only have to be loaded once (into the page file).
Re:DLL vs static libs (Score:3, Informative)
Really most UN*X systems (in particular those which are glibc based) already implement Microsoft's "new" scheme. For example, in /usr/lib/:
libgtkhtml.so -> libgtkhtml.so.20
libgtkhtml.so.20 -> libgtkhtml.so.20.1.3
libgtkhtml.so.20.1.3 (this is an actual file)
libgtkhtml.so.19 -> libgtkhtml.so.19.0.1
libgtkhtml.so.19.0.1 (this is an actual file)
Note that this scheme can create a "unique identifier", allowing an application to use a particular function via "libname.particular.version" and the name of the function. If an application just wants to use libgtkhtml but doesn't care what version, it will use "/usr/lib/libgtkhtml.so" and will get the most current version. If the application requires version 19 of libgtkhtml, it would use "/usr/lib/libgtkhtml.so.19" and get the latest version, libgtkhtml.so.19.0.1. If an application wanted version 20.1.3, it would merely use "/usr/lib/libgtkhtml.so.20.1.3". At any rate, you get the idea.
It amuses me every time Microsoft comes out with a "new" twenty year old technology (such as say, symlinks). It's a move in the right direction but I think they lack the humility to say that they've been rejecting the right answer all along.
Re:DLL vs static libs (Score:3, Informative)
Untrue.
The permissions of symbolic links are not used.
From chmod(1),
AFAIK, symbolic links aren't actually changed, only created and deleted.
In order to do that, you need write permission on the directory.
HTH.
More bloat (Score:4, Insightful)
I'm sure the hardware manufacturers will be pleased, as usual.
Alternative? (Score:5, Insightful)
One could argue that this is better than having five or six broken applications trying to use the wrong version of a DLL.
Re:More bloat (Score:5, Informative)
- Unless a DLL is rebased, it will likely have conflicting load addresses with other processes and have to be relocated to a different address anyway.
- Only the static, unwritten pages of a DLL can be shared across processes. For DLLs consuming a lot of memory, it's usually data, not code.
- DLL sharing can cause application slowdown because of page faults for copy-on-write and the page-mapping setup that occurs initially. My loader [thinstall.com], which doesn't do page sharing, usually loads DLLs much faster than Windows.
Jonathan
Windows File Protection (Score:5, Informative)
http://www.serverwatch.com/tutorials/article.php/1548951 [serverwatch.com]
hrm (Score:2)
Fast as a bullet! (Score:2, Funny)
W2003 - W95 = 8 years to have that brilliant idea!
Welcome to VMS (Score:5, Insightful)
In other news, Microsoft invents a journaling file system to prevent data loss.. oh, wait..
Re:Welcome to VMS (Score:3, Insightful)
Though this could bring slower performance, I think it will be negligible to most Windows users.
Re:Welcome to VMS (Score:4, Insightful)
Re:Welcome to VMS (Score:5, Informative)
In all fairness NTFS had journaling years before any linux filesystem did.
Security Issues vs. Api Versions (Score:5, Insightful)
Re:Security Issues vs. Api Versions (Score:5, Insightful)
And the way most companies work, this upgrade package may also include changes to the application itself instead of being more focused as most here on Slashdot would prefer. The DLL fix may be the "main course", but you'll get a side of new bugs in the app, and possibly an upgrade fee as a cover charge.
It's a tradeoff. (Score:3, Insightful)
Maybe some policy would help a bit. I'm thinking that things like services, especially public net-facing ones, should be forced to use the latest DLL whether it breaks anything or not. If it's compatible, the service stays up. If it's not, the service dies and doesn't make a public nuisance of itself. Reporting tools would help too: if it were easy to get a list of which apps were using which DLL, it would be possible to intelligently manage the situation. For that matter, make apps use the newest by default and fall back to the oldest only if that doesn't work and it isn't a public-facing service.
Yes, this can cause problems, but if they include the right tools and a sane policy, they're manageable. This isn't intrinsically new; VMS did something like this, and Unix admins have been doing it manually for years. MS just wants to automate it.
Umm Security. (Score:5, Insightful)
In other news.... (Score:3, Funny)
Christ, if you are going to do this, why not create a recompiler that bundles the executable and all of its referenced DLLs into one EXE and be done with it.
As a nice side effect it'd make it a hell of a lot easier to move programs around your disk and between machines... Oh, ok, now I get it.
-josh
Re:In other news.... (Score:4, Informative)
Check it out:
http://thinstall.com [thinstall.com]
Part of .NET, not Windows (Score:5, Informative)
It isn't really DLLs either, but rather .NET assemblies.
This is a great feature of .NET.
Re:Part of .NET, not Windows (Score:5, Insightful)
The feature was present in .NET from the start.
I've always wondered, though... why did MS choose to keep the extension ".DLL" for .NET assemblies?
Uh.... (Score:3, Interesting)
Garbage collection (Score:5, Interesting)
In addition, the OS should be intelligent enough to know when an EXE's been manually deleted (thrown in the Recycle Bin). The current practice of placing all uninstallation responsibilities on the vendor tends to result in DLL buildup when the uninstaller doesn't work right or isn't provided (not uncommon). There should be a unified process for adding a DLL that links it to the executable file that requires it.
Re:Garbage collection (Score:3, Interesting)
There is no way for the OS to know which DLLs an application might use - since DLLs are "Dynamically" loaded. The application may only load the DLL in rare cases, or maybe the application is run for the first time 1 year after it's installed.
To further complicate the problem, software vendors themselves cannot be 100% sure that no other applications are going to use DLLs they installed, so to be safe they leave them behind on uninstall. So you are stuck with a growing system32 directory. The only way to garbage collect is to refresh the entire machine.
I have been working on a virtual machine technology [thinstall.com] that allows applications to run in a semi-isolated environment (DLLs, files, and registry keys are not shared with the system). So they have zero impact on other applications, and other application installs cannot cause them to fail. Also, uninstall becomes a simple "del program.exe".
Again? (Score:4, Insightful)
One of the really annoying things about Microsoft is they always promise to fix something in the next version. It's always "next time" with them, but the problems never seem to go away.
Re:Again? (Score:3, Insightful)
Some time before XP is released, Microsoft claims that .NET will solve these DLL conflicts.
Visual Studio .NET
Some time before Longhorn is released (i.e. now), Microsoft says that it will solve these DLL conflicts in the OS itself, since Longhorn will be their first OS with good .NET integration.
Mac OS X Frameworks? (Score:5, Informative)
Read more about that here [apple.com]. Be sure to read through the section on Framework Versioning [apple.com].
Also note that Mac OS has long done a great job of packaging applications together so that an installer is unnecessary.
Re:Mac OS X Frameworks? (Score:3, Interesting)
I might as well note that this approach has some serious problems as well as the benefit of added simplicity, like:
Note that there is nothing wrong with a 3rd party application install upgrading parts of the OS, it's just that historically Windows has sucked at it. Look at Debian for an example of how this can be done well.
contradiction in terms? (Score:3, Interesting)
Re:contradiction in terms? (Score:3, Informative)
It adds versioning to DLLs -- that's all. Multiple programs can use the same version. If a new version of the DLL is added (even if it has the same name), the system will direct old applications to the old DLL and new ones to the new DLL.
See this [microsoft.com]
Kinda makes you wonder... (Score:3, Interesting)
It'll be interesting to see how this adds to the bloat; I imagine it won't take long for the average user to amass quite a number of these things, mostly doing the same job!
There must be a better solution than this!
What about patches and security fixes? (Score:5, Interesting)
According to Microsoft's Ivo Salmre, quoted in the article: "When a .Net component is installed onto the machine, the Global Assembly Cache looks at its version, its public key, its language information and creates a strong name for the component."
In a few cases in the past, backwards compatibility has been (slightly) broken by service packs and security fixes. How will they deal with that? Presumably, the public key of a library can be affected by a patch. If an application uses the strong binding to request a specific version of a DLL, does that mean that it will keep its own copy of the DLL without the patches? Or will it have to deal with potential incompatibilities introduced by the patches? How "strong" will this binding be?
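The "strong name" in the quote is just a richer identity than a file name; a toy sketch of the idea, with made-up values (this mimics the textual display-name form, not the real GAC internals):

```python
# Sketch: a component's identity is its name PLUS version, culture, and
# public key token -- not just "FOO.DLL". All values are hypothetical.

def strong_name(name, version, culture, public_key_token):
    return (f"{name}, Version={version}, Culture={culture}, "
            f"PublicKeyToken={public_key_token}")

a = strong_name("FOO", "1.0.0.0", "neutral", "32ab4ba45e0a69a1")
b = strong_name("FOO", "2.0.0.0", "neutral", "32ab4ba45e0a69a1")

# Same file name on disk, two distinct identities in the cache:
print(a)
print(a != b)  # -> True
```

Which is exactly why the question above matters: if a patch changes the version (or the signing key) of a library, an app bound to the old strong name will not silently pick up the patched one unless something redirects it.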
And by the way, this idea looks rather similar to the usual UNIX shared libraries that allow an application to bind to libpng.so (version doesn't matter), libpng.so.1 (version 1 required) or libpng.so.1.0.89 (specific version and patchlevel required). The proposed system for DLLs does not seem to be very different from that.
DLLs (Score:4, Interesting)
Re:DLLs (Score:3, Interesting)
You're right. It would. The trouble is, Microsoft's API documentation is a closely guarded secret, and if you've ever done Windows development using resources from MSDN, you'd know that it's utterly pathetic. All the most useful library calls are kept hidden from you under penalty of DMCA I imagine.
So, what happens if you either A) have no access to powerful APIs or B) you have crappy documentation for a crappy set of APIs? Simple! You code it yourself, reinventing the wheel. Dozens upon dozens of application developers for Windows have done this, making changes and additions to what may be standard libraries. One vendor creates a library with a call foo(bar b), and distributes the library. Another makes a call like bar(foo f) and distributes the library under the same moniker. You can't just merge them... so one version has got to go.
And of course, as for backwards compatibility, since when has Microsoft ever gotten that right? I've run into so many issues with a Windows application not working on XP because I developed it on 2000. A common solution to this problem is to distribute the version of a library that came with 2000 and overwrite whatever existed on the XP box with it.
All in all, Windows is a pathetic, shoddy state of affairs. I ask myself every day: "how the hell does this criminal organization con so many sheeple into using their products!?"
IANAP, But, Uh, WTF?! (Score:3, Insightful)
For one, Mac OS X uses bundles. Each application has its code as well as libraries all wrapped up in a single package. Only that app uses the libraries there. Clean, simple.
I doubt Microsoft's solution will solve the problem because their operating systems rarely show the cojones to stop an errant application from taking advantage of "features" placed within Windows that are self-compromising (e.g., Visual Basic Scripts, ActiveX). Some programming yahoo would just write something to override Microsoft's effort.
Windows could use a DLL manager similar to the old Mac OS Extensions Manager. Actually giving the DLLs easily recognizable names and clear version numbers wouldn't hurt either.
Aw, fuck--just chuck the damned thing and run a *nix, for cryin' out loud.
That is in fact static linking, (Score:3, Insightful)
Sharing a library has one disadvantage: the interface of the library must not change, otherwise applications using it will crash. When an interface changes, you have to update the version. Now, you can do this in several ways; most likely this is doable by using a filename version scheme, or as in
The central point where you register shared dll's shouldn't be based on a directory though, but a central repository which holds ID's that refer to files on disk. This is implemented in COM somewhat: COM objects are stored in DLL's mostly and when you register a COM dll, its COM Objects are registered in the registry: each CLSID is stored with the physical file where to find the object. If you now store the files locally with the app, as it should be, you can register the com dll's and each application using a COM object with a given CLSID that is stored in the local stored dll can use them.
The problem arises when you install 2 applications which use the same DLL with the same objects, only application A uses v1.0 and application B uses v2.0. v2.0 of the library has all the objects of v1.0 but also newer ones. You first install A. All dll's are stored locally. Then you install B. A still works; it will probably use B's dll, because B registered all the objects against its locally stored DLL. When you UNinstall B, A will not work anymore, unless you keep the dll with the objects around. Most people don't ("Hey, the uninstaller left some dll files behind!" *executing del command*). That's DLL hell.
What's best is thus a central repository (be it the GAC or the registry) plus a central store which allows multiple versions of a DLL to be kept side by side. I'm pretty sure that's where MS is heading, and neither MacOS X nor any Unix does this.
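The "central repository of IDs" idea can be sketched as a toy in shell (purely illustrative; the store path, ID format, and layout here are made up, not how COM or the GAC actually store things):

```shell
# Toy sketch of an ID-keyed library store: files live under
# <store>/<unique-id>/<version>/ and callers resolve by ID + version
# instead of by bare filename. (Illustrative only; not the real
# COM/GAC layout.)
STORE=/tmp/dllstore

register() {  # register <id> <version> <file>
  mkdir -p "$STORE/$1/$2"
  cp "$3" "$STORE/$1/$2/"
}

resolve() {   # resolve <id> <version> -> path on stdout
  ls "$STORE/$1/$2/" | head -n1 | sed "s|^|$STORE/$1/$2/|"
}

# Two apps can now hold different versions side by side:
echo v1 > /tmp/foo.dll
register CLSID-1234 1.0 /tmp/foo.dll
echo v2 > /tmp/foo.dll
register CLSID-1234 2.0 /tmp/foo.dll

cat "$(resolve CLSID-1234 1.0)"   # -> v1
cat "$(resolve CLSID-1234 2.0)"   # -> v2
```

Uninstalling app B here just removes one versioned directory; app A's pinned version is untouched, which is exactly the DLL-hell scenario above avoided.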
So let me get this straight... (Score:5, Insightful)
One result is that from one machine to the next, not only will you not be sure what applications are using which DLLs, you will also have applications that use radically different identifiers for accessing their libraries.
This eliminates library confusion... how? I can't wait to have to troubleshoot it. Here's another solution Microsoft: document your standard libraries so that idiot application developers don't feel they need to re-invent the wheel and dump custom libraries all over the place.
Of course, the rest of us will continue using open source software.
Contradiction of replies (Score:5, Funny)
Reply 1- "This is a horrible idea! Look at all the RAM/disk space this is going to use. M$ programmers are idiots!"
Reply 2- "VMS/Apple/*enter slanted fav OS here* already does this! This was a good idea when it was done 10 years ago by ____
Reply 3- An idea like this is so stupid, it will NEVER work right.
So, it's a stupid idea that will never work EXCEPT it has already been done BUT will take up too many resources UNLESS it is done by our fav OS AND then that's okay
*grin*
Unsolvable? (Score:5, Interesting)
The point of shared libraries is that you CAN upgrade one single library and have many applications "automatically" inherit the changes. This is how you can update a file dialog, for instance, without recompiling every single one of your GUI applications. This is a Good Thing. The question then becomes "why is this shit breaking so much". The right solution is a proper combination of carefully-followed deprecation and backwards compatibility rules (preferably combined with some sort of standard version naming convention), and the ability for applications to explicitly choose the library version they want (or better yet, a runtime configuration directive that can be set by the user or administrator) in the cases where *it is known that the new shared library is not backwards compatible*.
This is NOT about DLLs! (Score:4, Informative)
A
The only aspect of it that has to do with DLLs is that
There's a Bigger Picture Here!! (Score:5, Interesting)
OS/2 - as an example only - had a much better scheme where o/s stuff lived in its own space and the stuff you built/bought lived in its own space (and never the twain shall meet). On top of that, they implemented the idea of a LIBPATH env variable so that you could set the path OS/2 would take when looking for DLLs. Consequently, screwups were minimized, versioning was not an issue, built/bought software could be maintained easily, and (wait for it...) you could upgrade the o/s without blowing away all your apps!
Can't wait to hear what MS 'discovers' next!
Only applies to .NET apps (Score:3, Informative)
Microsoft: get a clue: just do what Linux does (Score:3, Informative)
The real solution is to indicate whether a DLL is a bug-fix release or whether it represents a significant and incompatible change to the APIs. You know, like Linux major/minor dynamic library versioning, for example.
Static is the way to go (Score:5, Insightful)
First, what are the advantages of DLLs?
In Unix, when you have two instances of an application running, say, vi, the executable code between the two is automatically shared. The shared library gains you nothing there. To actually save memory, you need to use the same shared library from two different applications at the same time. For example, libc might be used by vi and cc.
However, if you compile statically, you bind in only the routines that are needed. For shared libraries, you need to have all routines available, since you don't know which of them will be used. Now, your virtual memory system may notice that a shared library page isn't used, and page it out. Yes, this costs additional execution time. The upshot is that you save memory only when enough different programs use the same shared library to overcome the overhead. I claim that this happens with libc, libX11, and not a whole lot else.
Less Disk Footprint
If you have 50 programs that use the same shared library, you can save some disk space because that library code does not need to be duplicated that many times. However, shared libraries need to carry the symbol information required to perform the dynamic binding. The savings isn't that much.
In the old days, when an entire Unix distribution fit on a 150 MB tape, the libc shared library savings amounted to about 30%. You could get more reduction in size by using compression.
In fact, programs could be compressed on disk. The loader could decompress the image as it reads it into RAM. For slow disks, this may be faster than loading the uncompressed version into RAM. The down side here is that you then may not be able to use the original file on disk for virtual memory paging.
In any case, it's getting hard to get a disk drive under 20 GB. 30% overhead reduction for the most common shared library doesn't amount to much.
Global Security Fixes
So, your libzlib.so.5 has a bug. You whip up a quick fix, create a new libzlib.so.5, and drop it into your system. You've just fixed all of your libzlib dependent programs, right? In fact, you fixed programs you didn't even know used libzlib. You may also have broken programs that you didn't know use libzlib. And, short of testing every program on your system, you don't know. The more complex the patch, the more likely you are to have broken many things.
Quick. What is a utility which will tell you all the shared libraries that an application uses?
Use of third party binaries
Third party binaries can just as easily be distributed in source form or in a library that is statically bindable. Static binding is preferable, since you are unlikely to use a large fraction of a kitchen sink shared library - where the authors have no idea how the programmers will use it. Source is preferable, since the documentation rarely specifies enough semantic detail to allow proper use.
Plug ins
OK. Your application is Apache, and you want some real flexibility. If Apache is compiled so that modules can be loaded at run time, then the administrator can add the new module and turn on its use in the configuration. This doesn't save any RAM or disk, but it may allow the admin to change a line of config, restart the web server, and start using some new feature.
For Apache, the admin can also recompile with the new module compiled statically. I've done it both ways. My measurements show a small run time advantage to static compilation.
Granted, if you can't recompile IIS, then DLLs will give you the same flexibility in exchange for a small performance penalty.
The Dark Side of Shared Libraries
If you compile your application statically, then upgrade your OS, you can copy the old application to the new OS, and it just runs.
If your app has shared libraries, you have to track them down on your old OS, and copy them to your new OS. If you make a mistake, and copy your old libc.so over your new one, you run the risk of trashing every program on your new system. Brilliant.
Take netscape as an example. It comes installed in its own /usr/local/lib directory tree. In /usr/local/bin, netscape is a script which sets up the shared library search path to include the libraries that netscape needs, then runs the binary. This introduces script overhead and shell dependencies on a complicated package. And, when you upgrade your OS, you still need to find the old libc.so and copy it forward.
RPMs
Many seem to think that RPMs solve all these problems. However, many packages have bugs in their dependencies, etc. Many RPMs use different versions of the same shared libraries. I find that I have to override the dependencies to get stuff to install. Often, the required package IS installed. Not just once in a while. Much of the time. The difference between theory and practice is that, in theory, they are the same.
Conclusions
Shared libraries seldom save RAM or disk space. The problem with using them to fix bugs globally is that you don't know what you fixed, or even whether you broke some things. Third party binaries should invariably be statically linked. In an open source environment, plug-ins are not strictly needed. Shared libraries make OS upgrades more painful.
So, what I'd like is a Linux distribution with no shared libraries. The compiler, gcc, would be configured to compile statically by default. Then, after some years of running the system in production, and after adding hundreds of applications to it, I'd be able to upgrade to a new distribution without having to recompile or do the shared library search.
Re:Static is the way to go (Score:3, Informative)
Quick. What is a utility which will tell you all the shared libraries that an application uses?
ldd. Run it on a binary and it prints every shared library the dynamic linker will pull in:
librt.so.1 =>
libc.so.6 =>
libpthread.so.0 =>
Don't break DLL's/components then (Score:3, Insightful)
New major releases should be considered a completely different DLL/component, since it conceivably has a different API or changes its behavior in some incompatible way.
It seems to me that DLL's/components need to be treated as self-contained applications. They need to go through a rigorous testing and QA cycle (except that they don't generally expose anything directly to the users, but to other applications), and need to be installed as if they were their own application. Windows applications that have dependencies on DLL's can, during installation, tell the OS which DLL's they need and what the minimum version should be.
Bundle these with the application if you need to, but to suggest that DLL's/components need to be kept at the same *minor* version to avoid breaking applications indicates a bad problem with how you build and test DLL's. I'd rather fix this problem than introduce this layer of version matching.
Hmmm, this sounds like fun... (Score:5, Insightful)
As for the idea of "Strong Binding", I wonder what Billy G. expects to accomplish by adding yet more poorly designed, poorly documented LIBs to the programming mess that Windows has evolved into. On top of that, I wonder why I would need to save EVERY SINGLE VERSION of a DLL that makes it to release...
Version tracking will become a nightmare.
Consider:
+User installs program COOL_PROGRAM.EXE
-COOL_PROGRAM uses MS_COOLNESS.DLL
+User gets an update to MS_COOLNESS.DLL: MS_COOLNESS_V2.DLL
-The fix in V2 repairs a buffer overflow in a function that COOL_PROGRAM used from COOLNESS.DLL.
Question : Does the installer for V2 know that COOL_PROGRAM is dependent on it? If this is the case, Billy G. is gonna have his hands full trying to keep track of what goes where with third party devs.
If not, perhaps COOL_PROGRAM will go by default to the newest version of COOLNESS.DLL. Ok, now Billy will likely contend with tracking and modifying functions that have previously been used in highly specialized ways for security/system-critical functionality that Windows does not provide either by accident or by intention. So NOW third-party devs developing well-organized and functional code/programs are forced to keep up with the madness of Windows development to save space. Hmmm... Guess it got the better of the buffer overflow this time. Or maybe they introduced a new bug into the system {par for the course with MS}...
Better yet, how about people developing security/system-critical environments use their own code to avoid this whole mess? Ok, now you don't need DLLs, do you? How about 3-5 times as many? Wait for the next Windows release? So the effort YOU made during XP to keep up with DLLs and other updates is pointless, right? Or XP+1? The style of MS defines itself....
Security/system critical programs?...
Thats only one side?...
Ok. Try this:
Graphics, network communication, encryption, file editing, database editing? Or maybe drivers, file converters, scripts, inter-app communication, diagnostics?
The list goes on. The problems generated by and complications arising from this framework are not worth the hassle.
Instead of building a system where things get more complicated, I would recommend a redesign of the system itself. Current and past states of instability/insecurity are more than I care to witness again. Billy has enough money to sit around daydreaming for the rest of his days while still paying his programmers for doing nothing but daydreaming themselves for the rest of theirs... Perhaps they could get up off their butts and design a system from the ground up that is easy to use, safe, fast, and reliable for users old and new... Logical?
I love C programming. C++ and Java are lots of fun. But IF you want something done right the first time, assembly and careful thought is the only answer...
S-()-u-|-s-!-|)-E
Just put them in with the software! (Score:4, Interesting)
here (Score:4, Informative)
When a program installs a "shared" DLL, the assembly manager looks at the DLL version. One of three things will happen:
1. The DLL does not exist in the assembly cache - it is added.
2. The DLL exists, but all other instances of it are a different major/minor revision. (X.Y.0.0) In this case, the DLL is added to the assembly cache as a separate version.
3. The DLL exists in the cache, and the major/minor versions are the same. In this case, if the installing DLL has a newer revision (0.0.X.Y), then it will overwrite the old DLL. Otherwise, it is thrown away.
When a program executes, its manifest specifies what major/minor version of the DLL it needs, and the assembly cache will fetch it. HOWEVER, bug fixes, etc. are supposed to be changes to the revision numbers only, so if a bug-fixed version of the DLL is installed, the app will use that version.
The assembly cache also keeps track of what set of DLLs go together. If version 1.2.7.X of FOO.DLL needs to also be run with 1.2.7.X of BAR.DLL, then the assembly cache can make sure a program never uses a mismatch, which has been a MAJOR cause of difficult-to-track instability over the years.
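The three install rules above can be sketched as a tiny decision function. This is only a sketch of the described policy, not the real assembly-manager code, and the version strings are made up:

```shell
# Sketch of the install policy described above. Versions are
# "major.minor.build.revision"; same major.minor means the newer
# build/revision wins, different major.minor means side-by-side.
# (Illustrative only, not the actual assembly-cache implementation.)
gac_action() {  # gac_action <installed-version> <incoming-version>
  old_mm=$(echo "$1" | cut -d. -f1-2)
  new_mm=$(echo "$2" | cut -d. -f1-2)
  if [ "$old_mm" != "$new_mm" ]; then
    echo "add side-by-side"
  else
    old_rev=$(echo "$1" | cut -d. -f3-4)
    new_rev=$(echo "$2" | cut -d. -f3-4)
    # highest build.revision wins within a major.minor line
    if [ "$old_rev" != "$new_rev" ] && \
       [ "$(printf '%s\n%s\n' "$old_rev" "$new_rev" | sort -V | tail -n1)" = "$new_rev" ]; then
      echo "overwrite"
    else
      echo "discard"
    fi
  fi
}

a=$(gac_action 1.2.0.0 1.3.0.0); echo "$a"   # -> add side-by-side
b=$(gac_action 1.2.7.1 1.2.7.5); echo "$b"   # -> overwrite
c=$(gac_action 1.2.7.5 1.2.7.1); echo "$c"   # -> discard
```

The key design point is that "overwrite" only ever happens within one major.minor line, where the interface is supposed to be unchanged.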
The "new" rules for where to install DLLs:
1. If you have a DLL only used by your application, install it in your application's folder.
2. If you have a suite or many apps that work together and use the same DLL, install it into program files\common\yourname.
3. ONLY install DLLs into the System folder if they are very very widely used, or are actual system objects or libraries. (I.E. your app needs a newer version of the microsoft common dialog runtime. In that case, you ship the MSM which has the latest version of common dialog and related libs that are all known to work together, for EACH version of windows. The Windows Installer knows how to read the MSM and pick the appropriate set of files for the current OS/service pack level you are on. That way, developers running Windows 2000 don't b0rk a Win9x user by shipping the w2k libraries.)
3a. An even easier way of handling things is to write your app for a specific service pack level on each OS (or possibly hotfix if a bug was fixed that is affecting your app.) In this way, you just tell your users "you need service pack X on OS Y, or service pack Z on OS A" to run the app.
Jeez. Grow a brain. (Score:3, Insightful)
I can't tell you what an improvement assemblies are compared to a "component"/COM object. You'd have to build your DLL, then regsvr32 it into the system.. and if you ever needed to update the DLL, you'd have to stop your service/app that uses the DLL, then regsvr32 -u it... and then overwrite the existing file, then regsvr32 it again, and start your service/program back up..
And now? You just overwrite the DLL.
If you have
It's pretty cool stuff.
Re:In other news (Score:2)
Re:In other news (Score:4, Informative)
look in
These point to the actual library.
As long as a library's API doesn't change between major versions (as it should), there is no problem.
Jeroen
Re:In other news (Score:3, Insightful)
As long as a library's API doesn't change between major versions (as it should), there is no problem.
Unless the semantics of an API change subtly from one version of the DLL to the next. This is sometimes done to fix bugs, security holes, etc. in one version of the DLL. You wouldn't believe how many proprietary programs in practice rely on undocumented behaviors of specific versions of libraries.
Re:In other news (Score:5, Informative)
Indeed most libraries have subversions, but most apps just link to the major version. When an app insists it needs version 6.3.2.4.33 it gets nasty..
Stop spreading FUD. You can access any library you want with LD_PRELOAD. So if libfoo is at 6.3.4 and you have a 6.3.2.4.33 on the system that your app absolutely requires, a simple
will do the trick. In fact, I do this specifically for StarOffice so I can use my local copy of freetype2 with the bytecode hinter turned on instead of the version which comes with StarOffice.
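The preload trick looks roughly like this (the library path and binary name below are hypothetical stand-ins, and a real StarOffice setup will differ):

```shell
# Typical use (hypothetical path and binary):
#   LD_PRELOAD=/opt/oldlibs/libfoo.so.6.3.2.4.33 ./myapp
#
# With a real .so, its symbols would shadow the system copies.
# The mechanism itself is just an environment variable the dynamic
# linker reads at process start; it is scoped per command like any
# other environment variable:
out=$(LD_PRELOAD=/opt/oldlibs/libfoo.so.6.3.2.4.33 env | grep '^LD_PRELOAD=')
echo "$out"   # -> LD_PRELOAD=/opt/oldlibs/libfoo.so.6.3.2.4.33
```

Note that if the preloaded file doesn't exist, ld.so just prints a warning to stderr and carries on with the normal search path.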
Re:In other news (Score:3)
If an application _requires_ a certain sub version to work (by that I mean version == x.y.z, not version >= x.y.z), then there's something wrong with either the design of the application, or the design of the library...
Re:In other news (Score:5, Informative)
I teach people Linux, and my biggest complaint with RPM is that the user must resolve dependencies himself.
E.g. we had made an installation, but left out the development tools. When you try to install gcc, it says which packages are missing, but not where you can find them. You have to dig them up yourself from the CD-ROMs, and sometimes you have to look on all of them.
I do not have any problems with the RPM system itself, but why has Red Hat still not implemented a system like Debian's apt? After the installation it asks for the CD-ROMs, scans them, and builds a database of which packages reside where.
So, in the case of gcc, it would say which packages are missing, select them automatically, load the needed packages onto the disk, and ask for the appropriate CD-ROM whenever necessary.
This is much more friendly than the stock Red Hat approach. Oh, I know there are tools to do that with Red Hat, but you still have to install them yourself. It should come out of the box.
Petzold is your guy... (Score:3, Informative)
Re:How about versions in the file name (Score:2)
What MS really needs is a dependency system that would eliminate the need for software vendors to include system DLLs with their software.
As it sits I worry this system will prevent upgrades.
Re:Slashdot is getting slow, lazy? (Score:5, Insightful)
What's the problem?
This is not a "Breaking News" site, it's a community discussion board. One doesn't come here for "news," per se, but to read what like-minded people in the "geek community" think about that news.
You're getting upset because your dog doesn't 'meow.'
Re:Old programs vs. new programs (Score:3, Insightful)
If you want more information on how this is going to work, simply look into how .NET handles assemblies:
Global Assembly Cache (GAC):
The GAC can be used to register a dll on a system-wide basis and allow other programs on the machine to link to that dll. It handles the different versions of each dll and how they are configured. BUT you do not have to use the GAC to use a dll within a .NET application.
The trade-off of this system is that you have more files on the filesystem which need to be managed; it has its own drawbacks. But to anyone who's ever f'd a win32 machine because a system dll got replaced when they installed that dvd player app from 1997, and it just happened to replace a critical system dll that was just updated in the last service pack, this is a godsend.
Re:All Programs Should be Self-Contained (Score:4, Interesting)
Well, I was going to say this earlier, but the network hung. All I can do now is expound on this...
The largest EXE on my box is a little over 3 megs (it's AbiWord, by the way). The largest DLL on my box is a little over 5 megs (it's the bulk of the image loader/editor that came with a cheapo digicam I bought). Let's be really, really conservative and say that AbiWord decides to load that DLL. Yeah, I know it would never happen, but this is just a worst-case scenario. That's 8 megs resident in memory. Now, how many windows do I typically have open? 5 or 6, and many times it's the same app like IE or MSVC. Even under a worst-case scenario like 8 separate huge apps open, that's 64 megs. Now of course this worst-case scenario is an extreme. I wager a more typical scenario with everything self-contained would result in less than 32 megs of code resident in memory. What's 32 megs cost? They don't even sell 32 meg modules most places. A lot of boxes are coming with half a gig, and if you want more you just grab for some loose change and snap it in.
Of course, apps aren't the only thing on the box. The System Information in Windows shows a lot of DLLs loaded by Windows, many of them legacy support. On a box with 128 megs of ram, I sometimes break over 50% resource utilization, but there's no noticeable impact on performance so who cares?
Now, weigh the cost of RAM against all the hours spent putzing with different dynamic library versions.
Plainly, dynamic libraries are a holdover from the days when memory costs and address-space limits were something to think about.
Now, I'm not saying that there aren't circumstances where dynamics are a good idea. For example, it would have been nice if Microsoft had installed MFC DLLs with earlier versions of Windows. I shudder to think of all the bandwidth wasted downloading those.
The "solution" of maintaining different versions of DLLs and giving them unique IDs is almost an admission of defeat that dynamics don't work. It's probably better to think of it as a way to wean people off dynamics, while providing those who still want to use them with the option.
Re:All Programs Should be Self-Contained (Score:5, Insightful)
Quick example. ls is 68k by itself. If you add the size of all libraries it links to, it becomes about 1.7 MB on a typical system. I would say ls is pretty conservative in terms of linking, so I'll pretend everything in
Not only is drive space not a moot point, but this has implications in terms of consistency and interoperability. If applications all used internal versions of GUI libraries, there would be absolutely nothing enforcing any sort of consistency, and complex inter-process communication becomes really difficult due to version mismatches.
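The back-of-envelope number above can be reproduced with a rough, Linux-specific sketch (exact figures depend entirely on the system; on machines without ldd it simply reports 0):

```shell
# Rough, Linux-specific sketch: sum the on-disk sizes of every
# shared library a binary maps in. Numbers will vary per system.
lib_bytes() {
  ldd "$1" 2>/dev/null | awk '/=>/ { print $3 }' |
  while read -r lib; do
    [ -f "$lib" ] && stat -c %s "$lib"
  done |
  awk '{ s += $1 } END { print s + 0 }'
}

n=$(lib_bytes /bin/ls)
echo "shared libraries mapped by /bin/ls: $n bytes"
```

For a statically linked binary the same function prints 0, which is the comparison the grandparent post is making.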
Re:Had to have a snide remark, didn't you. (Score:4, Informative)