A question on dynamic linking / version-changing libraries
Michael
keybounce at gmail.com
Sun Mar 5 16:43:51 UTC 2017
So here's a really basic question: why dynamic linking at all? And why dynamic linking of libraries by bare library name, as opposed to linking by library name plus API version?
I understand that, way, way back in the dawn of time, disks were small and memory was small, so reusing code on disk was critical to making systems fit on the drive, and reusing code in memory was critical to making them fit in RAM.
But now?
And even if you have dynamic linking, why does a library whose API has changed keep the same name and get linked the same way as before?
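To make that concrete: the dynamic loader is perfectly happy to resolve a fully versioned filename if you ask for one; it's the convention of linking against the bare name that throws the version information away. Here is a minimal sketch -- the filenames are just assumptions about a typical libwebp install, not anything MacPorts guarantees:

#include <dlfcn.h>
#include <stdio.h>

/* Try to load one library name and report what the loader says. */
static void try_load(const char *name) {
    void *handle = dlopen(name, RTLD_NOW);
    if (handle) {
        printf("%-18s loaded\n", name);
        dlclose(handle);
    } else {
        printf("%-18s failed: %s\n", name, dlerror());
    }
}

int main(void) {
    /* Bare name: resolves to whatever the libwebp.dylib symlink points at today. */
    try_load("libwebp.dylib");
    /* Versioned name: only succeeds if that major version is actually installed. */
    try_load("libwebp.6.dylib");
    return 0;
}

(On macOS this builds with a plain "cc", since dlopen lives in libSystem; on Linux you'd add -ldl.)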
I mean, it's not like Mac OS permits app-specific versions of frameworks to be shipped with the app, so that as the libraries on the system change, the known-good frameworks the app uses stay the same. Oh wait, it does exactly that.
With frameworks, an app can ship specific versions with it, duplicated on disk and not shared in memory. If those frameworks have bugs that later get fixed, the app doesn't automatically get to use the updated frameworks installed on the system.
So why do libraries *still* behave differently?
Why does MacPorts generate libraries that follow the 1970s-era linking strategy?
Is it a limitation of the underlying dynamic library linking system in the OS?
Is it a case of "Apple never updated how their system works, so we just duplicate the same design flaw that Apple uses"?
Is it a case of "Fixing this behavior in Darwin would break all Linux compatibility"? If so, why not send those fixes back upstream and fix Linux at the same time?
This issue was discussed last month, with the key example being webp and ImageMagick. This month, it's libarchive and cmake. Next month it will probably be something else -- someone mentioned that a relatively simple fix to something else (icu, I think) could not be pushed until everything that used it was updated as well. Heck, Google points to this very issue as the reason they use a single monolithic source tree rather than separate, isolated libraries -- and in reading their paper, I realized that the whole argument against single monolithic systems boils down to "right now, in 1980, we don't have the tools or the ability to maintain such a system." Google basically had to build those tools (heck, even modern desktop IDEs like Eclipse do a really good job for the 95% case).
Why should libraries for webp version 5.2 and webp version 6 occupy the same filename and location on disk?
Why should programs that want different versions of webp be impossible to install on the same system?
Why should a program not be built, by default, against its own copies of the libraries it needs, with shared libraries only when explicitly requested?
Why does it seem like we are 30+ years out of date on linking technology/behavior/systems?
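On the webp question above specifically: about the only defense an application has today is to check at run time which library the loader actually handed it. A small sketch, assuming only libwebp's public header and its WebPGetDecoderVersion() call:

#include <stdio.h>
#include <webp/decode.h>

int main(void) {
    /* WebPGetDecoderVersion() packs the version as 0x00MMmmrr (major/minor/rev). */
    int v = WebPGetDecoderVersion();
    printf("compiled against decoder ABI 0x%04x\n", WEBP_DECODER_ABI_VERSION);
    printf("running with libwebp %d.%d.%d\n",
           (v >> 16) & 0xff, (v >> 8) & 0xff, v & 0xff);
    return 0;
}

Of course, that only detects a mismatch after the fact; it doesn't let the two versions coexist, which is the actual question.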
---
Entertaining Minecraft videos
http://YouTube.com/keybounce