1. 5

    Are shared libraries really needed these days? To save space? My laptop has 2 TB of it.

    1. 2

      I’d love for them to go away and us keep everything separate.

      While the ‘disk’ space issue is probably not a problem in many cases (I know I’d rather sacrifice space for duplicate code than have to deal with dependency hell) there are likely more issues to consider.

      I was going to say that it’s a pain to have to update an identical library in multiple packages when there’s a (security) fix, but it’s common for a fix to break some packages and not others, so you’re left choosing between some broken packages and an unapplied fix for a vulnerability that may or may not affect you.

      Being able to update some packages (where a ‘fix’ doesn’t break them) and leave others until later, accepting the lack of fix, seems like a potentially desirable option.

      Are there other reasons for shared libraries to continue existing?

      1. 14

        Sharing library pages between applications? Preloads (mixed bag)? Less shaking the tree at link-time? Ecosystem stability beyond the syscall level?

        FWIW, Linux is an aberration in how much it makes static. Most systems have a hard requirement on dynamically linking system libraries, and unlike Linux, they either have extreme ABI stability (Windows, Solaris, etc.) or change the syscall ABI and require a rebuild anyways (OpenBSD).

        1. 9

          FWIW, Linux is an aberration in how much it makes static. Most systems have a hard requirement on dynamically linking system libraries, and unlike Linux, they either have extreme ABI stability (Windows, Solaris, etc.) or change the syscall ABI and require a rebuild anyways (OpenBSD).

          Or both. Solaris and Windows change(d) the syscall interface regularly – the stable boundary is in the system libraries.

          1. 5

            This would be a blog post I would love to read!

            1. 1

              I’d love to know how much shared library code is actually shared in RAM in desktop systems, servers, containers, etc. It would seem intuitive that a desktop would have plenty of shared code in some large libraries (e.g. those from Qt and KDE) but I suspect there may be less sharing than we might hope.

              LD_PRELOAD? Is it used for something important? I can imagine it might be but I just haven’t noticed it being used.

              Are you referring to compile-time or runtime linking? I seem to remember runtime linking being extremely slow for ‘large’ code under Linux, which meant we had to put hacks in place to make KDE apps appear to launch faster. It was something that only affected C++ code - not C - and I didn’t know how it could be improved. Would static linking make this worse?

              1. 4

                It’s pretty easy to check on a Linux system: just read /proc/<pid>/maps for each process, extract the libraries and count. I just did that on my virtual server (which handles email, web, gopher, etc.). The most commonly used libraries are /lib/ld.so and /lib/tls/libc.so (every process). Out of 118 libraries used, 44 are shared 8 times or fewer, one 10 times, three 11 times, and the rest 21 times or more.
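                In case it’s useful, here’s a rough sketch of that counting in Python (reading the per-process files assumes a Linux /proc; the counting itself is a plain function):

```python
import collections
import glob
import re


def count_mapped_libraries(maps_texts):
    """Count in how many processes each shared object appears,
    given the contents of several /proc/<pid>/maps files."""
    counts = collections.Counter()
    for text in maps_texts:
        libs = set()  # dedupe: one process maps the same .so several times
        for line in text.splitlines():
            parts = line.split(None, 5)  # last field is the pathname, if any
            if len(parts) == 6 and re.search(r"\.so(\.|$)", parts[5]):
                libs.add(parts[5])
        counts.update(libs)
    return counts


if __name__ == "__main__":
    texts = []
    for path in glob.glob("/proc/[0-9]*/maps"):
        try:
            with open(path) as f:
                texts.append(f.read())
        except OSError:
            pass  # process exited, or we lack permission to read it
    for lib, n in count_mapped_libraries(texts).most_common(10):
        print(f"{n:4d}  {lib}")
```

                Run it as root to see every process; unprivileged, you only see your own.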

                Also, I use LD_PRELOAD to intercept certain C functions, but generally only on a development system.

              2. 1

                Well, talking about Windows, it sure has tons of DLLs… but they’re bundled with each program, so there’s almost no deduplication involved. I’d rather just get static binaries that don’t break so easily (looking at you, pacman: you should be fully static).

              3. 3

                Application launch time, memory overhead, etc are the big ones.

                But when you say “no shared libraries” where does that end? Every application should have its own complete copy of the windowing and UI libraries? If every application has its own copy of a library that has a security bug, then every application has to be updated.

                Setting aside that this means improvements to the OS bring no benefit to applications that have already been compiled, and that OS UI changes won’t be reflected in your app, the code bloat of essentially having a complete copy of the OS for every app obviously becomes insane.

                1. 1

                  The occurrence of security fixes in libraries with many consumers is much more frequent than ABI breakages.

                  That’s to say nothing of how difficult it can be to track packages which statically link said libraries (there is no easy way to examine the binaries for linkage - you have to look at the build recipe or do some heuristic check for the presence of a symbol).
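                  You can at least tell whether a binary is dynamically linked at all by looking at its ELF program headers (PT_INTERP / PT_DYNAMIC), though that still won’t reveal which libraries were statically folded in. A minimal sketch, handling only 64-bit little-endian ELF:

```python
import struct

PT_DYNAMIC = 2
PT_INTERP = 3


def elf_dynamic_info(data: bytes):
    """Return (has_interp, has_dynamic) for a 64-bit little-endian ELF image.
    A fully statically linked binary typically has neither program header."""
    if data[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    if data[4] != 2 or data[5] != 1:
        raise ValueError("this sketch only handles ELF64, little-endian")
    (e_phoff,) = struct.unpack_from("<Q", data, 32)
    e_phentsize, e_phnum = struct.unpack_from("<HH", data, 54)
    has_interp = has_dynamic = False
    for i in range(e_phnum):
        (p_type,) = struct.unpack_from("<I", data, e_phoff + i * e_phentsize)
        has_interp = has_interp or p_type == PT_INTERP
        has_dynamic = has_dynamic or p_type == PT_DYNAMIC
    return has_interp, has_dynamic
```

                  Real tools (file, readelf) do this far more robustly across ELF variants; this is just the shape of the heuristic.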

                2. 1

                  First up, disk space isn’t the important bit for shared libraries these days. It’s in memory cost and application launch time.

                  The second issue is common to OSes: if I have a library with a stable API, and two other libraries communicate with each other through that library, but each links in its own copy of it, then I need the memory layout of both copies to match - at which point you’re ABI-locked anyway, and may as well just have shared libraries.

                1. 13

                  I’ll note the other thing the announcement says is “On the other hand the level of interest for this architecture is going down, and with it the human resources available for porting is going down” and the author of this post isn’t offering to step up and maintain it (either for Debian or for the other two distros they mention).

                  I’d expect Debian would be fine keeping it if there were people willing to maintain it, but if there aren’t, then it’s better that it gets dropped rather than kept decaying further. Also, IIRC this has happened before for cases like this; if there are in fact people lurking who are willing to maintain MIPS, then this might get reversed if volunteers come to light as a result of this announcement.

                  1. 4

                    “Might” being the key word; a whole group of us got together to try to “save” ppc64 and Debian wasn’t interested, more than likely because we weren’t already Debian developers. It’d be nice if the “ports” system was more open to external contributions. But mips isn’t even going to ports, it’s being removed.

                    1. 3

                      From my experience, if you aren’t already a Debian developer, you aren’t going to become one. My experience trying to contribute to it was absolutely miserable. I’ve heard that changed somewhat, but I don’t feel like trying anymore.

                      1. 1

                        Can you speak more to this issue? I’m curious as to whether it was a technical or social problem for you, or both.

                        1. 3

                          More of a social problem. I wanted to package a certain library. I filed an “intent to package” bug, made a package, and uploaded it to the mentors server as per the procedure. It got autoremoved from there after a couple of months of being ignored by people supposed to review those submissions. Six months later someone replied to the bug with a question whether I’m going to work on packaging it.

                          I don’t know if my experience is uniquely bad, but I suspect it’s not. Not long ago I needed to rebuild a ppp package from Buster and found that it doesn’t build from their git source. Turned out there’s a merge request against it unmerged for months, someone probably pulled it, built an official package and forgot about it in the same fashion.

                          Now three years later that package is in Debian, packaged by someone else.

                          1. 2

                            I don’t know if my experience is uniquely bad, but I suspect it’s not.

                            Seems like you’re right: https://news.ycombinator.com/item?id=19354001

                    2. 3

                      …and the author of this post isn’t offering to step up and maintain it (either for Debian or the other two distros they mention).

                      From the author’s github profile:

                      Project maintainer of the Adélie Linux distro.

                      1. 0

                        Hmm, maybe. I’d bet against it. If Debian is going (reading between the lines) “maintaining modern software on this architecture is getting really hard” then I’d bet against anyone else adding support. Maybe I’ll lose that bet, in which case I owe someone here several beers, but I’ll be very surprised!