There was a time when this debate was bigger. It seems the world has shifted towards architectures and tooling that do not allow dynamic linking, or make it harder. This compromise makes life easier for the maintainers of the tools / languages, but it does take away choice from the user / developer. But maybe that’s not important? What are your thoughts?
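For anyone who wants to poke at the difference being discussed, here is a minimal sketch (gcc or clang on Linux assumed; the file name and commands are only illustrative):

```c
/* hello.c -- a trivial program to illustrate the linking choice.
 *
 * Dynamic linking (the default for most toolchains):
 *   cc hello.c -o hello-dyn
 *   ldd ./hello-dyn          # lists the shared objects it depends on
 *
 * Static linking (where the toolchain and libc support it):
 *   cc -static hello.c -o hello-static
 *   ldd ./hello-static       # reports that it is not a dynamic executable
 */
#include <stdio.h>

int main(void)
{
    puts("hello, linker");
    return 0;
}
```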

  • @[email protected] · 4 points · 10 months ago

    It seems the world has shifted towards architectures and tooling that do not allow dynamic linking, or make it harder.

    In what context? In Linux, dynamic linking has always been a steady thing.

    • @[email protected] · 9 points · 10 months ago

      We could argue semantics here (I don’t really want to), but tools like Docker / Containers, Flatpak, Nix, etc. essentially use a sort of soft static link: the software is dynamically linked, but the shared libraries are not actually shared at all beyond the boundary of the defining scope.

      So while it’s semantically true that dynamic libraries are still used, the execution environments are becoming increasingly static, which defeats much of the point of shared libraries.
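      As a small illustration of that boundary (a sketch, assuming glibc; the build command and file name are only illustrative), the same dynamically linked binary lists different library paths on the host and inside a container, because each environment carries its own copies:

      ```c
      /* listlibs.c -- print the shared objects mapped into this process.
       * Build: cc listlibs.c -o listlibs
       * Run it on the host and again inside a container image: the paths it
       * prints come from each environment's own filesystem, so the "shared"
       * libraries are not shared across that boundary.
       */
      #define _GNU_SOURCE
      #include <link.h>
      #include <stdio.h>

      static int print_object(struct dl_phdr_info *info, size_t size, void *data)
      {
          (void)size;
          (void)data;
          /* The main executable shows up with an empty name. */
          printf("%s\n", info->dlpi_name[0] ? info->dlpi_name : "(main executable)");
          return 0; /* non-zero would stop the iteration */
      }

      int main(void)
      {
          dl_iterate_phdr(print_object, NULL);
          return 0;
      }
      ```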

      • @[email protected] · 1 point · 10 months ago

        but tools like Docker / Containers, Flatpak, Nix, etc. essentially use a sort of soft static link: the software is dynamically linked, but the shared libraries are not actually shared at all beyond the boundary of the defining scope.

        This garbage practice is imported from Windows.

    • @[email protected] · 3 points · 10 months ago

      In Linux, dynamic linking has always been a steady thing.

      Hot take: This is only still the case because the GNU libc cannot easily be statically linked.
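      To make that concrete (a sketch, assuming gcc and glibc; file name is illustrative): even a trivial program that touches name resolution is hard to link fully statically, because glibc routes the lookup through NSS and still expects the matching shared glibc at run time.

      ```c
      /* resolve.c -- name lookup goes through glibc's NSS machinery.
       *
       *   cc resolve.c -o resolve            # dynamic build: works as expected
       *   cc -static resolve.c -o resolve    # glibc warns that getaddrinfo in a
       *                                      # statically linked program still needs
       *                                      # the matching shared glibc at run time
       */
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>
      #include <sys/types.h>
      #include <sys/socket.h>
      #include <netdb.h>

      int main(void)
      {
          struct addrinfo hints, *res = NULL;
          memset(&hints, 0, sizeof hints);
          hints.ai_family = AF_UNSPEC;
          hints.ai_socktype = SOCK_STREAM;

          int err = getaddrinfo("localhost", "80", &hints, &res);
          if (err != 0) {
              fprintf(stderr, "getaddrinfo: %s\n", gai_strerror(err));
              return EXIT_FAILURE;
          }
          puts("resolved localhost");
          freeaddrinfo(res);
          return 0;
      }
      ```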