Why do we need AppImage when we can have a single, statically compiled binary?

Additionally, I can’t really understand why dynamically linked libraries are so popular, and how on earth anyone who has ever had a “.dll / .so not found” error thinks this is a good idea.

The main idea, as far as I understand, was to be able to share code, which I think everyone knows works only in theory. Besides, wouldn’t it be easier to just recompile the executable from source with the updated dependency statically linked?

Another idea behind DLLs was that you could replace a DLL with a different one as long as the API was compatible, which is a very unlikely scenario for the average user.

Yet another possible advantage is that the library code is shared, so it takes less space on disk. That might be true for some very common libraries, but on the other hand, static compilation includes only the parts of the library code the program actually uses, so it takes less space anyway and is better optimized.

So the reasons to use DLLs can be easily dismissed.

As for the disadvantages: if the DLL is not present, the program will not work. It’s quite simple, and in my view, if you create a program that does not work by itself, that’s a failure on your side.

DLLs are just a nightmare most of the time; static compilation for the win. (pic for attention)
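
To make the size point concrete, here is a minimal sketch of the two linking modes with gcc (main.c is a hypothetical file; with -static, the linker copies only the referenced objects out of the library archive):

    # Dynamic: the binary records a dependency on libm.so, resolved at run time.
    gcc main.c -o app-dynamic -lm

    # Static: the used parts of libm.a are copied into the binary itself.
    gcc main.c -o app-static -static -lm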

  • arcimboldo@lemmy.sdf.org · 66 points · 1 year ago

When a basic dynamic library needs to be updated because, for instance, there is a big security issue, all your statically linked binaries have to be updated. That means every one of those developer teams needs to keep track of all the security fixes, release a new version of the binary, and push it, and every user has to download gigabytes and gigabytes of data.

Whereas if you have dynamic libs, you only have to download that one library; the fix will be pushed earlier, and all the apps will benefit from it.
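
    As a sketch of what that looks like in practice (Debian-style commands; the actual package name varies by distro):

      # One upgraded shared library fixes every app that links against it:
      sudo apt install --only-upgrade libssl3
      # Statically linked binaries would each need their own rebuilt release.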

    • muleunchangedstarved (OP) · +4 / -38 · 1 year ago

      If users compile the program on their own computers (as with the AUR), there’s no need to download gigabytes; you just need the source code.

      • Kache@lemm.ee · 33 points · 1 year ago (edited)

        That route already exists today as “the web”, where the “latest” JavaScript source is downloaded and JIT-ed by browsers. That ecosystem is also not the greatest example of stable and secure software.

      • MNByChoice@midwest.social · +10 / -2 · 1 year ago

        Gentoo builds take a long time, and it’s hard to use the computer during that time.

        (I may be presuming, since the comment above yours is about OS-wide builds, and that’s how Gentoo handled things.)

        • zagaberoo@beehaw.org · 2 points · 1 year ago

          I use my machine all the time while it’s updating, no problem. You can always configure Portage to leave a core or two idle anyway.
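
          For anyone curious, that knob lives in /etc/portage/make.conf; a sketch, assuming an 8-core machine:

            # /etc/portage/make.conf
            # Build with 6 parallel jobs, leaving a couple of cores free.
            MAKEOPTS="-j6"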

          • MNByChoice@midwest.social · 3 points · 1 year ago

            Oh, thank you! I am realizing I have not used Gentoo in ages. It was the only option on the Xbox at the time, and that didn’t have many cores. I should give it another go.

            • chocobo13z@pawb.social · 3 points · 1 year ago

              As someone using it as a daily driver, I wish you the best of luck. If you stick with it, I expect you’ll learn a lot about what goes on behind the scenes in other distros that have a lot pre-configured, like greeters, compositors (or DEs and WMs), etc.

            • zagaberoo@beehaw.org · 2 points · 1 year ago

              I use all cores for updating and still have no problems. Doesn’t even make videos stutter. I think you’ll find things much less heinous than they were on Xbox.

      • arcimboldo@lemmy.sdf.org · +9 / -1 · 1 year ago (edited)

        Great idea. I’m sure Microsoft, Apple, Adobe, and almost every company that makes money with their software will be super happy to share their source code with me!

        Edit: typo

      • exi@feddit.de · +4 / -1 · 1 year ago

        But it’s a gigantic waste of energy and time when you could just download a 2 MB package and be done with it.

  • SIGSEGV@sh.itjust.works · +43 / -4 · 1 year ago

    Did you… hrm… did you even take classes about this stuff? FFS, this is why this career pays well: you have to understand complicated things.

    Maybe your issue is with Windows. I suggest moving away from that platform.

    Dynamic libraries are essential to computing and allow us to partition out pieces of the code; one giant binary would otherwise have to be recompiled with every change.

    • philm · +8 / -14 · 1 year ago

      I mean yeah, dynamic libraries are great if used correctly (via something like Nix), but the unfortunate truth is that they are not used correctly most of the time (the majority of the Unix and Windows landscape is just a mess with dynamic libraries).

      With modern systems programming (Rust), the disadvantages of static compilation are slowly fading away, though, e.g. via incremental compilation.

      That said, dynamic libraries are still a lot faster to link, and can e.g. be hot-swapped.

  • AProfessional@lemmy.world · 35 points · 1 year ago (edited)

    Shared libraries are not a theoretical good; they have been the backbone of computers for decades, and many vendors have successfully maintained ABIs for decades.

    Modern languages take the statically compiled approach, and it has its own downsides: it makes language bindings hard; with no stable ABI, no binary platforms can exist (other than awkward C wrappers); rebuilds are slow, and doing them OS-wide results in a lot of churn; and reasoning about security fixes is very hard.
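
    For instance, the “awkward C wrapper” route in a statically compiled language like Rust looks roughly like this (hypothetical function, built as a cdylib):

      // Rust has no stable ABI of its own, so a stable boundary
      // has to be expressed in terms of the C ABI:
      #[no_mangle]
      pub extern "C" fn add(a: i32, b: i32) -> i32 {
          a + b
      }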

  • sznio@beehaw.org · 10 points · 1 year ago (edited)

    Additionally, I can’t really understand why dynamically linked libraries are so popular, and how on earth anyone who has ever had a “.dll / .so not found” error thinks this is a good idea.

    1. You can load a DLL once and all programs can share it, saving memory. It also makes programs start faster, since the DLL might already be loaded, meaning there’s less to load from disk. That mattered more back in the 90s.
    2. You can update one file and have the patch apply to all programs.
    • jarfil@beehaw.org · 4 points · 1 year ago (edited)

      1. Your program can also NOT load a DLL until it’s actually needed, making it start much faster still (see the sketch below).
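
      A minimal POSIX sketch of that deferred loading, using dlopen/dlsym (libm is just an example library; build with cc demo.c -ldl):

        #include <dlfcn.h>
        #include <stdio.h>

        int main(void) {
            /* Nothing from libm is mapped into the process until this call. */
            void *handle = dlopen("libm.so.6", RTLD_LAZY);
            if (!handle) {
                fprintf(stderr, "%s\n", dlerror());
                return 1;
            }
            /* Resolve the symbol only at the moment it is needed. */
            double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
            if (cosine)
                printf("cos(0) = %f\n", cosine(0.0));
            dlclose(handle);
            return 0;
        }
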
  • 1984@lemmy.today · +14 / -4 · 1 year ago (edited)

    Honestly, I love statically compiled binaries for their simplicity. I was writing a small utility in Rust today and I wanted to share it with a colleague on Windows.

    One command to cross-compile my Linux version into a Windows version, and it worked on the first attempt on his computer. To me, it’s worth giving up a lot of the advantages of shared libraries for that kind of simplicity.

    There is no install. There is only run.
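
    For reference, the kind of invocation being described is roughly this (assuming the GNU target and a mingw-w64 cross-linker installed on the Linux side):

      # One-time setup: add the Windows target.
      rustup target add x86_64-pc-windows-gnu

      # Cross-compile; the .exe lands under target/x86_64-pc-windows-gnu/release/.
      cargo build --release --target x86_64-pc-windows-gnu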

    • mustardman@discuss.tchncs.de · 13 points · 1 year ago

      We should replace software repositories with the friendly person who stops by with a USB. Running “apt upgrade” pulls up an Uber-like interface that says when your software will arrive. Latency is terrible but bandwidth is phenomenal.

  • nixfreak@sopuli.xyz · 9 points · 1 year ago

    Because statically linked libs can be very large. I agree, though; I’d rather have statically linked than dynamic.

  • steltek@lemm.ee · 9 points · 1 year ago (edited)

    No one seems to mention license considerations when talking about static linking. Even if your app is open source, your particular license may not be legally compatible with the GPL, for example. 3-clause BSD, MIT, and Apache most likely don’t change anything in a single binary, but it’s kind of a new thing that no one was really thinking about before when mixing licenses together.

    I think this default “it’s okay” assumption comes from most developers having a cloud-centric view, where there’s technically no “distribution” to trigger the copyright terms.

    • TechieDamien@lemmy.ml · +1 / -1 · 1 year ago

      Even in the cloud you need to consider licenses such as the AGPL. Personally, I don’t get the almost apathetic approach many developers have towards licensing and abiding by licenses.

  • ArbitraryValue@sh.itjust.works · +11 / -2 · 1 year ago

    Another advantage: enforced compartmentalization. If you have a single binary, someone will always give in to the temptation to bypass whatever honor policy is keeping your code from becoming spaghetti.

  • gens · +8 / -4 · 1 year ago

    Because programmers find a good way to do something and then apply it to everything. It becomes the one true way, a dogma, a rule. Like how OOP was the best thing ever for everything, and just now, 30 years later, it’s being recognized as actually bad. At least AppImage is more like DOS’s “just unzip and run it” than “download another 500 MB of useless stuff because the program depends on one 20 kB file in it”.

    That said, well-made libraries are good. As in, those that have a stable API, so versions don’t matter that much.

    • leviosa · 4 points · 1 year ago

      Like how OOP was the best thing ever for everything, and just now, 30 years later, it’s being recognized as actually bad.

      Alan Kay coined the term 57 years ago and we have to look at the landscape back then to see just how much OOP has actually influenced pretty much all languages, including ones that distance themselves from the term now. Avoiding shared global state. Check. Encapsulating data and providing interfaces instead of always direct access. Check. Sending signals to objects/services for returned info. Check check check.

      • gens · 1 point · 1 year ago (edited)

        Data-oriented design is the new thing, and it’s much different from that.

        OOP, other than Smalltalk and maybe a few other languages, is somewhat different in practice from the original idea. I can dig up a great talk from Alan Kay on OOP if you want. Actually, I want to watch it again, so I’ll edit it in here when I find it.

        Edit: https://www.youtube.com/watch?v=fhOHn9TClXY Great talk, as far as I remember.

        That said, we often have to process seemingly unrelated data together, which is slow with the model of passing data around (even by reference). When OOP was invented, memory access was as fast as the actual operations on the data, while today memory is much slower than the processor. With caches and SIMD and such, it is much faster if everything is an array. Personally, I’m not a fan of OOP because of the “everything has to be an object” mentality, but do whatever you like.
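
        A tiny illustration of the layout difference being described, using a hypothetical particle example:

          /* Array of structures: reading just x strides over padded records. */
          struct ParticleAoS { float x, y, z; int id; };

          /* Structure of arrays: each field is contiguous, cache- and SIMD-friendly. */
          struct ParticlesSoA {
              float *x, *y, *z;
              int   *id;
          };

          /* Streams through one contiguous array; easy to prefetch and vectorize. */
          void move_x(struct ParticlesSoA *p, int n, float dx) {
              for (int i = 0; i < n; i++)
                  p->x[i] += dx;
          }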

        • jarfil@beehaw.org · 2 points · 1 year ago (edited)

          DOP, OOP… just give me “C with classes” and I’ll cast whatever void* to whatever’s needed 😜

  • glockenspiel · 4 points · 1 year ago (edited)

    From Ellen Ullman’s Close to the Machine:

    "The project begins in the programmer’s mind with the beauty of a crystal. I remember the feel of a system at the early stages of programming, when the knowledge I am to represent in code seems lovely in its structuredness. For a time, the world is a calm, mathematical place. Human and machine seem attuned to a cut-diamond-like state of grace.

    Then something happens. As the months of coding go on, the irregularities of human thinking start to emerge. You write some code, and suddenly there are dark, unspecified areas. All the pages of careful documents, and still, between the sentences, something is missing.

    Human thinking can skip over a great deal, leap over small misunderstandings, can contain ifs and buts in untroubled corners of the mind. But the machine has no corners. Despite all the attempts to see the computer as a brain, the machine has no foreground or background. It cannot simultaneously do something and withhold for later something that remains unknown[1]. In the painstaking working out of the specification, line by code line, the programmer confronts all the hidden workings of human thinking.

    Now begins a process of frustration."

    [1] A clarification on how multitasking typically works: at the time of the book, it was usually just really fast switching.

  • leviosa · 2 points · 1 year ago

    Windows shared libs could do with having an rpath equivalent for the host app. I tried to get their manifest doohickeys working for relative locations but gave up and still just splat install them in the exe directory.

    Aside from that, shared libraries are great. You can selectively load/reload functions from them at runtime, which is a fundamental building block of a lot of applications that have things like plugin systems or wrappers for different hardware, etc. They’re good for easier LGPL compliance as well.
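
    On the Windows side, that runtime loading looks roughly like this (plugin.dll and plugin_init are hypothetical names):

      #include <windows.h>
      #include <stdio.h>

      typedef int (*plugin_init_fn)(void);

      int main(void) {
          /* Load the plugin only once we decide we need it. */
          HMODULE h = LoadLibraryA("plugin.dll");
          if (!h) {
              fprintf(stderr, "LoadLibrary failed: error %lu\n", GetLastError());
              return 1;
          }
          /* Look up the exported entry point by name. */
          plugin_init_fn init = (plugin_init_fn)GetProcAddress(h, "plugin_init");
          if (init)
              init();
          FreeLibrary(h);
          return 0;
      }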

    • jarfil@beehaw.org · 3 points · 1 year ago

      Modern Windows does a lot of shenanigans with DLLs to avoid the “DLL hell” effect, like keeping multiple versions, hardlinking, and transparently redirecting the DLLs accessible to a program, even when they “seem” to be in the exe’s dir.

  • nyan@lemmy.cafe · 2 points · 1 year ago

    There is one case I can think of where statically linked binaries make sense: games. They’re almost always closed-source even on otherwise open-source systems, and so cannot be recompiled against newer library versions, and (for smaller indies especially) it isn’t unusual for the people who do have the code to close up shop and vanish off the face of the Internet. For those, it honestly is better for them to carry all their libraries around with them, even if it results in some binary bloat.

    For open-source software, dynamic linking isn’t usually an issue until some piece of software goes unmaintained for so long that it catches bit-rot.

    For software not in one of those two categories . . . well, maybe you ought to move off Windows?

    • UlrikHDA · 1 point · 1 year ago

      For modding, it’s very useful to not have everything statically linked. DLSS swapping is probably the most prominent use case nowadays.