• skilltheamps@feddit.de · 14 points · 9 months ago

    As far as I understand, in this case opaque binary test data was gradually added to the repository. On top of that, the built binaries did not correspond 1:1 with the code in the repo due to some build-chain reasons. Stuff like this makes it difficult to spot deliberately placed bugs or backdoors.

    I think some measures can be:

    • establish reproducible builds in CI/CD pipelines
    • ban opaque data from the repository. I have read people justifying keeping this test data opaque, but that is nonsense. There is no reason you couldn’t compress and decompress a lengthy Creative Commons text, or, for binary data, encrypt that text with a public password, use a sequence from a pseudo-random number generator with a known seed, use a past compiled binary of this very software, or … or … or … (see the sketch after this list)
    • establish technologies that make it hard to plant integer overflows or reads/writes past the end of arrays. That would make it a lot harder to hide misbehaviour in the code without it being so obvious that others notice it easily. Rust, linters, Valgrind etc. would be useful for that.
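
    For the pseudo-random-generator idea in particular, here is a minimal sketch of what reviewable “binary” test data could look like. The generator, seed and file name are my own illustrative choices, not anything the xz project actually uses; the point is only that anyone can regenerate the file bit-for-bit instead of having to trust a blob:

    ```rust
    // Deterministic "binary" test data from a known, documented seed.
    // xorshift64 is used purely for illustration.
    use std::fs::File;
    use std::io::{BufWriter, Write};

    fn xorshift64(state: &mut u64) -> u64 {
        let mut x = *state;
        x ^= x << 13;
        x ^= x >> 7;
        x ^= x << 17;
        *state = x;
        x
    }

    fn main() -> std::io::Result<()> {
        let mut state: u64 = 0xC0FFEE; // known seed, stated in the repo docs
        let mut out = BufWriter::new(File::create("test-input.bin")?);
        // 2^16 draws of 8 bytes each -> a 512 KiB file anyone can reproduce
        for _ in 0..(1 << 16) {
            out.write_all(&xorshift64(&mut state).to_le_bytes())?;
        }
        out.flush()?;
        Ok(())
    }
    ```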

    So I think from a technical perspective there are ways to at least give attackers a hard time when they try to place covert backdoors. The larger problem is who does the work, because scalability is such a hard problem for open source. Ultimately I think we need to come together globally and carry this work on many shoulders. For example, the “Prossimo” project by the Internet Security Research Group (the organisation behind Let’s Encrypt) is working on bringing memory safety to critical projects: https://www.memorysafety.org/ I also sincerely hope the German Sovereign Tech Fund ( https://www.sovereigntechfund.de/ ) takes this incident as a new angle for the outstanding work they are already doing. And ultimately, we need many more such organisations and initiatives from both private companies and the public sector, so that we protect the technology that runs our societies together.

    • gedhrel@lemmy.world · 1 point · 9 months ago

      The test case purported to be bad data, which you presumably want to test the correct behaviour of your dearchiver against.

      Nothing this did looks to involve memory safety. It uses features like ifunc to hook behaviour.

      The notion of reproducible CI is interesting, but there’s nothing preventing this setup from repeatedly producing the same output in (say) a Debian package build environment.

      There are many signatures here that look “obvious” with hindsight, but ultimately this comes down to establishing trust. Technical sophistication aside, this was a very successful attack against that trust foundation.

      It’s definitely the case that the stack of C build tooling (CMakeLists.txt, autotools) makes obfuscating content easier. You might point to modern build tooling like cargo as an alternative; however, build.rs and proc macros are not typically sandboxed at present. I think it would be possible to replicate the effects of this attack using that tooling.
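
      To illustrate that last point, here is a minimal build.rs sketch (my own illustrative example, not taken from any real crate). Cargo compiles and runs this program on the host, with the privileges of whoever runs `cargo build`, before the crate itself is compiled, and nothing sandboxes what it does:

      ```rust
      // build.rs - sketch of why unsandboxed build scripts are a trust problem.
      use std::process::Command;

      fn main() {
          // An ordinary, legitimate-looking build-script directive:
          println!("cargo:rerun-if-changed=build.rs");

          // But a build script can also run arbitrary commands, touch files
          // outside the crate, or patch generated sources. Here it merely
          // echoes a message; a hostile script could fetch and run a payload.
          let _ = Command::new("sh").arg("-c").arg("echo build script ran").status();
      }
      ```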