• mox@lemmy.sdf.org · +48 · 3 months ago (edited)

    Why do I need to know all of this stuff, why isn’t the web safe by default?

    The answer to questions like this is often that there was no need for such safety features when the underlying technology was introduced (more examples here) and adding it later required consensus from many people and organizations who wouldn’t accept something that broke their already-running systems. It’s easy to criticize something when you don’t understand the needs and constraints that led to it.

    (The good news is that gradual changes, over the course of years, can further improve things without being too disruptive to survive.)

    He’s not wrong in principle, though: building safe websites is far more complicated than it should be, and relies far too much on each site behaving in the user’s best interests. Especially when client-side scripts are used.

    • marcos@lemmy.world · +1/-2 · 3 months ago

      Anything that didn’t need that kind of security from the beginning also wouldn’t break if it were added now.

      The stuff that would break is vulnerable precisely because that security doesn’t exist.

    • leisesprecher@feddit.org · +6/-19 · 3 months ago

      It’s easy to criticize something when you don’t understand the needs and constraints that led to it.

      And that assumption is exactly what led us to the current situation.

      It doesn’t matter why the present is garbage; it’s garbage and we should address that. Statements like this are the engineering equivalent of “it is what it is *shrug emoji*”.

      Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web. And it’s not only the browser, but also the backend stack.

      Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.

      • mox@lemmy.sdf.org · +19/-2 · 3 months ago

        Statements like this are the engineering equivalent of “it is what it is shrug emoji”.

        No, what I wrote is nothing like that. Please re-read until you understand it better.

        • leisesprecher@feddit.org · +2/-18 · 3 months ago

          Of course it is like that. You’re saying that the complaint is wrong because the author doesn’t know the history, and now you accuse me of not understanding you, because I pointed this out.

          If you have to accuse everyone of “not understanding”, maybe you’re the one who doesn’t understand.

          • atzanteol@sh.itjust.works · +20/-1 · 3 months ago

            You’re saying that the complaint is wrong because the author doesn’t know the history

            That’s not at all what he said. He literally even said “He’s not wrong in principle.”

            If you don’t understand the history of why something is the way it is you can’t fix it. You can suggest your new “perfectly secured web site” but if Amazon, Microsoft, Google, Firefox, Apple, etc. don’t agree on your new protocol then there’s going to be exactly 1 person using it.

      • lysdexicOP · +11 · 3 months ago (edited)

        It doesn’t matter why the present is garbage; it’s garbage and we should address that. Statements like this are the engineering equivalent of “it is what it is *shrug emoji*”.

        I don’t think your opinion is grounded in reality. The “it is what it is” actually reflects the fact that there is no way to fix the issue in backwards-compatible ways, and it’s unrealistic to believe that vulnerable frameworks/websites/webservices can be updated at a moment’s notice, or even at all. This fact is mentioned in the article. Those that could be updated already moved on to a proper authentication scheme. Those that couldn’t have to continue to work after users upgrade their browsers.

        • jacksilver@lemmy.world · +5 · 3 months ago

          A lot of the web used to run on Flash. Then Apple came along and said “Flash is terrible and insecure.” Within a few years everything moved away from Flash, so it’s definitely possible to force the web in new directions.

      • BatmanAoD · +5/-1 · 3 months ago

        Take a step back and look at the pile of overengineered yet underthought, inefficient, insecure and complicated crap that we call the modern web…

        Think about how many indirections and half-baked abstraction layers are between your code and what actually gets executed.

        Think about that, and then…what, exactly? As a website author, you don’t control the browser. You don’t control the web standards.

        I’m extremely sympathetic to this way of thinking, because I completely agree. The web is crap, and we shouldn’t be complacent about that. But if you are actually in the position of building or maintaining a website (or any other piece of software), then you need to build on what already exists, unless you’re in the exceedingly rare position of being able to near-unilaterally make changes to an existing platform (as Google does with Chrome, or Microsoft and Apple do with their OSes) or to throw out a huge amount of standard infrastructure and start as close to “scratch” as possible (e.g. GNU Hurd, Mill Computing, Oxide, Redox OS, etc; note that several of these are hobby projects not yet ready for “serious” use).

      • Carighan Maconar@lemmy.world · +3 · 3 months ago

        Okay, and how would you address it? The limitation is easy to criticize when you can think about it in a vacuum. But in the real world, we’d need to find a way to change things that can actually be implemented by everyone.

        Which usually means transformative change.

      • magic_lobster_party@fedia.io · +1 · 3 months ago

        It doesn’t matter, why the present is garbage, it’s garbage and we should address that.

        The problem is fixing it without inadvertently breaking things for someone else. Changing default behavior isn’t easy.

        There are probably critical systems that rely on old, outdated practices because that’s the way it worked when they were written 20 years ago. Why should they go back and fix their code when it has worked perfectly fine for the past two decades?

        • BatmanAoD · +1/-1 · 3 months ago

          If you think anything in software has worked “perfectly fine for the past two decades”, you’re probably not looking closely enough.

          I exaggerate, but honestly, not much.

            • BatmanAoD · +1 · 3 months ago

              Yes, popular programs behave correctly most of the time.

              But “perfectly fine for the last two decades” would imply a far lower rate of CVEs and general reliability than we actually have in modern software.

  • BatmanAoD · +21 · 3 months ago

    First and foremost _____ is a giant hack to mitigate legacy mistakes.

    Wow, every article on web technology should start this way. And lots of non-web technologies, too.

  • mormund@feddit.org · +17 · 3 months ago

    Unless I’m missing something, the post is plain wrong in some parts. You can’t POST to a cross-site API, because the browser will send a CORS preflight before sending the real request. The only way around that is, iirc, a form submit; for those you need CSRF protection.

    Also, the CORS proxy statement is wrong, if I don’t misunderstand their point. Proxies don’t break security because they are obviously not the cookie domain; they’re the proxy domain, so the browser will never send cookies to them.

    Anyways, don’t trust the post or me. Just read https://owasp.org/ for web security advice.
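
    The session-bound token mormund mentions can be sketched in a few lines. This is only an illustrative sketch, not from the article or OWASP; the function names and the HMAC-over-session-ID scheme are assumptions, but they show the idea: a plain form post skips the CORS preflight, so the server embeds in its own forms a token that a cross-site attacker page cannot compute.

    ```python
    import hashlib
    import hmac
    import secrets

    # Per-deployment server secret (illustrative; in practice this is
    # loaded from configuration, not generated at import time).
    SECRET = secrets.token_bytes(32)

    def make_csrf_token(session_id: str) -> str:
        # Bind the token to the session, so a form submitted cross-site
        # (which never triggers a preflight) can't carry a valid one.
        return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

    def check_csrf_token(session_id: str, token: str) -> bool:
        # Constant-time comparison to avoid timing leaks.
        return hmac.compare_digest(make_csrf_token(session_id), token)
    ```

    The server would render `make_csrf_token(...)` into a hidden form field and call `check_csrf_token(...)` on every state-changing POST; an attacker's auto-submitting form has the victim's cookies but not the token.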

  • milliams@lemmy.world · +1 · 3 months ago

    Thanks, very interesting. I’m a bit confused about what this means:

    explicit credentials are unsuitable for server-rendered sites as they aren’t included in top-level navigation

    What does “top-level navigation” mean here?

    • Lysergid@lemmy.ml · +2/-1 · 3 months ago

      “Note: When I say “top-level” I am talking about the URL that you see in the address bar. So if you load fun-games.example in your URL bar and it makes a request to your-bank.example then fun-games.example is the top-level site.”

      Meaning explicit creds won’t be sent. Even if fun-games knows how to send explicit credentials, it can’t, because fun-games does not have access to the credentials stored for your-bank. Suppose your-bank’s credentials are stored in local storage: since the current URL is fun-games, it can only access fun-games’ local storage, not your-bank’s.
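
      The per-origin storage rule described above can be modeled in a toy sketch. This is not a browser API; the class and method names are made up purely to illustrate why script on fun-games.example can never read what your-bank.example stored:

      ```python
      class LocalStorageModel:
          """Toy model: the browser keeps one storage bucket per origin."""

          def __init__(self):
              self._buckets = {}  # origin -> {key: value}

          def set_item(self, origin, key, value):
              # A page can only ever write into its own origin's bucket.
              self._buckets.setdefault(origin, {})[key] = value

          def get_item(self, origin, key):
              # ...and can only ever read from its own origin's bucket.
              return self._buckets.get(origin, {}).get(key)

      store = LocalStorageModel()
      store.set_item("https://your-bank.example", "auth_token", "s3cret")
      # Script running with fun-games.example as the top-level origin
      # sees an empty bucket, not the bank's token:
      assert store.get_item("https://fun-games.example", "auth_token") is None
      ```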

  • Lysergid@lemmy.ml · +1/-3 · 3 months ago

    Thank you! I was always wondering why the heck this (mostly) useless and broken mechanism exists. I had been hesitant to disable it because I had doubts about my understanding. Now I know I was right.