• Donjuanme@lemmy.world · 1 year ago

    No, there’s no such thing as breaking encryption; they’re trying to outlaw strong encryption, or require back doors for it. Short of a miraculous leap in quantum computing, there’s no breaking strong encryption.

    • moody@lemmings.world · 1 year ago

      “require back doors for strong encryption”

      That’s what the headline implies. Encryption is useless if a third party can decrypt it.

      • sudoshakes@reddthat.com · 1 year ago

        His point, which seems pedantic but isn’t, is to illustrate the specific attack vector.

        Breaking encryption would mean that the cryptographic process is something that an attacker can directly exploit. This is as close to impossible as it gets in that line of work.

        While you can undermine the effectiveness of encryption through other attack vectors like man-in-the-middle, phishing, or good old-fashioned physical device access, none of these break the underlying algorithm in a way that makes other encrypted data vulnerable to decryption.

        None of those mean an algorithm like, say, the old Twofish cipher is “broken”.

        Blowfish, Triple DES, Twofish, RC4, etc.: none of these have had their underlying math defeated in a way that lets an attacker decrypt arbitrary data (though RC4 and Triple DES are deprecated nowadays for known weaknesses). None of them, however, can protect your data if some other attack vector compromises you or your site’s security.
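
        To make the distinction concrete, here is a minimal sketch (assuming Python with the third-party cryptography package; the specific cipher and calls are my own illustration, not anything from the comment above): an eavesdropper holding only the ciphertext has to defeat the cipher itself, while anyone who obtains the key through phishing, device access, or a mandated back door decrypts it without “breaking” anything.

        ```python
        # Illustrative only: "stealing the key" vs. "breaking the cipher".
        # Assumes: pip install cryptography
        import os

        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)  # secret key, e.g. stored on the device
        nonce = os.urandom(12)                     # must be unique per message

        ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

        # With only (nonce, ciphertext), an attacker has to defeat AES-GCM itself,
        # which no publicly known attack does.
        # With the key (phishing, seized device, mandated back door), decryption is
        # trivial and the algorithm was never touched:
        print(AESGCM(key).decrypt(nonce, ciphertext, None))  # b'meet at noon'
        ```

        In other words, the cipher stays mathematically sound; a copy of the key, or a mandated bypass, just makes that irrelevant.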

    • Facebones@reddthat.com · 1 year ago

      Basically nothing being done “for the children” is ever actually “for the children.” It’s just an emotional life hack to do whatever you want and call anybody who calls you on it a pedo.

      I know I’m mostly preaching to the choir here but oh well.

      • XTL@sopuli.xyz · 1 year ago

        Any sane country should make “for the children” or other BS justifications in law or policy proposals immediately punishable.

  • folkrav@lemmy.ca · 1 year ago

    Facebook, Google et al. literally just said “fuck it” and stopped serving news content to Canadian and Australian users when those governments tried legislating around it. I’m curious whether UK users won’t just get geoblocked out of many major services lol

    • sarchar · 1 year ago

      The owners of these companies are apparently liable for correctly policing content on their platforms. If they fail, they face jail time. That’s certainly not a risk I’d be comfortable with, so I sure as hell would gtfo too.

  • Echo Dot@feddit.uk · 1 year ago

    This is their stated intention, but they’re a bunch of idiots, and even they know it won’t work.

    So this will probably end up getting quietly walked back to avoid yet another embarrassing scandal of governmental uselessness, and you’ll never hear about it again. They are currently getting absolutely rinsed in the enquiry, so hopefully they’re feeling a little bit humble at the moment.

    • zzzzz@beehaw.org · 1 year ago

      Maybe it’ll be used like “no loitering” laws. Often not enforced, but useful when you don’t like something and can call it illegal.

      • The Doctor@beehaw.org · 1 year ago

        Cf. USian “pretext laws.” Trying to pay for something with defaced currency comes immediately to mind.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    Specific harms the bill aims to address include underage access to online pornography, “anonymous trolls,” scam ads, the nonconsensual sharing of intimate deepfakes, and the spread of child sexual abuse material and terrorism-related content.

    The first [phase of Ofcom’s rollout] covers how platforms will have to respond to illegal content like terrorism and child sexual abuse material, and a consultation with proposals on how to handle these duties is due to be published on November 9th.

    Ofcom says it expects to publish a list of “categorised services,” which are large or high-risk platforms that will be subject to obligations like producing transparency reports, by the end of next year.

    “Social media companies will be held to account for the appalling scale of child sexual abuse occurring on their platforms and our children will be safer,” said UK Home Secretary Suella Braverman.

    Meanwhile, the Wikimedia Foundation has said that the bill’s strict obligations for protecting children from inappropriate content could create issues for a service like Wikipedia, which chooses to collect minimal data on its users, including their ages.

    In a statement, Ofcom’s chief executive Melanie Dawes pushed back against the idea that the act will make the telecoms regulator a censor.


    The original article contains 659 words, the summary contains 197 words. Saved 70%. I’m a bot and I’m open source!