• Thorny_Thicket@sopuli.xyz · 1 year ago

    Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos, and that is worrying.

    However I’m not denying that it’s probably still the most privacy focused phone you can get. For now.

    • monad · 1 year ago

      Apple proposes change

      Users vote against it

      Apple doesn’t do change

      Nothing to see here folks

      • Thorny_Thicket@sopuli.xyz · edited · 1 year ago

        I don’t quite see it like that myself. If you want to portray yourself as a privacy-focused company, then why would you even suggest such a feature? Even if their intentions were purely to protect children, with zero malicious future plans, they still knew it was going to have bad optics and be widely controversial.

        • monad · 1 year ago

          they still know it’s going to have bad optics and be widely controversial

          How would they know that? It’s often hard to predict how users will react; sometimes your expectations are wrong.

    • kirklennon@kbin.social · edited · 1 year ago

      They still intended to start scanning your photos and that is worrying.

      They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing it like every other photo service, which scans all of your photos on the server, they created a complex privacy-preserving method: an initial scan happens on device as part of the upload process and, through the magic of math, images would only get matched as CSAM on the server if the system was confident (one-in-a-trillion false-positive rate) that you were uploading literally dozens of CSAM images. At that point a person would verify to make absolutely certain, and only then would your crime be reported.

      The system would do the seemingly impossible: preserving the privacy of literally everybody except the people that everyone agrees don’t deserve it. If you didn’t upload a bunch of CSAM, Apple itself would legitimately never scan your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It’s honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
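
      The threshold idea described above can be sketched roughly like this (illustrative only: this is NOT Apple’s actual NeuralHash / private-set-intersection protocol, and the hash function, database, and threshold value here are all made-up stand-ins):

      ```python
      # Sketch of threshold-based hash matching: an account is only
      # flagged once the number of matches against a known-image hash
      # database crosses a threshold, so isolated false positives
      # never surface anything.
      import hashlib

      KNOWN_BAD_HASHES: set[str] = set()  # stand-in database of known-image hashes
      MATCH_THRESHOLD = 30                # assumed value, not Apple's real one

      def image_hash(data: bytes) -> str:
          # Stand-in for a perceptual hash; a real system would use one
          # that tolerates resizing/re-encoding, unlike SHA-256.
          return hashlib.sha256(data).hexdigest()

      def flag_account(uploaded_images: list[bytes]) -> bool:
          matches = sum(image_hash(img) in KNOWN_BAD_HASHES
                        for img in uploaded_images)
          return matches >= MATCH_THRESHOLD
      ```

      In the real design the device never learns which images matched and the server never sees non-matching images; the comparison is hidden inside a cryptographic protocol, which this plain-set sketch deliberately skips.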

    • dynamojoe@lemmy.world · 1 year ago

      but they only did so because of the massive public outcry

      Well, shit. For once the voice of the people worked and you’re still bitching about it.

      • Thorny_Thicket@sopuli.xyz · 1 year ago

        You’re right. Maybe I’m being a bit too harsh and should give them some credit. After all, they reversed the decision to use those shitty butterfly switches on the MacBook keyboard too, and brought back HDMI and the SD card slot. Also ditched that stupid Touch Bar.