• @[email protected]
    70 • 11 months ago

    Yep. They really doubled down on privacy/security and it’s pretty admirable. The President doesn’t use an Android or a BlackBerry for a reason. (Well, two in the case of BlackBerry: security and still existing.) If only there were no other problematic areas of Apple’s business (manufacturing, wages, environmental impact).

    • @[email protected]
      30 • 11 months ago

      Can’t wait for them to put their money where their mouth is and do the same in China and other large population countries that demand the same thing 😂

    • @[email protected]
      -9 • 11 months ago

      They’re hypocrites, though. They brand themselves as privacy-focused, and in some cases actually are, but at the same time they also scan your photos and messages and report to authorities/parents if there’s something inappropriate.

      Inb4 the “no need to worry if you have nothing to hide” argument

      • @[email protected]
        8 • 11 months ago

        Ok… so I’m aware there is a feature, “check for sensitive media,” that parents can turn on, and AI can send an alert to you if it seems like your kid might be texting nude pics. It only works with iMessage, since Apple doesn’t have access to photos in other apps. No human sees the photos. But that isn’t the same as what you’re saying, and I don’t know if what you’re saying is accurate.

          • 6xpipe_
            14 • 11 months ago

            Apple Kills Its Plan to Scan Your Photos for CSAM

            That headline literally says they’re not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.

            I am one of the first to typically assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that is otherwise as privacy-focused as they could be. They just didn’t stop to consider whether or not they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.

            • @[email protected]
              -5 • 11 months ago

              Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos, and that is worrying.

              However I’m not denying that it’s probably still the most privacy focused phone you can get. For now.

              • @monad
                6 • 11 months ago

                Apple proposes change

                Users vote against it

                Apple doesn’t do change

                Nothing to see here folks

                • @[email protected]
                  -1 • edit-2 • 11 months ago

                  I don’t quite see it like that myself. If you want to portray yourself as a user-privacy-focused company, then why would you even suggest such a feature? Even if their intentions are purely to protect children, with zero malicious future plans, they still know it’s going to have bad optics and be widely controversial.

                  • @monad
                    3 • 11 months ago

                    they still know it’s going to have bad optics and be widely controversial

                    How would they know that? It’s often hard to predict how users will react; sometimes your expectations are wrong.

              • kirklennon
                5 • edit-2 • 11 months ago

                They still intended to start scanning your photos and that is worrying.

                They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. Instead of doing it like every other photo service, which scans all of your photos on the server, they created a complex privacy-preserving method: an initial scan on device as part of the upload process, and, through the magic of math, images would only get matched as CSAM on the server if the system was confident (a one-in-a-trillion false-positive rate) that you were uploading literally dozens of CSAM images. At that point a person would verify to make absolutely certain, and only then would your crime be reported.

                The system would do the seemingly impossible: preserve the privacy of literally everybody except the people that everyone agrees don’t deserve it. If you didn’t upload a bunch of CSAM, Apple itself would legitimately never scan your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It’s honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
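                The threshold idea described above can be sketched as a toy model. To be clear, this is an illustration only, not Apple’s actual system (which used NeuralHash, private set intersection, and threshold secret sharing so the server learned nothing about non-matching photos); the hash set, threshold value, and function names here are all made up:

                ```python
                # Toy sketch of threshold-based matching: hashes are computed
                # locally, and review is only triggered once the number of
                # matches against a known-bad database crosses a threshold.
                # (Hypothetical names/values -- not Apple's real protocol.)

                KNOWN_BAD_HASHES = {"hash_a", "hash_b", "hash_c"}  # hypothetical database
                THRESHOLD = 30  # matches required before any human review

                def count_matches(uploaded_hashes):
                    """Count how many uploaded hashes appear in the known-bad set."""
                    return sum(1 for h in uploaded_hashes if h in KNOWN_BAD_HASHES)

                def should_trigger_review(uploaded_hashes, threshold=THRESHOLD):
                    """Flag an account only when the match count meets the threshold."""
                    return count_matches(uploaded_hashes) >= threshold
                ```

                The point of the threshold is exactly what the comment describes: a single match (or false positive) reveals nothing, and no human review happens until the count makes coincidence astronomically unlikely.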

              • @[email protected]
                4 • 11 months ago

                but they only did so because of the massive public outcry

                Well, shit. For once the voice of the people worked and you’re still bitching about it.

                • @[email protected]
                  0 • 11 months ago

                  You’re right. Maybe I’m being a bit too harsh and should give them some credit. After all, they reversed the decision to use those shitty butterfly switches on the MacBook keyboard too, and brought back HDMI and the SD card slot. Also ditched that stupid Touch Bar.

          • @[email protected]
            10 • 11 months ago

            I mean, that’s a pretty niche case, and maybe your underage kid shouldn’t be sending nudes via iMessage anyways.

            • @[email protected]
              -6 • 11 months ago

              That’s a whole other discussion. It’s just one example anyway. My point still stands: this does not increase user privacy.

              • @[email protected]
                10 • 11 months ago

                The child in that case is not the user (or at least not the owner). The user is the parent who configures the phone as they choose and loans it to the child. It’s no different than Apple allowing a business to configure a MacBook as they choose, including tools to monitor its usage, and then offering that computer to one of their employees. The owner of the device gets to choose the privacy settings, not necessarily the end user.

    • @[email protected]
      -15 • 11 months ago

      Well that and the fact that he’s 900 years old and probably thinks all phones are iPhones.