• 0 Posts
  • 21 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • csm10495: Hey, guys. Can we not be assholes about why we are here?

    becool: Not to say “fuck you and get out” but fuck you and get out.

    Personally, I see both sides of this matter. As a long-time redditor, I am seething at the thought that spez is effectively claiming, "We are entitled to all the free data users gave us over the years and to jack up our API prices, even though that will make for a worse user experience." I also think there is a right way and a wrong way to discuss the overreach of people like spez, who have a pathological lack of awareness; the right way makes constructive resolution easier and more imminent.

    For example, instead of what you wrote, I might have said something like “Fortunately, reddit is still a thing and we can work on showing the people over there just how shitty spez’s actions have been and how they will lead to a worsening experience for them while simultaneously encouraging them to join kbin”.

    I’m not saying your position and its premises are necessarily wrong; I am saying time, place, and manner of communication often – if not always – matter.


  • @generalpotato Ish. I read the technical write-up, and they actually came up with a very clever, privacy-focused way of scanning for child porn.

    First, only photos were scanned and only if they were stored in iCloud.

    Then, only cryptographic hashes of the photos were collected.

    Those hashes were then compared against cryptographic hashes of known child porn images, images which had to be in the databases of multiple non-governmental organizations; so, if an image was only in the database of, say, the National Center For Missing And Exploited Children, or only in the database of China's equivalent, its cryptographic hash couldn't be used. This requirement would make it harder for a dictator to slip in a hash to look for dissidents, because getting an image into enough databases would be substantially more difficult.

    Even then, an Apple employee would have to verify actual child porn was being stored in iCloud, and only after 20 separate images were flagged. (The odds that any innocent person would even make it to this stage incorrectly were estimated to be something like one false positive a year, I think, because of all of the safeguards Apple had.)

    Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-government organizations alerted.
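    For anyone curious, the two safeguards described above (requiring a hash to appear in multiple independent databases, and only escalating to human review after a threshold of distinct matches) can be sketched in a few lines of Python. This is a toy illustration only: it uses SHA-256 as a stand-in for whatever hashing Apple actually used, and every function name and database here is made up, not from Apple's write-up.

    ```python
    import hashlib

    # Threshold figure cited above; illustrative only.
    REVIEW_THRESHOLD = 20

    def actionable_hashes(*org_databases):
        """Only hashes present in EVERY organization's database count,
        so no single organization (or government) can act alone."""
        sets = [set(db) for db in org_databases]
        return set.intersection(*sets)

    def needs_human_review(photo_blobs, blocklist, threshold=REVIEW_THRESHOLD):
        """Hash each photo and count distinct matches against the blocklist;
        escalate to a human reviewer only once the threshold is reached."""
        matches = {
            digest
            for digest in (hashlib.sha256(blob).hexdigest() for blob in photo_blobs)
            if digest in blocklist
        }
        return len(matches) >= threshold
    ```

    The point of the intersection step is that even a compromised database can't inject a target hash unilaterally, and the point of the threshold is that a handful of false positives never reaches a human at all.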

    Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person has of being incorrectly reported to any government authority.