To stop the malware, we became the malware.
But also we didn’t stop the malware.
Recall records everything users do on their PC, including activities in apps, communications in live meetings, and websites visited for research. Despite encryption and local storage, the new feature raises privacy concerns for certain Windows users.
Gee, you think??!
I loved this:
At first glance, the Recall feature seems like it may set the stage for potential gross violations of user privacy. Despite reassurances from Microsoft, that impression persists for second and third glances as well.
Definitely no dystopian sci-fi that applies here.
Sounds about right. Now, hands up, who didn’t see this coming?
🧍
I’ve seen a few of these insane AI privacy violations by Microsoft today, I’m assuming from the same event. In every article, Microsoft insists “it’s okay because all the data is on the user’s device”.
First of all, I don’t trust a word they say. Second, I’m sure there is some other service of theirs that would pick these files up and send them to Microsoft anyway. Third, unless the AI is running locally, they need to send that data to their servers for this to function. Fourth, I guarantee this “local” promise will get quietly dropped the millisecond they think they can get away with it. Fifth, these tech companies have been acting in such bad faith that they should be hit with an insane amount of regulation, at the very least.
I’m generally not against AI, but Microsoft absolutely cannot be trusted with any of this.
records everything
It always has.
Microsoft says that the Recall index remains local and private on-device, encrypted in a way that is linked to a particular user’s account. “Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available for Microsoft to view, or use them for targeting advertisements. Screenshots are only available to the person whose profile was used to sign in to the device,” Microsoft says.
Users can pause, stop, or delete captured content and can exclude specific apps or websites. Recall won’t take snapshots of InPrivate web browsing sessions in Microsoft Edge or DRM-protected content.
Optional local feature. Of course the thread acts like it’s the end of the universe.
Ok, but now picture this: on day 1, MS pops a little message box up on every computer with it installed that says, “Enable advanced functionality?” with a teeny tiny link to a long legal document that, somewhere in it, says that with advanced features turned on, they actually do upload all your data.
Because companies do that all the time. It lets you put out press releases saying “we collect no data, we love privacy!” while actually collecting and selling data on like 95% of your customers.
Point out when MS has done this.
Here’s an article about them doing it last year, specifically around how much of your data they can use for AI training, so it’s exactly the same thing. The article also mentions several other companies doing it around the same time. https://venturebeat.com/ai/microsoft-changes-services-agreement-to-add-restrictions-for-ai-offerings/
Here’s another article from 12 years ago about MS changing the ToS of their cloud storage policies to allow them to use all your stored files for advertising and “new features”. https://www.csoonline.com/article/545804/microsoft-subnet-microsoft-raises-privacy-issues-with-tweaked-tos-to-share-data-across-the-cloud.html
I can also tell you from my personal experience of using dozens of enterprise MS applications that they all constantly pester you to set up a cloud account, log into it, and link all of your data and activities into this account. In the last few months, every one of them has added an “optional” co-pilot feature that intrusively tries to get me to use it at every opportunity.