- cross-posted to:
- [email protected]
- [email protected]
NO SHIT.
And this is a surprise how?
The entire digital economy is based on spying. It’s called corporate surveillance and it’s been around for 25 years. Why would AI escape this business model? If anything, it turbocharges it.
Wait wait wait. Hold on. Okay. Okay. Wait.
Wait so - Microsoft?? Has been spying? On its own customers?!?
I just . . . I mean, it . . . I don’t know what to say!
i giggled
Reassuring that the 35 phishing emails I report to them a day from my Hotmail junker are going to be addressed.
Who would have ever thought they could stoop so low!
Oh wait, every single person on the face of the planet.
I’m shooked, I say. Shooked!
Well, not that shocked.
I’m mildly shooked then.
That was a Futurama reference I thought you were making, so I was continuing it. Lol.
Given how hard they’ve been pushing Copilot/Bing Chat/etc I’m not surprised
I never touched any GPT crap, let alone BingGPT. Enjoy the doomsday! Off I go to the high seas alone!
Dramatic talk aside, it is obvious enough. US companies are trying to gather data and train AI by giving it to people for free so that they can install it in war robots and continue their genocide economy. If this feels like conspiracy…
Guess what HoloLens is powered by? ClearView face recognition database. https://interestingengineering.com/innovation/microsoft-smart-goggles-fail-us-army-tests
ClearView powered. https://www.thedefensepost.com/2022/02/04/usaf-facial-recognition-tech/
Isn’t that their business model? How else can Windows be offered for “free”?
As a bad Python scripter, I’m stuck using Microsoft’s AI because there isn’t a privacy-focused alternative anywhere near as good.
Don’t overuse AI, there are plenty of resources on the web, and at least you can practice reading docs. Use Phind. https://www.phind.com/privacy
It’s not as good, but running small LLMs locally can work. I’ve been messing around with ollama, which makes it drop dead simple to try out different models locally.
You won’t be running any model as powerful as ChatGPT - but for quick “stack overflow replacement” style of questions I find it’s usually good enough.
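For anyone who wants to script against it from Python, ollama runs a local HTTP server (port 11434 by default) you can hit directly. Rough sketch below, assuming you’ve already pulled a model with `ollama pull`; the model name and prompt are just placeholders:

```python
import json
import urllib.request

# Ollama's local server listens on port 11434 by default.
# "llama2" is just an example; use whatever model you've pulled locally.
payload = {
    "model": "llama2",
    "prompt": "Write a Python one-liner that reverses a string.",
    "stream": False,  # return the full answer at once instead of streaming tokens
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Nothing leaves your machine, which is the whole point here.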
And before you write off the idea of local models completely, some recent studies indicate that our current models could be made orders of magnitude smaller for the same level of capability. Think Moore’s law but for shrinking the required connections within a model. I do believe we’ll be able to run GPT3.5-level models on consumer grade hardware in the very near future. (Of course, by then GPT-7 may be running the world but we live in hope).
SkyGPT
GPT4All is another good local one. It runs on CPU, but you can use GPU acceleration. Some models even run on my crappy dual-core laptop.
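If you’d rather call it from Python than use the desktop app, the `gpt4all` package is about as simple as it gets. Minimal sketch below; the model filename is just an example (the library downloads it on first use, if I remember right):

```python
# pip install gpt4all
from gpt4all import GPT4All

# Example model file; swap in whichever GPT4All model fits your hardware.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

# chat_session keeps context across prompts within the block.
with model.chat_session():
    reply = model.generate(
        "Give me a one-line Python snippet to flatten a nested list.",
        max_tokens=200,
    )
    print(reply)
```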
Check out GitHub Copilot.
Not free, but it’s cheap and supposedly privacy-focused.
“If you’re not paying for a product, you are the product.” Shame it’s usually both these days!
GitHub Copilot pirates other people’s code. Legally that’s hard to pursue, but it’s enough to make me dislike them.
Oh, their model is 100% trained on public repositories. I doubt they even bothered to filter it down to open source/fair use code.
My issue here is that AI isn’t going to replace my job, but an engineer who uses AI as a tool would replace me…
It’s still Microsoft. Here’s what they say about privacy anyway:
I’m aware. I looked into whether your source code gets used to train their ML. I looked over the FAQ and got the “Your code is your own” vibe. Sadly, it does point to their standard privacy statement, which could change at any time and allow them to do what they want.
> Will my private code be shared with other users?
>
> No. We follow responsible practices in accordance with our Privacy Statement to ensure that your code snippets will not be used as suggested code for other users of GitHub Copilot.