• 0 Posts
  • 63 Comments
Joined 27 days ago
Cake day: January 15th, 2026




  • You’ve never come across something and viewed it out of curiosity? The algorithms love when you branch out like that.

    The problem with tech companies is that they’ve hyped up their tech so much that people actually think it’s sentient. Algorithms don’t love anything any more than a cake recipe loves anything.

    You’re being amazingly condescending to people being abused and guided by the algorithms, acting like you’re above it.

    No, I’m being normally condescending, which is difficult not to do when people are being pushed around by their computers and phones. It’s like somebody getting manipulated by a light switch. “Oh my god! It knows when I want the light to be on! Get out of my mind!” I’m not acting like I’m above it.

    You’re 3 clicks away from conspiracy theories flooding your feed by way of “here’s how flat earthers explain gravity” because your chosen video, the bridge video, and the conspiracy videos are all using the same keywords.

    So? I get curious and look at that stuff, but I don’t go wall-eyed and start drooling. If I don’t look, it goes away. Or better yet, if it doesn’t go away, I search for videos I’m actually interested in.

    You’re not noticing all the “harmless” unrelated suggested content from the games you don’t play like Factorio, Stardew, Hollow Knight, No Man’s Sky, or Starfield, but it’s there, just as predatory, seeing where you’ll bite. The overlapping keywords and viewerships are there. It’s exactly the same situation.

    This is some weird conspiracy stuff. So Hollow Knight and Stardew Valley are trying to eat my soul? This is the same kind of cryptic talk that people freaking out about heavy metal and D&D in the 80s used.

    This category association is how people get drawn into deep, dark corners. This is how segmented conspiracy groups converge. This is how the manosphere becomes an echo chamber. This is how self-harm and self-hate content puts someone in a hole by themselves.

    That and the search bar. Most people who get stuck in that stuff are seeking it out. It’s what they’re interested in.

    You’re acting morally superior without an actual understanding of what these platforms are designed to do.

    No, you think these algorithms are way more effective than they actually are. A lot of what you describe happens because algorithms can’t read people’s minds. They’re just dumb machines. Bear in mind that chess AI from the 80s can wipe its ass with your face, and those mighty algorithms were like 25KB in size. Just 5KB is enough to present challenging AI. That’s enough to give Blinky, Pinky, Inky, and Clyde the appearance of having different personalities.
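    For anyone curious, here’s a rough Python sketch of those ghost targeting rules as fans have documented them (the Pac-Man Dossier), not the original arcade code; the function names and coordinates are mine. Four tiny rules read as four personalities.

```python
# Toy sketch of the classic chase-mode targeting rules, as documented by
# fans (e.g. the Pac-Man Dossier). Tile coordinates are (x, y) with y
# increasing downward; names and values here are illustrative, not a
# port of the original arcade code.

def blinky_target(pacman_pos):
    # Blinky: aim straight at Pac-Man's tile. Reads as "aggressive".
    return pacman_pos

def pinky_target(pacman_pos, pacman_dir):
    # Pinky: aim four tiles ahead of Pac-Man (ignoring the famous
    # overflow bug when Pac-Man faces up). Reads as "the ambusher".
    px, py = pacman_pos
    dx, dy = pacman_dir
    return (px + 4 * dx, py + 4 * dy)

def inky_target(pacman_pos, pacman_dir, blinky_pos):
    # Inky: take the tile two ahead of Pac-Man, then double the vector
    # from Blinky to that tile. Reads as "erratic", but it's one line of math.
    px, py = pacman_pos
    dx, dy = pacman_dir
    ax, ay = px + 2 * dx, py + 2 * dy
    bx, by = blinky_pos
    return (2 * ax - bx, 2 * ay - by)

def clyde_target(pacman_pos, clyde_pos, scatter_corner=(0, 35)):
    # Clyde: chase when more than eight tiles away, otherwise run for the
    # bottom-left corner. A single distance check reads as "shy".
    px, py = pacman_pos
    cx, cy = clyde_pos
    if (px - cx) ** 2 + (py - cy) ** 2 > 8 ** 2:
        return pacman_pos
    return scatter_corner
```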



  • And it’s pretty well established at this point that social media is harmful to mental health, is it not?

    In the same way it was well established that heavy metal was harmful to mental health in the 80s. The moral panics of the 80s seem goofy to people today, but back then it was just as deadly serious and real as the magical mystery algorithms hacking and controlling our brains today.

    The evil YouTube wizardry is currently recommending me videos about the Flax engine, Blender, Satisfactory, and old clips of Norm MacDonald. If it’s telling you to kill yourself, that’s because you’re asking it how to. It’s just matching keywords in videos: the user watched a video tagged Einstein, gravity, and relativity; therefore, recommend an Einstein Oppenheimer clip, a Kerbal Space Program review, and a Sabine Hossenfelder video (a toy version of that matching is sketched at the end of this comment). Replace those keywords with nasty-ass shit and you get more nasty-ass shit back.

    On Reddit, I always find it hilarious when people out themselves by complaining about porn sites sending them weird-ass videos. “Oh man! I don’t understand why Pornhub keeps showing me videos of women being strangled!” No, no, you don’t understand, but I sure as hell do.
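    To make the keyword-matching point concrete, here’s a toy sketch. Everything in it (titles, tags, the recommend function) is made up for illustration; it’s nothing like any platform’s real ranking code, but it reproduces the effect I’m describing.

```python
# Toy recommender: rank candidate videos by how many tags they share with
# the watch history. All titles, tags, and names are made up for
# illustration; this is not any platform's actual ranking code.

def recommend(watched_tags, candidates, top_n=3):
    watched = set(watched_tags)
    scored = []
    for title, tags in candidates.items():
        overlap = len(watched & set(tags))
        if overlap:
            scored.append((overlap, title))
    # Highest overlap first; Python's sort is stable, so ties keep their order.
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [title for _, title in scored[:top_n]]

history = {"einstein", "gravity", "relativity"}
library = {
    "Oppenheimer: the Einstein scene": {"einstein", "relativity", "history"},
    "Kerbal Space Program review": {"gravity", "orbits", "rockets"},
    "Sabine Hossenfelder on relativity": {"relativity", "gravity", "physics"},
    "Flat earthers explain gravity": {"gravity", "conspiracy", "flat earth"},
}

print(recommend(history, library))
# The flat-earth clip scores on "gravity" too; it's one tie-break away from
# the top three. Swap the history tags for nastier keywords and the same
# dumb overlap count hands back nastier videos.
```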



  • Between all of the AI hype and the AI panic, my biggest concern is that the laws will be so poorly written that simple algorithms like A* will end up illegal and AI in general will be outlawed. We’ll have a Butlerian Jihad because a bunch of daffy CEOs simply said their machines will replace humans when they can’t. Our children will be forced to drink sapho juice and eat spice so they can work in a server farm made of humans.