• WatDabney@sopuli.xyz · 3 months ago

    Probably an odd take, but this is actually something I sort of like about this timeline.

    I keep getting this amusing mental image of actual people tiptoeing away, giggling and shushing each other, while somewhere in the background the site they used to be on is nothing but corporations showing ads to bots posting to bots.

  • Etterra@lemmy.world · 3 months ago

    Yeah it’s a room full of bots talking to bots while other bots try to scam bots into paying other bots. It’s recursive botshit.

    • pivot_root@lemmy.world · 3 months ago

      Don’t forget that it’s also being used to train more bots. Lotsa bot inbreeding—or inbotting, if you will.

  • Lvxferre@mander.xyz · 3 months ago

    Theory of Reddit in 2024 is becoming surreal to watch:

    The bot problem is not even in its final form. It’s probably way worse already, as they aren’t detecting all bot content, only the blatantly obvious pieces. And eventually the mods are going to say “you know what… fuck it, too much effort” and simply leave the bots alone, leading to a further increase in bot activity, in a vicious circle.

  • Anon518@sh.itjust.works · 3 months ago

    People should stop linking to Reddit. Use an archived version instead. More websites linking to Reddit is why it’s at the top of search results. It’s called “domain authority”.

  • TheDannysaur@lemmy.world · 3 months ago

    This post is both insightful and troubling. Using generative AI services to simulate conversations without explicit disclosure can be seen as unethical. Some might argue that this damages the connection that users can feel towards each other, even in an online community. Such matters should be addressed in order to restore consumer trust in the platform.

    (I wrote that to sound like a GenAI response, how did I do?)

  • I_Has_A_Hat@lemmy.world · 3 months ago

    Even before the API change, reposts were becoming rampant. For a while this wasn’t too bad, as new people got to see and enjoy them. The real issue started when the top-level comments, and the replies to those comments, were all word-for-word identical to the comments from the last time the post went up. A copy of a copy of a copy.

    I am convinced that most of the big subreddits no longer have any real human engagement in the top-level comments. It’s all just bots talking to each other.

    • RandomVideos · 3 months ago

      Wasn’t the problem that the reposts were being made by bots farming karma so they could make fake posts to scam people?

  • Funkwonker@lemmy.world · 3 months ago

    I’m rather certain that a number of the ‘top level’ subreddits are not only aware of the bots on their subs and OK with them, but are intentionally keeping them around to boost activity.

    I kept an account around for a good while after the API changes, until it was permabanned for “report abuse.” I had only reported bots that were stealing comments word for word.