A jailbreak version of ChatGPT is becoming popular with women who prefer it to real world dating.

  • enkers
    824 days ago

    Unless you jailbreak your AI, they’re generally designed to de-escalate any potentially romantic situation, so I’d imagine it’d result in a very platonic friendship where both parties chatter on about nothing.

    • Possibly linux
      123 days ago

      Eventually it would devolve into gibberish, as today’s language models cannot create new content from scratch.