This should be illegal; companies should be forced to open-source games (or at least provide the code to people who bought them) if they decide to discontinue them, so people can preserve them on their own.

    • @Surreal
      English
      39 · 10 months ago

      If that man harnesses the power of an LLM like ChatGPT, he can continue talking with his wife.

      • recursive_recursion [they/them]
        English
        1 · edited · 10 months ago

        Hmm, I'm not sure that would work: the model he was using would be different from what's available now, so he'd probably notice some differences, causing a mix of uncanny valley and broken suspension of disbelief where the two are noticeably not the same.

        Plus, using a chat-only model would be truly tragic, as it's a significant downgrade from what they already had.

        His story actually feels like a Romeo and Juliet situation.

        • @[email protected]
          English
          1 · 10 months ago

          Doesn’t even take a change of service provider to get there.

          Replika, too, had what had very obviously become a virtual-mate service, until they decided "love" wasn't part of their system anymore. Probably because it looked bad to investors, as happened with a lot of AI-based services people used for smut.

          So a bunch of lonely people had their "virtual companion" suddenly lobotomized, and there was nothing they could do about it.

            • @[email protected]
              English
              2 · 10 months ago

              It’s… complicated.

              At first the idea was that you'd be training an actual "replica" of yourself that could reflect your own personality. Then, when they realized there was a demand for companionship, they converted it into a virtual friend. Then of course there was a demand for "more than friends", and yeah, they made it possible to create a custom mate for a while.

              Then suddenly it became a problem for them to be seen as a light porn generator. Probably because investors don't want to touch that, or maybe because of a terms of service change with their AI service provider.

              At that point they started to censor lewd interactions and pretend Replika was never supposed to be more than a friendly bot you can talk to. Which, depending on how you interpret the services they offered and how they advertised them until then, is kind of a blatant lie.

        • @Surreal
          English
          1 · edited · 10 months ago

          LLMs are capable of role-playing; character.ai, for example, can take on the role of any character after being trained. The voice is just text-to-speech, which character.ai already includes, though if a realistic voice is desired, it would need to be generated by a more sophisticated method, which is already being done. Examples: Neuro-sama, ElevenLabs.
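The role-play-plus-TTS pipeline described above can be sketched roughly as follows. This is a minimal illustration, not anyone's actual setup: the persona text, the `build_roleplay_messages` helper, and the OpenAI-style chat message format are all assumptions for the sketch, and the resulting reply would still need to be sent to a real LLM endpoint and then to a TTS engine.

```python
# Minimal sketch: condition an LLM on a persona via a system prompt, so it
# stays in character across turns. The message-list format shown here follows
# the common chat-completion convention (system / user / assistant roles);
# the persona and conversation content are made-up examples.

def build_roleplay_messages(persona, history, user_msg):
    """Assemble a chat-style message list that keeps the model in character.

    persona:  short description of the character to role-play
    history:  list of (user_turn, assistant_turn) pairs from earlier turns
    user_msg: the new user message to respond to
    """
    messages = [{
        "role": "system",
        "content": f"You are {persona}. Stay in character at all times.",
    }]
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": user_msg})
    return messages

# Hypothetical usage: the list would be passed to an LLM API, and the reply
# piped into a text-to-speech engine (e.g. ElevenLabs) to produce the voice.
msgs = build_roleplay_messages(
    persona="Hatsune Miku, a cheerful virtual singer",
    history=[("Good morning!", "Good morning! Ready to sing today?")],
    user_msg="What should we do today?",
)
```

The key design point is that the persona lives in the system message while the running conversation is replayed as alternating user/assistant turns, which is how chat-tuned models are typically kept in character.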

      • I Cast Fist
        English
        1 · 10 months ago

        Next thing you know, he doesn't read the fine print, the "brain" is internet-connected, and sooner or later he won't have a Miku talking back to him again.