• @[email protected]
    link
    fedilink
    English
    -19
    edit-2
    8 days ago

    LLMs reproduce the form of language without any meaning being transmitted. That’s called parroting.

    Even if (and that’s a big if) an AGI is going to be achieved at some point, there will be people calling it parroting by that definition. That’s the Chinese room argument.

      • @[email protected]
        -3
        7 days ago

        Me? How can I move goalposts in a single sentence? We’ve had no previous conversation… And I’m not agreeing with the previous poster either…

        • @[email protected]
          6
          7 days ago

          By entering the discussion, you also engaged with the previous context. The discussion was about LLMs being parrots.

          • @[email protected]
            0
            7 days ago

            And the argument was about whether there’s meaning behind what they generate. That argument applies to AGIs too. It’s a deeply debated philosophical question. What is meaning? Is our own thought pattern deterministic, and if it is, how do we know there’s any meaning behind our own actions?

            • @[email protected]
              3
              7 days ago

              The burden of proof lies on the people making the claims about intelligence. “AI” pundits have supplied nothing but marketing hype.