• GetOffMyLan
    3 hours ago

    I mean they literally do analyze text. They’re great at it. Give one some text and it will analyze it really well. I do it with code at work all the time.

    Because those are two completely different tasks. Asking them to recall information from their training data is a very bad use; asking them to analyze information passed into them is what they’re great at.

    Give it a sample of code and it will very accurately analyze and explain it. Ask it to generate code, though, and the results vary wildly in accuracy.

    I’m not assuming anything; you can literally go and use one right now and see.
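
    For instance, “passing text in” just means putting the code inside the prompt itself. Here’s a minimal sketch using the OpenAI Python client (the model name, prompt wording, and sample snippet are my own illustrative assumptions):

    ```python
    # Minimal sketch: asking a model to analyze code handed to it in the
    # prompt, rather than recall facts from its training data.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    snippet = '''
    def mean(xs):
        return sum(xs) / len(xs)
    '''

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat model works
        messages=[
            {"role": "system", "content": "You are a careful code reviewer."},
            {"role": "user", "content": f"Explain what this does and flag any bugs:\n{snippet}"},
        ],
    )
    print(response.choices[0].message.content)
    # A decent model will reliably spot the ZeroDivisionError on empty
    # input, because everything it needs is right there in the prompt.
    ```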

    • apotheotic (she/her)@beehaw.org
      56 minutes ago

      The person you’re replying to is correct, though. They do not understand, and they do not analyse. They generate (roughly) the most statistically likely answer to your prompt, which may very well end up being text representing an accurate analysis. They might even be incredibly reliable at doing so. But this person is just pushing back against the idea of these models actually understanding or analysing. It’s slightly pedantic, sure, but it’s an important distinction to make in the world of machine intelligence.
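
      To make “most statistically likely” concrete, here is a toy sketch of greedy next-token generation. The vocabulary and probabilities are invented for illustration; real models work over learned distributions with vastly larger contexts:

      ```python
      # Toy sketch: generate text by repeatedly picking the most likely
      # next token given recent context. All numbers here are made up.
      toy_model = {
          ("the",): {"cat": 0.5, "dog": 0.3, "code": 0.2},
          ("the", "cat"): {"sat": 0.6, "ran": 0.4},
          ("cat", "sat"): {"down": 0.7, "quietly": 0.3},
      }

      def next_token(context):
          probs = toy_model.get(tuple(context[-2:]), {})
          if not probs:
              return None  # no known continuation: stop generating
          return max(probs, key=probs.get)  # greedy: most likely token

      tokens = ["the"]
      while (tok := next_token(tokens)) is not None:
          tokens.append(tok)
      print(" ".join(tokens))  # -> "the cat sat down"
      ```

      Nothing in that loop “understands” anything, yet with a rich enough distribution the output can look exactly like analysis.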

      • GetOffMyLan
        3 minutes ago

        I literally quoted the word for that exact reason. It just gets really tiring: whenever you talk about AIs, someone always has to make this point. We all know they don’t think or understand in the same way we do; no one gains anything by it being pointed out constantly.