@[email protected] to Science [email protected]English • 4 months agoHuhsh.itjust.worksimagemessage-square48fedilinkarrow-up1560arrow-down111
arrow-up1549arrow-down1imageHuhsh.itjust.works@[email protected] to Science [email protected]English • 4 months agomessage-square48fedilink
minus-square@[email protected]linkfedilinkEnglish74•4 months agoIt all comes down to the fact that LLMs are not AGI - they have no clue what they’re saying or why or to whom. They have no concept of “context” and as a result have no ability to “know” if they’re giving right info or just hallucinating.
minus-square@[email protected]linkfedilinkEnglish1•4 months agoHey, but if Sam says it might be AGI he might get a trillion dollars so shut it /s