• Veraticus@lib.lgbt (OP) · 2 years ago

    LLMs do not “teach,” and that is why learning from them is dangerous. They synthesize words and return other words, but they do not understand the content presented to them in any sense. Because of this, there is always a chance that they are simply spouting bullshit.

    Learn from them if you like, but remember they are absolutely no substitute for a human, and basically everything they tell you must be checked for correctness.

    • Buttons · 2 years ago

      GPT4 did teach me. I say this as the one who learned, whatever that’s worth.