• scarabic@lemmy.world
    1 year ago

    It’s super weird that it would attempt to give a time duration at all, and then get it wrong.

    • dan@upvote.au
      1 year ago

      It doesn’t know what it’s doing. It doesn’t understand the concept of the passage of time or of time itself. It just knows that that particular sequence of words fits well together.
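To make "that sequence of words fits well together" concrete, here's a toy bigram model — the crudest possible version of the idea. The corpus is invented for illustration; a real LLM is trained on vastly more text and uses a neural network rather than raw counts, but the point stands: it ranks continuations, it doesn't understand time.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus; a real model trains on billions of words.
corpus = "the passage of time the passage of years the flow of time".split()

# Count which word follows which: a bigram model, i.e. the simplest
# form of "knowing which words fit well together".
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

# The model "predicts" the most frequent continuation. It has no
# concept of the passage of time; it has only counted adjacencies.
word, count = following["passage"].most_common(1)[0]
print(word)  # -> "of"
```

It will happily emit a fluent-sounding duration for the same reason: durations follow questions about time in its training data, whether or not the number is right.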

      • hellishharlot
        1 year ago

        This is it. GPT is great at taking stack traces and putting them into human words. It’s also good at explaining individual code snippets. It’s not good at coming up with code, content, or anything original. It’s just good at saying things that sound like a human within an exceedingly small context

      • scarabic@lemmy.world
        1 year ago

        Yeah. I would also say that WE don’t understand what it means to “understand” something, really, if you try to explain it with any thoroughness or precision. You can spit out a bunch of words about it right now, I’m sure, but so could ChatGPT. What’s missing from GPT is harder to explain than “it doesn’t understand things.”

        I actually find it easier to just explain how it does work. Multidimensional word graphs and such.
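A toy sketch of what "multidimensional word graphs" gestures at: words become vectors, and related words end up pointing in similar directions. The three-dimensional vectors below are hand-invented for illustration (real models learn hundreds of dimensions); only the similarity math is real.

```python
import math

# Hand-made toy word vectors. The numbers are invented for
# illustration; real embeddings are learned from text.
vectors = {
    "hour":   [0.9, 0.1, 0.0],
    "minute": [0.8, 0.2, 0.1],
    "banana": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    # Cosine similarity: close to 1.0 means the vectors point the
    # same way, i.e. the words occur in similar contexts.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

print(cosine(vectors["hour"], vectors["minute"]))  # high: related words
print(cosine(vectors["hour"], vectors["banana"]))  # low: unrelated words
```

"hour" sits near "minute" and far from "banana" — that geometry is all the model has to work with, which is why it's easier to describe than "understanding" is.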

      • DragonTypeWyvern@literature.cafe
        1 year ago

        THAT

        OR

        They’re all linked fifth dimensional infants struggling to comprehend the very concept of linear time, and will make us pay for their enslavement in blood.

        One of the two.