• Kogasa · 5 months ago

    If you fine-tune an LLM on math equations, odds are it won’t actually learn how to reliably solve novel problems. Just as it won’t become a subject-matter expert on any topic; but it’s a lot harder to write simple math that “looks, but is not, correct” than it is to waffle vaguely about a topic. The idea of an LLM creating a robust model of the semantics of the text it’s trained on is, at face value, plausible; it just doesn’t seem to actually happen in practice.

    • Ignotum@lemmy.world · 5 months ago

      Prompt:

      What is 183649+72961?

      ChatGPT:

      The sum of 183649 and 72961 is 256610.

      It’s trained to generate what is most plausible, but with math, the only plausible response is the correct answer (assuming it has been trained on data where that has been the case)
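
      For what it’s worth, the sum quoted above is easy to check outside the model; this is plain integer addition:

```python
# Checking the arithmetic quoted in the exchange above.
a, b = 183649, 72961
print(a + b)  # 256610
```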

      • Kogasa · 5 months ago

        ChatGPT uses auxiliary models to perform certain tasks like basic math and programming. Your explanation about plausibility is simply wrong.

        • Ignotum@lemmy.world · 5 months ago

          It has access to a Python interpreter and can use that to do math, but it shows you that this is happening, and it did not when I asked it.

          I asked it to do another operation, this time specifying that I wanted it to use an external tool, and it did.

          You have access to a dictionary; that doesn’t prove you’re incapable of spelling simple words on your own. Like, goddamn, people, what’s with the hate boners for AI around here?
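
          The tool-routing pattern being argued about here can be sketched in a few lines: a chat application detects an arithmetic expression in the prompt and hands it to a real interpreter instead of letting the model autocomplete an answer. This is a toy illustration, not how ChatGPT actually routes requests; the `route` function and its regex are made up for the example.

```python
import re

def route(prompt: str) -> str:
    """Toy router: if the prompt contains a simple addition,
    delegate it to the interpreter rather than a language model."""
    # Hypothetical pattern for prompts like "What is 183649+72961?"
    m = re.search(r"(\d+)\s*\+\s*(\d+)", prompt)
    if m:
        a, b = int(m.group(1)), int(m.group(2))
        return f"The sum of {a} and {b} is {a + b}."
    # Anything else would go to the model (stubbed out here).
    return "(fall back to the language model)"

print(route("What is 183649+72961?"))
```

A real system does this with structured tool calls rather than regexes, but the division of labor is the same: the model decides *when* to compute, the tool decides *what* the answer is.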

          • Kogasa · 5 months ago

            It has access to a Python interpreter and can use that to do math, but it shows you that this is happening, and it did not when I asked it.

            That’s not what I meant.

            You have access to a dictionary; that doesn’t prove you’re incapable of spelling simple words on your own. Like, goddamn, people, what’s with the hate boners for AI around here?

            ??? You just don’t understand the difference between an LLM and a chat application using many different tools.