I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”

This article, in contrast, quotes the folks building the next generation of AI saying the same thing.

  • MajorHavocOP · 4 hours ago

    Are you asserting that chatbots are so fundamentally different from LLMs that “oh shit, we can’t just throw more compute and data at this anymore” doesn’t apply to roughly the same degree?

    • makyo@lemmy.world · 1 hour ago

      I feel like people have been using those terms pretty much interchangeably lately anyway

      • Greg Clarke@lemmy.ca · 41 minutes ago

        People who don’t understand those terms are using them interchangeably

    • Greg Clarke@lemmy.ca · 42 minutes ago

      Yes, of course I’m asserting that. While the performance of LLMs may be plateauing, their cost, context windows, and efficiency are still getting much better. When you chat with a modern chatbot, it isn’t just sending your input to an LLM the way the first public version of ChatGPT did. Nowadays a single chatbot response may involve many LLM requests, along with other techniques (like handing work off to external tools) that mitigate the deficiencies of LLMs. Just ask the free version of ChatGPT a question that requires some calculation and you’ll have a better understanding of what’s going on and the direction of the industry.
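      To make that concrete, here’s a minimal sketch of the pattern in Python. Everything in it is illustrative, not any vendor’s real API: `call_llm` is a stand-in for a model call, and the `CALC:`/`ANSWER:` routing convention is just one possible way a chatbot could decide to hand arithmetic to a deterministic tool instead of trusting the LLM to do math. The point is only that answering a single question can take two or more model requests plus a tool run.

      ```python
      import ast
      import operator

      # Whitelisted arithmetic operators for the "calculator tool".
      _OPS = {
          ast.Add: operator.add,
          ast.Sub: operator.sub,
          ast.Mult: operator.mul,
          ast.Div: operator.truediv,
          ast.Pow: operator.pow,
          ast.USub: operator.neg,
      }

      def calculator(expression: str) -> float:
          """Deterministically evaluate arithmetic the LLM shouldn't do in-weights."""
          def _eval(node: ast.AST) -> float:
              if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                  return node.value
              if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                  return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
              if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
                  return _OPS[type(node.op)](_eval(node.operand))
              raise ValueError("unsupported expression")
          return _eval(ast.parse(expression, mode="eval").body)

      def call_llm(prompt: str) -> str:
          # Placeholder: wire up whatever model/provider you actually use.
          raise NotImplementedError

      def respond(user_message: str) -> str:
          # LLM request #1: route the query instead of answering it directly.
          plan = call_llm(
              "If the question needs arithmetic, reply 'CALC:<expression>'; "
              "otherwise reply 'ANSWER:<text>'.\nQuestion: " + user_message
          )
          if plan.startswith("CALC:"):
              result = calculator(plan[len("CALC:"):].strip())
              # LLM request #2: phrase the exact tool result for the user.
              return call_llm(
                  f"Using the computed value {result}, answer: {user_message}"
              )
          return plan[len("ANSWER:"):].strip()

      # The tool itself runs without any model:
      print(calculator("12 * (3 + 4) - 5 ** 2"))  # 59
      ```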