• Lembot_0006 · 13 days ago

    That’s logical. If I wanted an LLM-powered search (sometimes I search for things I’m not even sure exist), I’d use an LLM. If I know the exact terminology, I’d use a standard search engine, expecting more precise results.

    I suppose other people do the same.

    • eatCasserole@lemmy.world · 13 days ago

      An LLM might not be a great way to search for things that may or may not exist…I’ve seen them generate entirely fictional, confident-sounding responses describing things that absolutely do not exist.

      • Riskable · 13 days ago

        It’s the difference between “asking the LLM” and “using the LLM” (as an agent). If you prompt an LLM with a question, it’ll try to answer from its own training (which could contain hallucinations). If you ask an LLM to search the Internet on your behalf, that’s actually quite useful.
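        The distinction above can be sketched in Python. Everything here is hypothetical and for illustration only: the function names are made up, and the fake in-memory “search index” stands in for a real LLM API and a real search backend. The point is the shape of the two flows, not any particular library.

```python
# Sketch of "asking the LLM" vs. "using the LLM" as a search agent.
# All names are hypothetical; a real setup would call an actual LLM API
# and a real web-search API instead of this toy index.

def web_search(query):
    """Stand-in for a real search API: returns (title, snippet) pairs."""
    fake_index = {
        "rust async runtime": [("tokio", "An asynchronous runtime for Rust")],
    }
    return fake_index.get(query, [])

def answer_from_training(question):
    # "Asking the LLM": the answer comes from model weights alone,
    # so it may be a confident-sounding hallucination.
    return "Plausible-sounding answer (unverified)."

def answer_with_search(question, query):
    # "Using the LLM": the model issues a search, then grounds its
    # answer in the returned snippets, which the user can verify.
    results = web_search(query)
    if not results:
        return "No sources found; declining to guess."
    sources = "; ".join(f"{title}: {snippet}" for title, snippet in results)
    return f"Based on search results ({sources})."

print(answer_from_training("Is there an async runtime for Rust?"))
print(answer_with_search("Is there an async runtime for Rust?",
                         "rust async runtime"))
```

        The agentic path can still mislead, but it fails loudly (no sources found) instead of inventing something, and every claim it does make points back at a checkable snippet.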

      • Lembot_0006 · 13 days ago

        I use ChatGPT regularly to figure out whether some library or engine has this or that functionality. Yes, 10–15% of the answers are straightforward lies, but I’m a specialist, so I don’t have much trouble filtering out the garbage.

        Typical Internet forums provide garbage too, at slightly lower rates, but still.

        Nothing can be trusted nowadays, on any topic. Any data should be checked.

        • eatCasserole@lemmy.world · 13 days ago

          Well, if 10–15% lies is useful to you, cool, but it’s definitely not a practice I’d recommend to anyone.