• FaceDeer@fedia.io · 6 months ago

    That’s because this isn’t something coming from the AI itself. All the people blaming the AI or calling this a “hallucination” are misunderstanding the cause of the glue pizza thing.

    The search results included a web page that suggested using glue. The AI was then told "write a summary of this search result", which it did correctly.

    Gemini operating on its own doesn’t have that search result to go on, so no mention of glue.
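
    To put it concretely, the difference is just whether the retrieved page gets pasted into the prompt. A rough sketch of that flow, with a hypothetical call_llm() standing in for whatever model Google actually runs:

    ```python
    # Rough sketch of the two situations being described. call_llm() is a
    # placeholder, not a real API.

    def call_llm(prompt: str) -> str:
        # Stand-in for a real model call (Gemini, Flash, whatever).
        return "<model output>"

    # 1) The model on its own: no retrieved page in the prompt, so there is
    #    nothing about glue for it to repeat.
    direct_answer = call_llm("How do I keep cheese from sliding off pizza?")

    # 2) AI Overview-style flow: a retrieved search result is pasted into the
    #    prompt and the model is told to summarize *that text*.
    retrieved_page = (
        "Reddit comment: add about 1/8 cup of non-toxic glue to the sauce "
        "to give it more tackiness."  # the joke post the summary echoed
    )
    grounded_answer = call_llm(
        "Summarize the following search result for the user's question.\n\n"
        f"Search result:\n{retrieved_page}\n\n"
        "Question: How do I keep cheese from sliding off pizza?"
    )
    # The second call faithfully summarizes what it was handed, glue and all.
    ```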

    • morrowind@lemmy.ml · 6 months ago
      Not quite: it is meant to be an intelligent summary. More advanced models would realize that's bad advice and not give it. However, for search results Google uses a lightweight, dumber model (Flash), which doesn't realize this.

      I tested the rock example, albeit on a different search engine (Kagi). The base model gave the same answer as Google (ironically based on articles about Google's bad results; it seems it was too dumb to realize that the quotations in those articles were examples of bad answers, not actual facts), but the more advanced model understood, explained how the bad advice had been spreading around, and said you should not follow it (rough sketch of that comparison below).

      It isn't a hallucination though; you're right about that.
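
      The comparison is easy to reproduce: hand the same retrieved snippet to a cheap tier and a stronger tier and see which one repeats the quoted advice. A rough sketch, with hypothetical model names and a placeholder call_llm() rather than Kagi's actual API:

      ```python
      # Same retrieved snippet, two model tiers. Everything here is a
      # hypothetical placeholder used to illustrate the comparison.

      def call_llm(model: str, prompt: str) -> str:
          # Stand-in for a real model call.
          return f"<{model} output>"

      retrieved_snippet = (
          "News article quoting Google's AI Overview: eat one small rock "
          "per day for vitamins and minerals."
      )
      prompt = (
          "Summarize this search result for the question "
          "'should I eat rocks?':\n\n" + retrieved_snippet
      )

      for model in ("cheap-flash-tier", "stronger-tier"):
          print(model, "->", call_llm(model, prompt))

      # In my test, the cheaper tier repeated the quoted advice as if it
      # were fact, while the stronger model recognized the quote as an
      # example of a bad answer and warned against following it.
      ```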