• Etterra@lemmy.world

    That’s why you always get it from their website. Never trust a LLM to do a search engine’s job.

    • JackbyDev

      Respectfully, this is victim blaming. Criticize Google, not end users.

      • Buddahriffic@lemmy.world

        Wait, are you advocating people blindly trust unreliable sources and then get angry at the unreliable source when it turns out to be unreliable rather than learn from shit like this to avoid becoming a victim?

        • A7thStone@lemmy.world

Google has spent a fortune convincing people that it is a reliable source. This is clearly on Google, not on people who aren’t tech savvy.

          • Buddahriffic@lemmy.world

Ok, I agree that Google isn’t a good guy in this situation, but that doesn’t make the advice to not just trust what Google says invalid. Nor does that advice absolve Google of its accidental or deliberate inaccuracies.

It was just an “In case you didn’t know, don’t just trust Google, even though they’ve worked so hard at building a reputation of being trustworthy and even seemed pretty trustworthy in the past. Get the phone number from the company’s website.”

And then I’ll add on: regardless of where you got the phone number, be skeptical if someone asks for your banking information or other personal details that aren’t usually involved in such a service. Not because you’d be the bad guy if you did get scammed, but because you want to avoid going through it: at best it’s a pain in the ass to deal with, and at worst a financially horrible situation if you’re unable to get it reversed.

        • JackbyDev

          are you advocating people blindly trust unreliable sources

          Where did I say this? I didn’t say this. You said I said this.

          • Buddahriffic@lemmy.world

I don’t see any blaming of anyone in the original comment you replied to, just general advice on avoiding a scam like this. There isn’t even a victim in this case, because the request for banking info tipped them off, if I’m understanding the OP correctly.

So I’m confused about what specifically you’re objecting to in that comment, unless it’s the general idea that you shouldn’t blindly trust results from Google’s LLM, which isn’t known for its reliability.

            • JackbyDev

For me, it’s the idea of focusing at all on telling people not to trust LLMs, as opposed to criticizing the companies that put them prominently at the top of the page.

                • JackbyDev

Because the average person doesn’t even know what an LLM is, or what the acronym stands for, and putting a misinformation generator at the top of search pages is irresponsible.

Like, if something is so unreliable that you have to say “don’t trust what this thing says,” but you still put it at the top of the page? Come on… It’s like putting a self-destruct button in a car and telling people “well, the label says not to push it!”

                  • Buddahriffic@lemmy.world

We don’t control what Google puts on their search page. Ideally, yeah, they wouldn’t push their LLM out to where it’s the first thing people see who don’t understand that it isn’t reliable. But we live in a reality where they did put it at the top of their search page, and where they likely don’t even care what we think of that. Their interests and everyone else’s don’t necessarily align.

That comment was advice for readers who haven’t yet realized how unreliable it is; it has nothing to do with blaming the average person. I’m still confused as to why you have such an issue with it being said at all. Based on what you’ve been saying, I think you’d agree that Google is being either negligent or malicious here. So saying they shouldn’t be trusted seems like common sense, but your first comment acts like it’s just being mean to anyone who has trusted it or something?

      • capital@lemmy.world

Remember when 4chan got people to microwave their phones by convincing them it would charge the battery?

        If calling those people stupid is victim blaming then so be it. I’m blaming the victim.

This case isn’t as clear-cut as that, but even before the AI mania, the instant answer at the top of Google results was frequently incorrect. Being able to discern BS from real results has always been necessary, and AI doesn’t change that.

I’ve been using Kagi this year, and it keeps LLM results out of the way unless I ask for them. When you open their AI assistant, it says:

        Assistant can make mistakes. Think for yourself when using it.

        I think that sums it up nicely.