Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

  • Elting@piefed.social · 1 month ago

    So in order to get decent medical advice from an LLM, you just need to be a doctor and tell it what's wrong with you.

    • tyler · 1 month ago

      Yes, that was the conclusion.