Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

  • Digit@lemmy.wtf
    21 hours ago

With another LLM, turtles all the way down. ;D

    Or for a more serious answer… improve your skills, scrutinise what they produce.

      • Digit@lemmy.wtf
        20 hours ago

I have Lex Fridman’s interview with [OpenClawD’s] Peter Steinberger paused (to watch the rest after lunch), shortly after he mentioned something similar, about how he’s really only diffing now. The one manual tool left, keeping the human in the loop. n_n

        Long live diff!

        :D