• unexposedhazard@discuss.tchncs.de

It's kind of an obvious outcome if you combine the concept of automation bias with the hallucinations produced by LLMs. They really are perfect misinformation and confusion machines. Even calling them "hallucinations" already attributes more intelligence and agency to LLMs than they deserve.