It's kind of an obvious outcome if you combine the concept of automation bias with the hallucinations produced by LLMs. They really are perfect misinformation and confusion machines. Even calling these outputs "hallucinations" already attributes more intelligence and agency to LLMs than they deserve.