

“We show that by exploiting the physics of specular reflection, an adversary can inject phantom obstacles or erase real ones using only inexpensive mirrors,” the researchers wrote in a paper submitted to the journal Computers & Security.
“Experiments on a full AV platform, with commercial-grade LIDAR and the Autoware stack, demonstrate that these are practical threats capable of triggering critical safety failures, such as abrupt emergency braking and failure to yield.”
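The paper's experimental setup is more involved than this, but the core geometry is simple enough to sketch. A toy example of my own (not from the paper): a lidar pulse that bounces off a flat mirror and then hits an object travels the same total distance as a straight shot to the object's mirror image, so the sensor registers a phantom point at that mirrored location. The coordinates and the 45° mirror placement below are made up for illustration.

```python
# Toy sketch (not from the paper): how a flat mirror relocates a lidar return.
# A pulse reflected off a mirror onto a roadside object travels the same total
# distance as a straight line to the object's mirror image, so the sensor
# perceives a phantom obstacle at the mirrored position.
import numpy as np

def mirror_image(point, mirror_point, mirror_normal):
    """Reflect `point` across the plane through `mirror_point` with normal `mirror_normal`."""
    n = mirror_normal / np.linalg.norm(mirror_normal)
    return point - 2 * np.dot(point - mirror_point, n) * n

sensor = np.array([0.0, 0.0])           # lidar at the origin, looking along +x
mirror_pt = np.array([3.0, 0.0])        # mirror 3 m ahead, tilted 45 degrees
mirror_n = np.array([-1.0, 1.0])        # plane normal for the 45-degree tilt
pedestrian = np.array([3.0, 2.0])       # real object 2 m off to the side

# Where the sensor "sees" the pedestrian: at its mirror image, dead ahead.
phantom = mirror_image(pedestrian, mirror_pt, mirror_n)
print(phantom)  # [5. 0.] -- a phantom obstacle directly in the lane, 5 m out

# Sanity check: the folded path (sensor -> mirror -> pedestrian) has the same
# length as the straight line to the phantom, which is what the lidar measures.
folded = np.linalg.norm(mirror_pt - sensor) + np.linalg.norm(pedestrian - mirror_pt)
direct = np.linalg.norm(phantom - sensor)
print(folded, direct)  # 5.0 5.0
```

The same geometry run in reverse is the erasure case: returns from a real obstacle get redirected away from the sensor, so the object simply vanishes from the point cloud.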
I’d be fooled at first, too, and suspicious (who’s fucking around with mirrors on the road?), but I’d probably figure it out after a second.
My main concern is that people could use these kinds of exploits to “jailbreak” robo-cars (or whatever we’re calling them) into behaving in dangerous ways in real traffic.
Those are definitely words.