

With a consumer product like this, you are responsible: you are supposed to be paying attention at all times and be ready to take over.
It is completely acceptable that it does not function perfectly in every scenario, and that something like a fake wall put on the road causes issues; that is why you need to pay attention.
There is nothing to recall about this situation.
If the car is failing at things it shouldn't be, like both Tesla and Waymo failing to properly stop for school buses while in autonomous mode, that does require an update. Although I've seen zero reports of an autonomous Tesla doing this yet, only supervised ones.
A Tesla not stopping for a school bus in supervised mode is acceptable, though, because the driver is responsible for stopping.
Edit: and note, a problem like the school buses is a visual-understanding problem. Lidar won't help with that kind of problem.
Edit: and sorry, to be clear, it is hardware still on the road, but I'm saying it's acceptable that that hardware does it because it's not autonomous. If the newer hardware running without supervision were doing it, that's another story.

Ya, hardware that is on the road that won't ever be autonomous without upgraded hardware and software, because it's insufficient for autonomy, but this has been shown to not be a problem on the latest autonomous versions.