In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.
If it can actually sense that a crash is imminent, why wouldn’t it be programmed to slam the brakes instead of just turning off?
Do they have a problem with false positives?
If it were European-made, it would slam the brakes or swerve to at least try to save lives, since governments there attempt to regulate companies not to do evil shit. Since it’s American-made, it’s designed to maximise profit for shareholders.
I don’t believe automatic swerving is a good idea, depending on what’s off to the side it has the potential to make a bad situation much worse.
I’m thinking like, a kid runs into the street, the car swerves and mows down a crowd on the sidewalk.
It’s the car’s job to swerve into a less dangerous place.
Can’t do that? Oops, no self-driving for you.
I’ve been wondering this for years now. Do we need intelligence in crashes, or do we just need vehicles to stop? I think you’re right, it must have been slamming the brakes at unexpected times, which I’m sure is unnerving while driving.