• @[email protected]
    link
    fedilink
    English
    62 months ago

    It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.

    I mean, what’s the situation where a car can’t brake but still has enough control that it HAS to kill a pedestrian in order to save the passengers?

    • MeanEYE • 2 points • 2 months ago

      Tesla on Autopilot at night. All the time, basically. There have been a number of motorcycle deaths where a Tesla just mowed the rider down. The reason? The bike had two tail lights side by side instead of one big light, so the Tesla read them as a car far away and drove right through the rider. The rough geometry sketch below shows why that misread happens.
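
      For intuition, here’s a back-of-the-envelope sketch of that geometry (the spacing and distance numbers are my own rough assumptions, not anything from Tesla): two closely spaced lights at short range subtend the same angle as two wide-set lights far away, so a system keying on tail-light spacing alone can’t tell a near motorcycle from a distant car.

      ```python
      import math

      def taillight_angle(spacing_m: float, distance_m: float) -> float:
          """Angle (radians) subtended by a pair of tail lights."""
          return 2 * math.atan((spacing_m / 2) / distance_m)

      def implied_distance(angle_rad: float, assumed_spacing_m: float) -> float:
          """Distance implied by an observed angle, if the lights are
          assumed to belong to a vehicle with the given spacing."""
          return (assumed_spacing_m / 2) / math.tan(angle_rad / 2)

      MOTORCYCLE_SPACING = 0.25  # metres between twin tail lights (assumed)
      CAR_SPACING = 1.50         # typical car tail-light spacing (assumed)

      actual_distance = 25.0  # motorcycle 25 m ahead
      angle = taillight_angle(MOTORCYCLE_SPACING, actual_distance)

      # Interpreting the same two lights as a car makes them look ~6x
      # farther away than the motorcycle actually is.
      print(f"Implied distance as a car: {implied_distance(angle, CAR_SPACING):.0f} m")
      # -> Implied distance as a car: 150 m
      ```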

      • @[email protected]
        link
        fedilink
        English
        32 months ago

        That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time; the issue was that the car didn’t know what was happening. That situation wouldn’t trigger the kind of passenger-versus-pedestrian decision we’re discussing.

        As an aside, if what you said is true, people at Tesla should be in jail. WTF

        • MeanEYE • 1 point • 2 months ago

          Tesla washes its hands of any wrongdoing with terms of use where the owner agrees that they’re responsible, blah blah blah.

          Here’s a related video.