• BastingChemina@slrpnk.net
    link
    fedilink
    English
    arrow-up
    36
    ·
    edit-2
    8 days ago

    The recent Not Just Bikes video about self-driving cars is really good on this subject, and very dystopian.

  • WoodScientist@lemmy.world
    link
    fedilink
    English
    arrow-up
    41
    arrow-down
    7
    ·
    8 days ago

    People, and especially journalists, need to get this idea of robots as perfectly logical computer code out of their heads. These aren’t Asimov’s robots we’re dealing with. Journalists still cling to the idea that all computers are hard-coded. You still sometimes see people navel-gazing on self-driving cars, working the trolley problem. “Should a car veer into oncoming traffic to avoid hitting a child crossing the road?” The authors imagine that the creators of these machines hand-code every scenario, like a long series of if statements.

    But that’s just not how these things are made. They are not programmed; they are trained. In the case of self-driving cars, they are fed a huge pile of video footage and radar recordings, along with the driver inputs that accompanied those conditions, and the model is trained to map the camera and radar inputs to whatever the human drivers did in response.
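
    For anyone curious what that actually looks like, here’s a minimal behavior-cloning sketch in Python/PyTorch. It’s an illustration of the general technique, not Waymo’s actual pipeline; the network shape, feature counts, and data are all made up:

    import torch
    import torch.nn as nn

    # Toy "behavior cloning": learn to map sensor features to the controls
    # a human driver used in the same situation.
    class DrivingPolicy(nn.Module):
        def __init__(self, n_sensor_features: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_sensor_features, 64),
                nn.ReLU(),
                nn.Linear(64, 2),  # outputs: [steering, braking]
            )

        def forward(self, sensors: torch.Tensor) -> torch.Tensor:
            return self.net(sensors)

    # Fake logged data: sensor snapshots and what the human driver did.
    sensor_log = torch.randn(1000, 32)      # 1000 moments, 32 features each
    human_controls = torch.randn(1000, 2)   # the steering and braking the human applied

    policy = DrivingPolicy(n_sensor_features=32)
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for epoch in range(10):
        predicted = policy(sensor_log)
        loss = loss_fn(predicted, human_controls)  # "do what the human did"
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    Note that nothing in there mentions crosswalks. Whatever the humans in the logs did at crosswalks is exactly what the model learns to do.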

    This behavior isn’t at all surprising. Self-driving cars, like any similar AI system, are not hard-coded, coldly logical machines. They are trained on us, on our responses, and they exhibit all of the mistakes and errors we make. The reason Waymo cars don’t stop at crosswalks is that human drivers don’t stop at crosswalks. The machine is simply copying us.

    • SkybreakerEngineer@lemmy.world
      link
      fedilink
      English
      arrow-up
      34
      ·
      8 days ago

      All of which takes you back to the headline, “Waymo trains its cars to not stop at crosswalks”. The company controls the input, it needs to be responsible for the results.

      • FireRetardant@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        ·
        8 days ago

        Some of these self-driving car companies have successfully lobbied to stop cities from ticketing their vehicles for traffic infractions. They keep stating that these cars are so much better than human drivers, yet they won’t stand behind that statement; instead they demand special rules for themselves and no consequences.

    • tiramichu@lemm.ee
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      1
      ·
      edit-2
      8 days ago

      I think the reason non-tech people find this so difficult to comprehend is a poor understanding of which problems are easy for (classically programmed) computers to solve and which are hard.

      if ( person_at_crossing ) then { stop }
      

      To the layperson it makes sense that self-driving cars should be programmed this way. After all, this is a trivial problem for a human to solve. Just look, and if there is a person, you stop. Easy peasy.

      But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?
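
      To make that concrete: the perception stack doesn’t hand you a clean person_at_crossing boolean at all, it hands you noisy guesses. A made-up Python sketch of the kind of thing you actually have to work with (the class names, scores and thresholds are invented for illustration):

      # What the naive mental model expects:
      #   if person_at_crossing: stop()
      #
      # What you actually get: a list of noisy detections, each with a guessed
      # class, a confidence score, and an estimated position.
      detections = [
          {"cls": "pedestrian", "confidence": 0.62, "distance_to_crosswalk_m": 1.8},
          {"cls": "cyclist",    "confidence": 0.31, "distance_to_crosswalk_m": 1.8},
          {"cls": "pole",       "confidence": 0.07, "distance_to_crosswalk_m": 0.2},
      ]

      CONFIDENCE_THRESHOLD = 0.5   # below this, does it even count as a person?
      NEARBY_METERS = 2.0          # how close counts as "at" the crossing?

      def person_at_crossing(detections) -> bool:
          # Every constant here is a judgment call, and every judgment call is a
          # way to be wrong: miss a real pedestrian, or slam the brakes for a pole.
          return any(
              d["cls"] == "pedestrian"
              and d["confidence"] >= CONFIDENCE_THRESHOLD
              and d["distance_to_crosswalk_m"] <= NEARBY_METERS
              for d in detections
          )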

      To me it’s this disconnect between the common understanding of computer capability and the reality that causes the misconception.

      • Starbuck@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        8 days ago

        I think you could liken it to training a young driver who doesn’t share a language with you. You can demonstrate the behavior you want once or twice, but unless all of the observations demonstrate the behavior you want, you can’t say “yes, we specifically told it to do that.”

      • Iceblade@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        8 days ago

        The difference is that humans (usually) come with empathy, or at least self-preservation, built in. With self-driving cars we aren’t building in empathy or self- (or at least passenger-) preservation; we’re hard-coding in scenarios where the law says they have to do X or Y.
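
        In code terms, that “hard-coding” is usually described as a rule layer sitting on top of whatever the learned policy wants to do. A hypothetical sketch (every name here is invented; this is just the shape of the idea):

        def crosswalk_rule(sensors: dict, controls: dict) -> dict:
            # Hard-coded legal override: if a pedestrian is in the crosswalk,
            # force full braking regardless of what the learned policy proposed.
            if sensors.get("pedestrian_in_crosswalk"):
                return {**controls, "brake": 1.0}
            return controls

        def safe_controls(learned_policy, sensors: dict, law_checks) -> dict:
            # The learned policy proposes; the hard-coded checks get the veto.
            proposed = learned_policy(sensors)
            for check in law_checks:
                proposed = check(sensors, proposed)
            return proposed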

      • snooggums@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        8 days ago

        But for a computer, how do you know? What is a ‘person’? What is a ‘crossing’? How do we know if the person is ‘at/on’ the crossing as opposed to simply near it or passing by?

        Most crosswalks are marked. The vehicle is able to identify obstructions in the road, and things beside the road that are moving toward it, just like it handles cross-street traffic.

        If (thing) is crossing the street, stop. If (thing) is stationary near a marked crosswalk, stop, and if it doesn’t start moving within a reasonable number of seconds, go.

        You know, the same way people are supposed to handle the same situation.
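
        Spelled out, that rule of thumb would look something like the toy sketch below. To be clear, this is my own illustration of the logic, not anything Waymo actually runs, and it quietly assumes the genuinely hard part (reliably knowing what the “thing” is and what it’s doing) is already solved:

        WAIT_LIMIT_S = 3.0   # arbitrary "reasonable amount of time"

        def decide(thing, near_marked_crosswalk: bool, seconds_stopped: float) -> str:
            # `thing` is a hypothetical tracked object that can report its own
            # state -- which is exactly what the perception system struggles with.
            if thing.is_crossing_street():
                return "stop"
            if near_marked_crosswalk and thing.is_stationary():
                # Stop first; only proceed once they've had a fair chance to go.
                return "go" if seconds_stopped >= WAIT_LIMIT_S else "stop"
            return "go"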

        • hissing meerkat@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          4
          ·
          8 days ago

          Most crosswalks in the US are not marked, and in all places I’m familiar with, vehicles must stop or yield to pedestrians at unmarked crosswalks.

          At unmarked crosswalks, and at marked but uncontrolled ones, we have to handle the situation with social cues: which direction does the pedestrian want to cross the street/road/highway, and will they feel safer crossing after a vehicle has passed rather than before (almost always true for homeless pedestrians, and frequently for pedestrians in moderate traffic)?

          If Waymo can’t figure out whether something intends, or is likely, to enter the roadway, it can’t drive a car. That could be a person at a crosswalk, a person crossing somewhere other than a crosswalk, a blind pedestrian crossing anywhere, a deaf and blind pedestrian crossing even at a controlled intersection, kids or wildlife or livestock running toward the road, etc.

          • snooggums@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            8 days ago

            Person, dog, cat, rolling cart, bicycle, etc.

            If the car is smart enough to recognize a stationary stop sign, then it should be able to ignore a permanently mounted crosswalk sign or indicator light at a crosswalk, excluding those from the set of things that might move into the street. Or it could just stop and wait a couple of seconds if it isn’t sure.

            • Dragon Rider (drag)@lemmy.nz
              link
              fedilink
              English
              arrow-up
              1
              ·
              8 days ago

              A woman was killed by a self-driving car because she was walking her bicycle across the road. The car hadn’t been programmed to understand what a person walking a bicycle is. Its AI kept switching between classifying her as a pedestrian, a cyclist, and “unknown”. It couldn’t tell whether to slow down, and then it hit her. The engineers forgot to add a category, and someone died.
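
              That was the 2018 Uber crash in Tempe, and the NTSB report described roughly this failure mode: every time the classification flipped, the system discarded the object’s tracking history, so it never built up a motion estimate and never predicted that her path crossed the car’s. A stripped-down toy illustration of that mechanism (my own code, not Uber’s):

              # Toy tracker that throws away its motion history whenever the
              # object's classification changes.
              class NaiveTracker:
                  def __init__(self):
                      self.label = None
                      self.positions = []      # history used to estimate motion

                  def update(self, label, position):
                      if label != self.label:  # "pedestrian" -> "cyclist" -> "unknown"...
                          self.positions = []  # history discarded on every flip
                          self.label = label
                      self.positions.append(position)

                  def estimated_speed(self):
                      if len(self.positions) < 2:
                          return None          # not enough history to predict a path
                      return self.positions[-1] - self.positions[-2]

              tracker = NaiveTracker()
              frames = [("pedestrian", 0.0), ("cyclist", 0.5), ("unknown", 1.0), ("cyclist", 1.5)]
              for label, pos in frames:
                  tracker.update(label, pos)
                  print(label, tracker.estimated_speed())  # prints None on every frame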

    • snooggums@lemmy.world
      link
      fedilink
      English
      arrow-up
      11
      ·
      8 days ago

      The machine can still be trained to actually stop at crosswalks, the same way it is trained not to collide with other cars even though people do that.
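
      Mechanically that’s just a question of what gets penalized during training. One hedged sketch of the idea (not Waymo’s actual objective, which isn’t public): alongside the usual “imitate the human” loss, add a term that punishes the model whenever its output would break a known rule, even if the human in the log broke it too.

      def imitation_loss(predicted, human):
          # Standard "copy the human driver" term (mean squared error).
          return sum((p - h) ** 2 for p, h in zip(predicted, human)) / len(human)

      def crosswalk_penalty(predicted, pedestrian_waiting: bool) -> float:
          # Hypothetical extra term: if a pedestrian is waiting and the predicted
          # braking (predicted[1]) is weak, the model gets punished -- regardless
          # of what the human in the training log did.
          return max(0.0, 1.0 - predicted[1]) if pedestrian_waiting else 0.0

      RULE_WEIGHT = 10.0   # how much we care about the rule relative to imitation

      def training_loss(predicted, human, pedestrian_waiting: bool) -> float:
          return imitation_loss(predicted, human) + RULE_WEIGHT * crosswalk_penalty(predicted, pedestrian_waiting)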

    • Wolf314159@startrek.website
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      1
      ·
      8 days ago

      Whether you call it programming or training, the designers still designed a car that doesn’t obey traffic laws.

      People need to get it out of their heads that AI is some kind of magical monkey-see-monkey-do. AI isn’t magic; it’s just a statistical model. Garbage in = garbage out. If the machine fails because it’s only copying us, that’s not the machine’s fault, not AI’s fault, not our fault; it’s the programmer’s fault. It’s fundamentally no different than if they had designed a complicated set of logical rules to follow. Training a statistical model is programming.
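
      And it’s easy to show how unmagical that is. A toy “model” that just counts what the drivers in its (made-up) training data did at crosswalks “learns” to blow through them, because that’s what the data says:

      from collections import Counter

      # Made-up training log: what human drivers did when a pedestrian was waiting.
      training_examples = ["drive_through"] * 80 + ["stop"] * 20

      model = Counter(training_examples)          # the entire "statistical model"
      learned_behavior = model.most_common(1)[0][0]

      print(learned_behavior)   # "drive_through" -- garbage in, garbage out

      Swap the data and the “behavior” changes. That choice of data is a design decision, and it belongs to the people doing the training.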

      Your whole “explanation” sounds like a tech-bro capitalist news-conference sound bite released by a corporation to avoid blame for running down a child in a crosswalk.

    • Justin@lemmy.jlh.name
      link
      fedilink
      English
      arrow-up
      4
      ·
      8 days ago

      It’s telling that Tesla and Google, together worth over 3 trillion dollars, haven’t been able to solve these issues.

  • acargitz@lemmy.ca
    link
    fedilink
    English
    arrow-up
    26
    ·
    edit-2
    8 days ago

    I’m sure a strong legal case can be made here.

    An individual driver breaking the law is bad enough, but the legal system can be “flexible” there, because it’s hard to enforce the law against a generalized (bad) social norm, and each individual lawbreaker can argue their individual case, etc.

    But a company systematically breaking the law on purpose is different. Scale matters here. There are no individualized circumstances, and no crying to a judge that the fine will leave this single mother unable to pay rent this month. This is systematic and premeditated. Inexcusable in every way.

    Like, a single cook forgetting to wash their hands once after going to the bathroom is gross, but a franchise chain building a business model around adding small quantities of poop to the food is indefensible.

    • Cris@lemmy.world
      link
      fedilink
      English
      arrow-up
      30
      ·
      8 days ago

      Yeah, it increasingly feels like the “things” is just “people”, whatever the context.

        • SlopppyEngineer@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          8 days ago

          Only for hitting gold-tier insurance members or above. And our platinum members automatically get absolute priority in traffic: every autonomous vehicle will yield and let you through like Moses through the Red Sea, so call now for that upgrade.

  • Skvlp@lemm.ee
    link
    fedilink
    English
    arrow-up
    17
    arrow-down
    1
    ·
    8 days ago

    From an Alphabet subsidiary I wouldn’t expect anything less, really.

  • Kuinox@lemmy.world
    link
    fedilink
    English
    arrow-up
    19
    arrow-down
    4
    ·
    edit-2
    8 days ago

    It’s interesting how Waymo gets more articles against it than Tesla does.
    There is a targeted campaign against Waymo.
    How can I not think the journalist is acting in bad faith when he complains that the Waymo doesn’t stop… in a situation where he could end up under another car?

    As a European, when I see this video the problem isn’t the automated cars, but the fact that cars are allowed to go this fast on a road without a traffic light to protect the pedestrians.

    Edit:

    Waymo admitted that it follows “social norms” rather than laws.

    The reason is likely to compete with Uber. 🤦

    They used to slow traffic down too much, and there was a campaign against them about how much they slowed traffic, all for respecting the law.

    • FarceOfWill@infosec.pub
      link
      fedilink
      English
      arrow-up
      9
      ·
      8 days ago

      Waymo is running driverless (or at least remotely monitored) taxis all over SF. That’s why they’re getting headlines: they’re out and being used at scale.

      • Kuinox@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        8 days ago

        Teslas are also used at scale, while being sold as “Autopilot” (which it isn’t).

        • AA5B@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          8 days ago

          Teslas cannot legally be driverless. Whatever happens is the driver’s fault.

          But yes, I remember the same thing: being amazed that they intentionally don’t come to a complete stop at stop signs. That may be how the world works these days, but it shouldn’t.

          • Kuinox@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            8 days ago

            AFAIK the Waymos have always respected stop signs, and Tesla had to remove its rolling-stop feature.

    • Miles O'Brien@startrek.website
      link
      fedilink
      English
      arrow-up
      3
      ·
      8 days ago

      “The squeaky hinge gets the grease”

      “The nail that sticks out farthest gets the hammer first”

      These are metaphors to say “since Waymo is the one doing things like driverless taxis all over a city, they’re getting news stories and social media posts”

      Yes, things like Tesla suck too. But Tesla isn’t operating a “driverless taxi” service. Yet.

      I’m sure that, with Waymo advertised as “driverless,” Tesla’s owner gets pissy about it and probably feeds the negative press against them, but that doesn’t excuse what Waymo does.

      • Kuinox@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        8 days ago

        Tesla sells “Autopilot” and makes consumers think they don’t need to drive.
        Waymo hasn’t caused any deaths yet, and in every piece of media I’ve seen that wasn’t a hit piece against Waymo, its cars behaved extremely well compared to the average driver.

  • Blackout@fedia.io
    link
    fedilink
    arrow-up
    8
    ·
    8 days ago

    Pedestrians have had it too easy long enough. If elected President I will remove the sidewalks and install moats filled with alligators and sharks, with loose 2x4s to cross them. Trained snipers will be watching every crosswalk, so if you want a shot at making it, remember to serpentine. This is Ford™ country.

  • thann@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    4
    ·
    8 days ago

    How do you admit to intentionally ignoring traffic laws and not get instantly shut down by the NTSB?

  • RandomVideos
    link
    fedilink
    English
    arrow-up
    3
    ·
    8 days ago

    I read RoboTaxis as RobotAxis and wondered what a mechanical version of the losers of WW2 had to do with cars

  • MrFappy@lemmy.world
    link
    fedilink
    English
    arrow-up
    1
    ·
    8 days ago

    Is this site being weird or am I tripping? I just came in here and there was an on-point comment about being a parent and wiping pee being a part of life, and it ended with a solid joke, but when I come back it’s gone, with no “deleted” marker or anything. It was good and on-point enough that I returned to reply…. What is happening? I’m too tired for this confusion!