• pivot_root@lemmy.world
    2 months ago

    I get what you’re saying, but I think that’s giving them too much credit.

The logic is obvious: "an AI not trained on a thing is not good at doing that thing." But the people who care more about feelings than facts see it as "AI works fine for everyone else but not for me" and react emotionally instead.

    • EndlessNightmare@reddthat.com
      2 months ago

      I mean, that's already true for a whole bunch of things.

      The military has stringent specifications due to functional and security requirements. They aren't buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.

      This would be putting a lot of demands on the company and imposing significant liability.

      If the military wants a technology, there is already a process for developing it. This would be a major departure from that process.