I get what you’re saying, but I think that’s giving them too much credit.
The logic is obvious: an AI not trained on a thing won't be good at doing that thing. But people who care more about feelings than facts would read it as "AI works fine for everyone else but not for me" and react emotionally instead.
I mean that’s already true for a whole bunch of things.
The military has stringent specifications due to functional and security requirements. They aren't buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.
This would put a lot of demands on the company and impose significant liability.
If the military wants a technology, there is already an established process for developing it. This is a major departure from that process.