As with Apple “not unlocking iPhones,” this is probably just for show, and they’re already doing what they want. After all, they can be forced under the Patriot Act and have to stay silent about it.
The state claiming they wouldn’t do what they want is a reward for the company, since it’s free PR.
Malicious compliance: don’t prohibit its use, just make it shitty and useless for the task.
Maliciously complying to sabotage a tool that will be used for military purposes, under an overly zealous administration that loves to accuse people of treason. A bold strategy.
They aren’t marketing or selling it for the task. If someone intentionally chooses to misuse a tool after being told not to use it that way, the results are not always favorable.
They can even clearly state that the tool isn’t designed for that or doesn’t meet the specifications, while not outright prohibiting its use. A disclaimer to the effect of “this tool is not certified or tested for such use and we cannot guarantee results.” That could even be the reason they want to bar it from such use: they don’t want the liability.
Edit: It is very common for companies to remove features or limit capabilities to prevent misuse, including the legal risks that come with a product being used beyond its design specifications.
I get what you’re saying, but I think that’s giving them too much credit.
The logic is obvious: “AI not trained on a thing is not good at doing that thing.” But people who care more about feelings than facts would see it as “the AI works fine for everyone else but not for me” and react emotionally instead.
I mean that’s already true for a whole bunch of things.
The military has stringent specifications due to functional and security requirements. They aren’t buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.
This would be putting a lot of demands on the company and imposing significant liability.
If the military wants a technology, there is already a process for developing one. This is a major departure from that process.
I thought I saw an article saying Anthropic had already knuckled under.
They misspelled arsehole
I don’t think “Anthropic bars use for mass arsehole” would have the same ring…
/s yes this is a joke.
And ‘anthropic bars use for arsehole surveillance’ seems overly restrictive…
True, but do you think we should allow it the use of any hole? Maybe we could increase shareholder profit if it goes up my nostril.