• einfach_orangensaft@sh.itjust.works
    2 months ago

    As with Apple “not unlocking iPhones,” this is probably just for show, and they’re already doing what they want. After all, they can be forced under the Patriot Act and have to stay silent about it.

    The state claiming they would not do what they want is a reward for the company, since it’s free PR.

    • pivot_root@lemmy.world
      2 months ago

      Maliciously complying to sabotage a tool that will be used for military purposes, under an overly zealous admin who loves to accuse people of treason. A bold strategy.

      • EndlessNightmare@reddthat.com
        2 months ago

        They aren’t marketing or selling it for the task. If someone intentionally chooses to misuse a tool after being told not to use it that way, the results are not always favorable.

        They can even clearly state that the tool isn’t designed for that or doesn’t meet the specifications, without outright prohibiting its use. A disclaimer to the effect of “this tool is not certified or tested for such use and we cannot guarantee results.” That could even be the reason they want to bar it from such use: they don’t want the liability.

        Edit: It is very common for companies to remove features or limit capabilities to prevent misuse and to avoid the legal risks that come with a product being used beyond its design specifications.

        • pivot_root@lemmy.world
          2 months ago

          I get what you’re saying, but I think that’s giving them too much credit.

          The logic of “an AI not trained on a thing isn’t good at doing that thing” is obvious. People who care more about feelings than facts would instead see it as “the AI works fine for everyone else but not for me” and react emotionally.

          • EndlessNightmare@reddthat.com
            2 months ago

            I mean, that’s already true for a whole bunch of things.

            The military has stringent specifications due to functional and security requirements. They aren’t buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.

            This would put a lot of demands on the company and impose significant liability.

            If the military wants a technology, there is already a process for developing such. This is a major departure from that process.

    • vrek
      2 months ago

      I don’t think “Anthropic bars use for mass arsehole” would have the same ring…

      /s yes this is a joke.

      • zeet@lemmy.world
        2 months ago

        And ‘Anthropic bars use for arsehole surveillance’ seems overly restrictive…

        • vrek
          2 months ago

          True. Do you think we should allow it the use of any hole? Maybe we could increase shareholder profit if it goes up my nostril.