• mindbleach@sh.itjust.works
    7 months ago

    Nah, if the computer manufacturer can’t stop you from running evil software, the technology has no right to exist. Demand these assurances!

    • 5C5C5C

      You’re being pretty dense if you can’t wrap your head around a basic concept of accountability.

      A human can choose to commit crimes with any product, including … I don’t know … a fork. You could choose to stab someone with a fork, and you’d be a criminal. We wouldn’t blame the fork manufacturer for that, because the person who chose to commit the crime was the person holding the fork. That’s who’s accountable.

      But if a fork manufacturer starts selling forks which might start stabbing people on their own, without any human user intending for the stabbing to take place, then the manufacturer who produced and sold the auto-stabbing forks is absolutely guilty of criminal negligence.

      Edit: But I’ll concede that a law against the technology being used to assist humans in criminal activity in a broad sense is unrealistic. At best there would need to be bounds around the degree of criminal help that the tool is able to provide.

      • mindbleach@sh.itjust.works

        But a human asking how to make a bomb is somehow the LLM’s fault.

        Or the LLM has to know that you are who you say you are, to prevent you from writing scam e-mails.

        The guy you initially replied to was talking about hooking up an LLM to a virus replication machine. Is that the level of safety you’re asking for? A machine so safe, we can give it to supervillains?