• einlander@lemmy.world · 9 hours ago

    The problem I see with poisoning the data is AIs being trained for law enforcement hallucinating false facts that then get used to arrest and convict people.

    • patatahooligan@lemmy.world · 8 hours ago

      Law enforcement AI is a terrible idea and it doesn’t matter whether you feed it “false facts” or not. There’s enough bias in law enforcement that the data is essentially always poisoned.

    • limonfiesta@lemmy.world · 6 hours ago

      They aren’t poisoning the data with disinformation.

      They’re poisoning it with accurate, but irrelevant information.

      For example, if a bot is crawling sites related to computer programming or weather, this tool might lure the crawler into pages about animal facts or human biology.

    • melpomenesclevage@lemmy.dbzer0.com · 9 hours ago

      That’s the entire point of laws, though, and it was already being used for that.

      Giving the law better law stuff will not improve it. The law is malevolent; you cannot fix it by offering to help.

    • sugar_in_your_tea@sh.itjust.works · 8 hours ago

      Law enforcement doesn’t convict anyone; that’s a judge’s job. If a LEO falsely arrests you, you can sue them, and the case should be pretty open-and-shut if the arrest was due to an AI hallucination. Enough of that and LEOs will stop using it.