• @[email protected]
    link
    fedilink
    32
    edit-2
    2 months ago

    They better be careful, the AI could actually make stuff more impartial. They wouldn’t want that

    • @[email protected]
      link
      fedilink
      15
      2 months ago

      Nah, they’ll just make the AI racist to compensate.

      Also, until they can’t turn off the camera, it’s worth nothing.

    • @[email protected]
      link
      fedilink
      English
      12
      edit-2
      2 months ago

      They better be careful, the AI could actually make stuff more impartial. They wouldn’t want that

      I dunno, when the cops scream “stop resisting” 400 times while kicking a man in the fetal position on the ground, will it conclude he’s resisting or conclude excessive force is being used? I know where my money is at.

    • FaceDeer
      link
      fedilink
      5
      2 months ago

      My first thought too, “finally something in the chain that’s honest.”

      It’d be good to audit it now and then, of course.

      • @[email protected]
        link
        fedilink
        8
        2 months ago

        They are probably going to train the AI on existing reports and videos. Why train an AI to work against you?

    • @[email protected]
      link
      fedilink
      2
      2 months ago

      I mean, if it’s based on the audio, the police officer can just say “I’m under attack, I feel threatened” even when they’re not, before they walk up to somebody. It’s very, very easy to manipulate this.

  • Deebster
    link
    14
    2 months ago

    It feels off that the headline talks about body cam footage when the AI actually only uses the audio. Technically that may still count as footage, but I think I’m with most people in taking “footage” to mean the audio and video together.

    Anecdotally, I’ve found that AI systems set up to summarise are reliable, probably using that “turn off creativity” setup that’s mentioned.
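    For context, the “turn off creativity” setup likely refers to setting the model’s sampling temperature to zero, so it always picks the most probable next token instead of sampling. A minimal sketch of what such a request payload might look like against a GPT-4-style chat API (the model name, prompt, and transcript placeholder are illustrative, not Axon’s actual configuration):

    ```python
    # Hypothetical request payload for a deterministic summarisation call.
    # temperature=0 makes decoding greedy (most-likely token each step),
    # which trades creative paraphrasing for more literal summaries.
    request = {
        "model": "gpt-4",
        "temperature": 0,  # the "turn off creativity" setting
        "messages": [
            {
                "role": "system",
                "content": "Summarise the transcript factually. Do not speculate.",
            },
            {
                "role": "user",
                "content": "<body-cam audio transcript goes here>",
            },
        ],
    }
    ```

    Even at temperature 0 the output is only as deterministic as the model serving stack allows, and it does not by itself prevent hallucination; it just removes sampling randomness.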

      • Deebster
        link
        8
        2 months ago

        It’s already a report written by the police - they can make it say whatever they want with or without AI.

        • @[email protected]
          link
          fedilink
          English
          2
          2 months ago

          Problem is, a lot of uninformed people trust “AI” more than they trust people. This could influence jurors to trust the police more.

  • kamenLady.
    link
    fedilink
    5
    2 months ago

    In 2022, nine out of the 12 international ethics board members resigned following the announcement — and prompt reversal — of having Taser-wielding drones patrol US schools.

    This ain’t no futurism anymore, it’s already time for an ancient_dystopia community‽

  • AutoTL;DR
    link
    fedilink
    English
    2
    2 months ago

    This is the best summary I could come up with:


    As Forbes reports, it’s a brazen and worrying use of the tech that could easily lead to the furthering of institutional ills like racial bias in the hands of police departments.

    “It’s kind of a nightmare,” Electronic Frontier Foundation surveillance technologies investigations director Dave Maass told Forbes.

    Axon claims its new AI, which is based on OpenAI’s GPT-4 large language model, can help cops spend less time writing up reports.

    But given the sheer propensity of OpenAI’s models to “hallucinate” facts, fail at correctly summarizing information, and replicate the racial biases from their training data, it’s an eyebrow-raising use of the tech.

    “This is going to seriously mess up people’s lives — AI is notoriously error-prone and police reports are official records,” another user wrote.

    In 2022, nine out of the 12 international ethics board members resigned following the announcement — and prompt reversal — of having Taser-wielding drones patrol US schools.


    The original article contains 555 words, the summary contains 152 words. Saved 73%. I’m a bot and I’m open source!