Chicago won’t renew its ShotSpotter contract and plans to stop using the controversial gunshot detection system later this year, Mayor Brandon Johnson’s office announced Tuesday.
The system, which relies on an artificial intelligence algorithm and network of microphones to identify gunshots, has been criticized for inaccuracy, racial bias and law enforcement misuse. An Associated Press investigation of the technology detailed how police and prosecutors used ShotSpotter data as evidence in charging a Chicago grandfather with murder before a judge dismissed the case due to insufficient evidence.
Chicago’s contract with SoundThinking, a public safety technology company that says its ShotSpotter tool is used in roughly 150 cities, expires Friday. The city plans to wind down use of ShotSpotter technology by late September, according to city officials. Since 2018, the city has spent $49 million on ShotSpotter.
That’s strange. I would assume this would be a problem unusually well-suited to machine learning techniques. Law enforcement misuse and racial bias I can see, but inaccuracy? It’s a triangulation problem mostly.
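Rough sketch of what I mean by the triangulation part: with a few synced microphones you can locate a bang from time differences of arrival. Everything below (mic positions, timings, the noise level) is made up for illustration, not anything to do with how ShotSpotter actually does it:

```python
# Toy TDOA (time-difference-of-arrival) localization with made-up numbers.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s, roughly, at 20 C

# Hypothetical microphone positions (x, y) in metres.
mics = np.array([[0.0, 0.0], [300.0, 0.0], [0.0, 300.0], [300.0, 300.0]])

def simulate_arrival_times(source, noise_std=0.0):
    """Arrival time at each mic for a bang at `source`, with optional timing jitter."""
    dists = np.linalg.norm(mics - source, axis=1)
    return dists / SPEED_OF_SOUND + np.random.normal(0.0, noise_std, len(mics))

def residuals(guess, arrivals):
    """Observed vs. predicted arrival-time differences, relative to mic 0."""
    dists = np.linalg.norm(mics - guess, axis=1)
    predicted = (dists - dists[0]) / SPEED_OF_SOUND
    observed = arrivals - arrivals[0]
    return predicted - observed

true_source = np.array([120.0, 210.0])
arrivals = simulate_arrival_times(true_source, noise_std=1e-4)  # ~0.1 ms jitter

fit = least_squares(residuals, x0=np.array([150.0, 150.0]), args=(arrivals,))
print("true:", true_source, "estimated:", fit.x)
```

In free air with clean impulses that converges fine; the hard part is everything the replies below bring up.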
The ACLU had a pretty good article on it a few years ago. It seems the inaccuracy comes from the number of false positives and the resulting aggressive police response.
Ah, of course. More human error. I should have known.
The system probably works great in military situations, which I believe is what it was designed for. In a dense city where sound can echo multiple times off various buildings and other structures? It probably gets things wrong quite often. Add in trigger-happy cops that don’t know how to interpret the data and you have a recipe for disaster.
not if you consider fireworks, car misfires, echoes and weird geometries, and the fact that supersonic bullets have a sonic boom that travels with them…
that and the ai was probably only trained in black neighborhoods so it thinks loud bass or black accents are required to be a positive? i dunno
all of those different noises have distinct sound-wave profiles, and different geometries can be accounted for either in software or with strategic placement of mics. I’m convinced this would be a good ML project, if we could find a way of enforcing it without police bias, which, good luck.
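something like this toy example is the kind of thing I mean: turn short clips into spectral profiles and train a generic classifier on labelled examples. The synthetic clips, labels, and feature choice here are all made up for illustration, nothing like whatever SoundThinking actually ships:

```python
# Toy audio classifier: spectrogram features + random forest on synthetic clips.
import numpy as np
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

SR = 8000  # sample rate in Hz (assumption)

def synthetic_clip(kind, rng):
    """Crude 1-second stand-in for a gunshot-like impulse vs. a firework-like burst."""
    t = np.arange(SR) / SR
    if kind == "impulse":   # sharp attack, fast decay
        return np.exp(-t * 60.0) * rng.standard_normal(SR)
    return np.exp(-t * 8.0) * rng.standard_normal(SR) * (rng.random(SR) > 0.3)

def features(clip):
    """Log-spectrogram averaged over time: a compact spectral profile."""
    _, _, sxx = spectrogram(clip, fs=SR, nperseg=256)
    return np.log1p(sxx).mean(axis=1)

rng = np.random.default_rng(0)
kinds = ["impulse", "burst"]
X, y = [], []
for label, kind in enumerate(kinds):
    for _ in range(200):
        X.append(features(synthetic_clip(kind, rng)))
        y.append(label)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
test = features(synthetic_clip("impulse", rng))
print("predicted class:", kinds[clf.predict([test])[0]])
```

separating clean synthetic classes is easy; real city audio with echoes and overlapping noise is the part that makes it an actual project.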
i don’t think so… each neighborhood is shaped differently and will have an effect on the sound profiles…
maybe if you set it up and calibrated it by shooting guns all around the city (:
Chicago PD is on the job!
they have this in DC… coincidentally the day with the most “gunshots” is also the 4th of July, when hordes of people are openly lighting off fireworks of all kinds in the street.
I guess I should have said “in principle it should be possible to distinguish these sounds”
because yeah a couple of people are saying stuff to shit on these systems’ technology.

edit: I expressed myself very poorly last night. I meant to say, “I guess I should have said ‘in principle it should be possible to distinguish these sounds’ because people are making valid observations in the comments about the notorious failures of this product”
I wouldn’t assume a company like ShotSpotter uses modern machine learning techniques. It’s got a pitiful accuracy rate and the company was founded 28 years ago. They seem more like a company that hires people with connections rather than a company that hires AI experts and buys Nvidia H100 GPUs by the gross.
they just need more microphones & probably a network of cameras on every street corner.