They replicate bias, entrench inequalities, and distort institutional aims. They devalue much of what makes us human: our capacities to exercise discretion, act spontaneously, and reason in ways that can’t be quantified. And far from being objective or neutral, technical decisions made in system design embed the values, aims, and interests of mostly white, mostly male technologists working in mostly profit-driven enterprises. Simply put, these tools are dangerous; in O’Neil’s words, they are “weapons of math destruction.”
The first half of the article covers problems we already know well; the second half proposes some solutions.
I’m pretty sure this was moderated by a machine. It happened right when they started bragging about their automated moderation. It’s possible this was a language-barrier issue, but it didn’t feel right.