Software used to predict crime can now be scoured for bias

Dave Gershgorn in Quartz: “Predictive policing, or the idea that software can foresee where crime will take place, is being adopted across the country—despite being riddled with issues. These algorithms have been shown to disproportionately target minorities, and private companies won’t reveal how their software reached those conclusions.

In an attempt to stand out from the pack, predictive-policing startup CivicScape has released its algorithm and data online for experts to scour, according to Government Technology magazine. The company’s GitHub page is already populated with its code, as well as a variety of documents detailing how its algorithm interprets police data and what variables are included when predicting crime.

“By making our code and data open-source, we are inviting feedback and conversation about CivicScape in the belief that many eyes make our tools better for all,” the company writes on GitHub. “We must understand and measure bias in crime data that can result in disparate public safety outcomes within a community.”…
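The excerpt does not show how that measurement is done. One common approach, sketched below under assumed inputs (a table of predictions with hypothetical `group` and `flagged` columns, not CivicScape's actual schema), is to compare the rate of high-risk flags across demographic groups:

```python
import pandas as pd

# Illustrative sketch only: a simple disparate-impact check. The column
# names "group" and "flagged" are assumptions for this example, not
# CivicScape's actual schema.

def disparate_impact_ratio(predictions: pd.DataFrame) -> float:
    """Ratio of the lowest group's high-risk flag rate to the highest's."""
    rates = predictions.groupby("group")["flagged"].mean()
    # 1.0 means parity; values below 0.8 echo the EEOC "four-fifths" rule,
    # often used as a rough red flag for disparate impact.
    return rates.min() / rates.max()
```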

CivicScape says it does not use racial or ethnic data to make predictions, though it acknowledges that indirect proxies for race could still bias its software. The software also filters out low-level drug crimes, whose enforcement has been shown to fall disproportionately on African Americans.
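CivicScape's actual code is on its GitHub page; as a rough illustration of what that kind of filtering can look like, here is a minimal preprocessing sketch, with column names and offense labels that are assumptions for the example rather than the company's real schema:

```python
import pandas as pd

# Hypothetical preprocessing sketch; the column names and offense labels
# below are illustrative assumptions, not CivicScape's actual code.

PROTECTED_COLUMNS = ["race", "ethnicity"]       # never fed to the model
LOW_LEVEL_DRUG_OFFENSES = {"drug_possession"}   # filtered from training data

def preprocess(incidents: pd.DataFrame) -> pd.DataFrame:
    """Drop protected attributes and remove low-level drug incidents."""
    # Exclude offense categories with documented enforcement bias.
    kept = incidents[~incidents["offense_type"].isin(LOW_LEVEL_DRUG_OFFENSES)]
    # Remove protected attributes so they cannot act as direct model inputs.
    # (Indirect proxies, such as neighborhood, can still encode race.)
    return kept.drop(columns=PROTECTED_COLUMNS, errors="ignore")
```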

While this startup may be the first to publicly reveal the inner workings of its algorithm and data practices, that is no guarantee that predictive policing can be made fair and transparent across the board.

“Lots of research is going on about how algorithms can be transparent, accountable, and fair,” the company writes. “We look forward to being involved in this important conversation.”…(More)”.