[Chart credit: ProPublica analysis of data from Broward County, Fla.]

A deep dive from ProPublica reporters who wanted to vet whether an algorithm really could determine the likelihood that someone would commit a crime in the future has produced some worrisome findings. ProPublica zeroed in on some 7,000 people in Broward County, Florida, who were arrested and assigned risk scores predicting the likelihood that they would reoffend within two years of their arrest.

Among other things, ProPublica concluded that the risk formula wrongly flagged black defendants as likely future criminals while failing to flag white defendants who went on to commit future crimes.

According to ProPublica:

“Scores like this – known as risk assessments – are increasingly common in courtrooms across the nation. They are used to inform decisions about who can be set free at every stage of the criminal justice system, from assigning bond amounts – as is the case in Fort Lauderdale – to even more fundamental decisions about defendants’ freedom. In Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, the results of such assessments are given to judges during criminal sentencing.”

Despite former Attorney General Eric Holder warning that such risk assessments could surface racial bias and perpetuate biases that already exist in the criminal justice system, the U.S. Sentencing Commission has never studied the issue. The ProPublica reporters also concluded that the tool was exceptionally bad at predicting future acts of violence. Only about 1 in 5 people flagged as likely to commit a violent crime actually did so.

“When a full range of crimes were taken into account – including misdemeanors such as driving with an expired license – the algorithm was somewhat more accurate than a coin flip,” according to ProPublica.
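The two figures above describe different things: the "1 in 5" number is the share of people flagged as violent who actually reoffended violently (precision), while "more accurate than a coin flip" refers to overall accuracy across all predictions. A minimal sketch, using hypothetical counts rather than ProPublica's actual data, shows how the two metrics differ:

```python
# Hypothetical illustration of the two accuracy figures cited above.
# These counts are invented for the example; they are NOT ProPublica's data.

def precision(true_positives: int, predicted_positives: int) -> float:
    """Share of 'will reoffend' predictions that came true."""
    return true_positives / predicted_positives

def accuracy(correct_predictions: int, total_predictions: int) -> float:
    """Share of all predictions (positive and negative) that were right."""
    return correct_predictions / total_predictions

# Suppose 1,000 people were flagged as likely violent reoffenders
# and 200 of them went on to commit a violent crime:
print(precision(200, 1000))   # 0.2 -- the "1 in 5" figure

# Suppose that across 1,000 total predictions for all crime types,
# 610 were correct:
print(accuracy(610, 1000))    # 0.61 -- "somewhat more than a coin flip"
```

The point of separating the metrics is that a tool can look passable on overall accuracy while still being wrong about most of the people it singles out as high risk.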

That doesn’t mean police should ignore what data might be trying to tell them.

Harnessing Big Data for public safety purposes has been a continually growing trend in law enforcement and intelligence circles since the Sept. 11 attacks. While intelligence agencies have a long tradition of analyzing Big Data to solve problems, state and local police are newer to the field but no less persuaded by claims that innovations in computing can accomplish more than a patrol officer walking a beat.

Police in Chicago have developed a more narrowly tailored “Strategic Subject List” made up of 1,400 people. So far this year, over 70 percent of the people shot in the Windy City turned out to be on the list. Reportedly, so have 80 percent of the shooters.

Nonetheless, there are fears in Chicago that the tool could be misused. Karen Sheley, a director for the American Civil Liberties Union of Illinois, told the New York Times this week: “We’re concerned about this. There’s a database of citizens built on unknown factors, and there’s no way for people to challenge being on the list. How do you get on the list in the first place? We think it’s dangerous to single out somebody based on secret police information.”

G.W. Schulz can be reached on Twitter: @GWSchulzCIR.


G.W. Schulz is a reporter for Reveal, covering security, privacy, technology and criminal justice. Since joining The Center for Investigative Reporting in 2008, he's reported stories for NPR, KQED, The Dallas Morning News, the Chicago Tribune, the San Francisco Chronicle, Mother Jones and more. Prior to that, he wrote for the San Francisco Bay Guardian and was an early contributor to The Chauncey Bailey Project, which won a Tom Renner Award from Investigative Reporters and Editors in 2008. Schulz also has won awards from the California Newspaper Publishers Association and the Society of Professional Journalists' Northern California Chapter. He graduated from the University of Kansas and is based in Austin, Texas.