A deep dive from ProPublica reporters who set out to vet whether an algorithm really can predict the likelihood that someone will commit a future crime has produced some worrisome findings. ProPublica zeroed in on some 7,000 people in Broward County, Florida, who were arrested and assigned risk scores meant to predict the likelihood that they would reoffend in the two years following their arrest.
Among other things, ProPublica concluded that the risk formula wrongly flagged black defendants as future criminals and failed to flag certain white defendants as being at higher risk of committing future crimes.
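To make that kind of disparity concrete, here is a minimal sketch of how error rates for a binary risk flag can be compared across groups. The records, field layout and group labels below are invented for illustration; this is not ProPublica's data or code.

    # Synthetic, illustrative records: (group, flagged_high_risk, reoffended)
    records = [
        ("A", True,  False), ("A", True,  True),  ("A", False, False),
        ("A", True,  False), ("B", False, True),  ("B", False, False),
        ("B", True,  True),  ("B", False, True),
    ]

    def error_rates(rows):
        """Return (false positive rate, false negative rate) for one group.

        A false positive is someone flagged high-risk who did not reoffend;
        a false negative is someone not flagged who did reoffend.
        """
        fp  = sum(1 for _, flag, reoff in rows if flag and not reoff)
        neg = sum(1 for _, _, reoff in rows if not reoff)
        fn  = sum(1 for _, flag, reoff in rows if not flag and reoff)
        pos = sum(1 for _, _, reoff in rows if reoff)
        return (fp / neg if neg else 0.0, fn / pos if pos else 0.0)

    for group in sorted({g for g, _, _ in records}):
        fpr, fnr = error_rates([r for r in records if r[0] == group])
        print(f"group {group}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")

Stated in these terms, the pattern ProPublica described is a higher false positive rate for black defendants and a higher false negative rate for white defendants.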
According to ProPublica:
“Scores like this – known as risk assessments – are increasingly common in courtrooms across the nation. They are used to inform decisions about who can be set free at every stage of the criminal justice system, from assigning bond amounts – as is the case in Fort Lauderdale – to even more fundamental decisions about defendants’ freedom. In Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, the results of such assessments are given to judges during criminal sentencing.”
Former Attorney General Eric Holder warned that racial bias could surface in such risk assessments and perpetuate biases that already exist in the criminal justice system, yet the U.S. Sentencing Commission has never studied the issue. The ProPublica reporters also concluded that the tool was exceptionally bad at predicting future acts of violence: only 1 in 5 of the people it flagged as likely violent reoffenders went on to commit a violent crime after the prediction was made.
“When a full range of crimes were taken into account – including misdemeanors such as driving with an expired license – the algorithm was somewhat more accurate than a coin flip,” according to ProPublica.
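Those two findings measure different things. The "1 in 5" figure is the precision of the violent-crime flag: of the defendants predicted to commit violent crimes, how many actually did. The coin-flip comparison refers to overall accuracy: the share of all predictions, high-risk and low-risk alike, that turned out correct, where random guessing would land near 50 percent. The counts below are hypothetical, chosen only to illustrate the arithmetic; they are not ProPublica's actual tallies.

    # Hypothetical tallies for illustration, not ProPublica's actual counts.
    predicted_violent = 1000   # defendants flagged as likely violent reoffenders
    actually_violent = 200     # of those, how many went on to a violent crime

    precision = actually_violent / predicted_violent
    print(f"precision of the violent flag: {precision:.0%}")  # 20% -> "1 in 5"

    # Overall accuracy counts every correct call, on flagged and
    # unflagged defendants alike; a coin flip would score about 50%.
    correct_predictions = 610
    total_predictions = 1000
    print(f"overall accuracy: {correct_predictions / total_predictions:.0%}")

A flag can therefore clear the coin-flip bar on overall accuracy while still being wrong about most of the specific people it singles out.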
That doesn’t mean police should ignore what data might be trying to tell them.
Harnessing Big Data for public safety has been a steadily growing trend in law enforcement and intelligence circles since the Sept. 11 attacks. While intelligence agencies have a long tradition of analyzing Big Data to solve problems, state and local police are newer to the practice but no less persuaded by claims that innovations in computing can accomplish more than a patrol officer walking a beat.
Police in Chicago have developed a more narrowly tailored “Strategic Subject List” of 1,400 people. So far this year, more than 70 percent of the people shot in the Windy City have been on the list; reportedly, so have 80 percent of the shooters.
Nonetheless, there are fears in Chicago that the tool could be misused. Karen Sheley, a director for the American Civil Liberties Union of Illinois, told the New York Times this week: “We’re concerned about this. There’s a database of citizens built on unknown factors, and there’s no way for people to challenge being on the list. How do you get on the list in the first place? We think it’s dangerous to single out somebody based on secret police information.”
G.W. Schulz can be reached at gwschulz@cironline.org. Follow him on Twitter: @GWSchulzCIR.