Should data scientists be in the business of charging Americans for crimes they could commit one day? Last month, a group of federal lawmakers asked the Justice Department to stop funding such programs, at least until safeguards are implemented. It's just the latest battle over a controversial field of law enforcement that seeks to fight crime by peering into the future.
"We write to urge you to suspend all Department of Justice (DOJ) grants for predictive policing systems until the DOJ can ensure that grant recipients will not use such systems in ways that have a discriminatory impact," reads a January letter to Attorney General Merrick Garland from U.S. Senator Ron Wyden (D–Ore.) and Representative Yvette Clarke (D–N.Y.), joined by Senators Jeff Merkley (D–Ore.), Alex Padilla (D–Calif.), Peter Welch (D–Vt.), John Fetterman (D–Pa.), and Ed Markey (D–Mass.). "Mounting evidence indicates that predictive policing technologies do not reduce crime. Instead, they worsen the unequal treatment of Americans of color by law enforcement."
The letter highlights concerns about racial discrimination, but it also raises concerns about accuracy and civil liberties that, since day one, have dogged plans to address crimes that have not yet occurred.
Fingering future criminals
Criminal justice theorists have long dreamed of stopping crimes before they happen. Prevented crimes mean no victims, no costs, and no perpetrators to punish. This has led to proposals for welfare and education programs aimed at deterring children from becoming predators. It has also inspired "predictive policing" efforts that assume numbers can tell you who is inclined to prey on others. It's an intriguing idea, if you ignore the dangers of targeting people for what they could do in the future.
"For years, companies have used data analytics to anticipate market conditions or industry trends and guide sales strategies," Beth Pearsall wrote in the Justice Department's NIJ Journal in 2010. "Police can use similar data analytics to help make their work more efficient. The idea is called 'predictive policing,' and some in the industry believe it has the potential to transform law enforcement by enabling police to anticipate and prevent crime rather than simply respond to it."
Interesting. But marketers who target neighborhoods for home warranty proposals only annoy people when they get it wrong; policing efforts have much higher stakes when they are flawed or harmful.
"The accuracy of predictive policing programs depends on the accuracy of the information they receive," Reason's Ronald Bailey noted in 2012. "We should always keep in mind that any new technology that helps police better protect citizens can also be used to better oppress them."
Predictive policing in (bad) action
People concerned about the dangers of predictive policing often reference the 2002 film Minority Report, in which a sci-fi version of the practice is abused to implicate innocent people. Recent years, however, have provided real-life warnings about the misuse of data science to torment people for crimes they didn't commit.
"First, the sheriff's office generates lists of people it believes are likely to break the law, based on arrest records, unspecified intelligence information, and arbitrary decisions by law enforcement analysts," the Tampa Bay Times reported in 2020 of Pasco County, Florida's predictive policing program. "Then it sends deputies to find and interrogate anyone whose name appears, often without probable cause, a search warrant, or evidence of a specific crime."
Essentially, as one former deputy described the program's treatment of its targets: "You make their lives miserable until they move or sue."
They sued, with many plaintiffs represented by the Institute for Justice. Last year, as legal bills mounted, the sheriff's office said in court documents that it had stopped its predictive policing efforts.
Garbage in, garbage out
A big problem with predictive policing is that it relies heavily on the honesty and impartiality of the people creating algorithms and inputting data. As recent discussions about bias in Internet search results and artificial intelligence reveal, the results that come out of a data-driven system are only as good as what goes in.
"A fundamental problem with data-driven policing is that it treats information as neutral, ignoring how it may reflect historical over-policing and de-escalation," Ángel Díaz of the Brennan Center for Justice wrote in 2021. He added that technology vendors for the NYPD's predictive policing program "proposed relying on data such as education level, availability of public transportation, and the number of health care facilities and liquor licenses in a given neighborhood to predict areas of the city where crime was likely to occur."
Are these real predictors of criminal activity? Perhaps. Or maybe they’re excuses to make people’s lives miserable until they move or sue, as happened in Pasco County.
Forecasts fueled by the feds
As with so many big ideas with scary potential, the impetus for development and implementation comes from government funding and encouragement.
"The National Institute of Justice, the research, development, and evaluation arm of the Department of Justice, regularly provides seed money for grants and pilots to test ideas like predictive policing," American University law professor Andrew Guthrie Ferguson commented earlier this month. "It was a grant from the National Institute of Justice that funded the first conference on predictive policing in 2009 that launched the idea that past crime data could be run through an algorithm to predict future criminal risk."
Of course, it’s not bad to seek innovation and look for new tools that can make the public safer. But hopefully those who fund such research want to make the world a better place, not a worse one. And when lawmakers asked the Justice Department in 2022 for documentation on predictive policing, officials admitted they didn’t really know how the money was being spent, let alone its impact.
"It remains an unanswered question, for example, to what extent such tools are, or have ever been, evaluated for compliance with civil rights laws," Gizmodo's Dell Cameron wrote at the time.
Hence the letter from Wyden and company. After years of funding and haphazard development, warnings from civil libertarians, and abuse by police, some lawmakers want the federal government to stop funding predictive policing efforts until due diligence is done and safeguards are put in place.
It makes you wonder whether any predictive policing programs predicted the field's current troubles.