AI fails to predict crime. But researchers keep trying.

In the world of the 2002 film “Minority Report”, crime is almost non-existent. Clairvoyants predict when murders are about to happen, allowing the police to step in and arrest would-be criminals.

Although Tom Cruise’s all-powerful police force is evidence of a dystopian society, scholars have long pursued the tantalizing prospect of being able to predict crime before it happens.

And as the United States grapples with rising rates of violent crime, another research project has emerged: a group of University of Chicago scientists unveiled an algorithm last month, boasting in a statement of its ability to predict crime with “90% accuracy”.

The algorithm identifies locations in major cities that it believes have a high likelihood of crimes, such as homicides and burglaries, occurring within the next week. The software can also assess the variation in policing between neighborhoods in eight major US cities, including Chicago, Los Angeles and Philadelphia.

But the use of artificial intelligence to direct law enforcement is ringing alarm bells for many social justice scholars and criminologists, who cite a long history of such technology unfairly directing increased policing at Black and Latino communities. Even one of the study’s authors acknowledges that an algorithm’s ability to predict crime is limited.

“The past tells you nothing about the future,” said Ishanu Chattopadhyay, a University of Chicago professor and the algorithm’s lead researcher. “The question is: to what extent does the past really influence the future? And to what extent are the events spontaneous or genuinely random? … Our ability to predict is limited by this.”

Police have a long history of using every tool available to predict crime. Before advances in technology, cops huddled in conference rooms and placed crime incident pins on a map, hoping the clusters would help them figure out where they should look next.

Over the past 15 years, the nation’s largest police departments – such as New York, Los Angeles and Chicago – have begun to think of ways to use artificial intelligence not only to analyze crime but also to predict it. They have often turned to data analytics companies such as PredPol and Palantir, which create software that law enforcement can use to forecast crime.

Predictive policing tools are built by feeding data – such as crime reports, arrest records and license plate images – to an algorithm, which is trained to look for patterns to predict where and when some type of crime will happen in the future.
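In rough terms, that pipeline resembles the sketch below. Everything in it – the synthetic incident records, the column names, the four-week feature window and the choice of a random-forest model – is invented for illustration and is not any vendor’s actual system.

```python
# Illustrative sketch of a predictive-policing pipeline: historical incident
# records are aggregated per area and week, then a classifier is trained to
# flag areas likely to see an incident the following week.
# All data, column names and the model choice here are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for historical crime reports: (area_id, week, incidents).
history = pd.DataFrame({
    "area_id": rng.integers(0, 50, size=5000),
    "week": rng.integers(0, 100, size=5000),
    "incidents": rng.poisson(1.5, size=5000),
})

# Rows: areas, columns: weekly incident counts.
counts = history.groupby(["area_id", "week"])["incidents"].sum().unstack(fill_value=0)

# Features: the previous four weeks of counts per area.
# Label: whether the area records any incident the following week.
X, y = [], []
for week in range(3, counts.shape[1] - 1):
    X.append(counts.iloc[:, week - 3:week + 1].to_numpy())
    y.append((counts.iloc[:, week + 1] > 0).astype(int))
X, y = np.vstack(X), np.concatenate(y)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
risk = model.predict_proba(X[-50:])[:, 1]  # predicted risk for the latest window
print(risk.round(2))
```

The pattern-matching is entirely statistical: the model only learns where incidents have been recorded before, which is exactly the property critics focus on.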

But algorithms are only as good as the data they receive, which is a particular problem in the United States, said Vincent Southerland, faculty co-director of the Center on Race, Inequality and the Law at New York University.

Historically, police data in the United States has been skewed, according to Southerland. Cops are more likely to arrest or charge someone with a crime in low-income neighborhoods dominated by people of color, a reality that doesn’t necessarily reflect where crime is happening but where police spend their time.

This means that most criminal activity datasets overrepresent people of color and low-income neighborhoods. Feeding this data into an algorithm leads it to suggest that there is more criminal activity in these areas, creating a racially and socioeconomically biased feedback loop, Southerland added.
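The feedback loop Southerland describes can be made concrete with a toy simulation. The numbers below are entirely hypothetical: two neighborhoods have identical underlying crime, but one starts out slightly over-represented in the records, patrols follow the records, and only patrolled crime gets recorded.

```python
# Toy simulation of the feedback loop described above. Both neighborhoods have
# the same true crime rate, but neighborhood A starts with slightly more
# recorded incidents. Each week patrols go where the records point, and only
# patrolled crime is recorded, so the initial gap feeds on itself.
# All numbers are hypothetical.
true_weekly_crime = {"A": 10, "B": 10}   # identical underlying crime
recorded = {"A": 12, "B": 10}            # A is slightly over-represented at the start

for week in range(1, 11):
    target = max(recorded, key=recorded.get)       # patrol the "hottest" area
    recorded[target] += true_weekly_crime[target]  # only patrolled crime is recorded
    print(f"week {week:2d}: patrol {target}, records A={recorded['A']}, B={recorded['B']}")
```

After ten rounds the records say neighborhood A is far more dangerous than B, even though the underlying crime rates never differed – the bias in the input has become the "evidence" in the output.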

“You have data that is infected or tainted by a bias – and that bias is going to show up on the other side of the analysis,” he said. “You get out of it what you put into it.”

In the real world, predictive policing software has caused significant problems.

In 2019, the Los Angeles Police Department suspended its crime prediction program, LASER, which used historical crime data to forecast crime hot spots and Palantir software to assign criminal risk scores to people, after an internal audit showed the program led police to unfairly subject Black and Latino people to more surveillance.

In Chicago, police used predictive policing software from the Illinois Institute of Technology to create a list of people most likely to be involved in a violent crime. A RAND study and a subsequent Chicago Sun-Times investigation showed that the list included every person arrested or fingerprinted in Chicago since 2013. The program was discontinued in 2020.

Predictive policing algorithms are “not a crystal ball,” said John S. Hollywood, senior operations researcher at RAND, who helped audit the Chicago Police Department’s use of predictive algorithms. “It’s better to look more holistically… what’s going on in terms of specific things in my community that are leading to crimes right now.”

Chattopadhyay said his team’s software was designed with knowledge of the algorithms’ troubled past.

In creating the algorithm, Chattopadhyay’s team segmented major cities into 1,000-square-foot city blocks and used urban crime data from the last three to five years to train it. The algorithm indicates whether there is a high or low risk of crime in a segment at any given time, up to a week in the future.

To limit bias, the team omitted crime data such as marijuana arrests, traffic stops and low-level misdemeanors, because research shows that Black and Latino people are more often targeted for these types of offenses. Instead, they fed the algorithm data on violent crimes such as homicides and assaults, as well as property crimes like burglaries and motor vehicle thefts.
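A rough sketch of that kind of setup is below: filter incident records to the selected offense types, bin them into fixed-size spatial tiles, and flag tiles with elevated recent counts as high risk for the coming week. The coordinates, tile size, category names, lookback window and threshold are all placeholders for illustration, not the team’s actual code.

```python
# Rough sketch of a tile-based crime-risk setup: keep only selected offense
# types, bin incidents into fixed-size square tiles, and flag tiles whose
# recent counts exceed a threshold as "high risk" for the coming week.
# Coordinates, tile size, categories, window and threshold are hypothetical.
import numpy as np
import pandas as pd

TILE_FEET = 1000                      # side length of a square tile
KEPT_TYPES = {"HOMICIDE", "ASSAULT", "BURGLARY", "MOTOR VEHICLE THEFT"}

rng = np.random.default_rng(1)
incidents = pd.DataFrame({
    "x_feet": rng.uniform(0, 10000, 3000),       # stand-in projected coordinates
    "y_feet": rng.uniform(0, 10000, 3000),
    "offense": rng.choice(sorted(KEPT_TYPES) + ["NARCOTICS", "TRAFFIC"], 3000),
    "days_ago": rng.integers(0, 3 * 365, 3000),  # illustrative three-year lookback
})

# Drop the offense types excluded to limit enforcement bias.
incidents = incidents[incidents["offense"].isin(KEPT_TYPES)].copy()

# Assign each incident to a tile and count recent events per tile.
incidents["tile"] = list(zip(incidents["x_feet"] // TILE_FEET,
                             incidents["y_feet"] // TILE_FEET))
recent = incidents[incidents["days_ago"] <= 28]
counts = recent.groupby("tile").size()

# Label tiles as high or low risk for the coming week (threshold is illustrative).
high_risk = counts[counts >= 2].index
print(f"{len(high_risk)} of {counts.size} active tiles flagged as high risk")
```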

But the main point of the study, he said, was to use the algorithm to interrogate how biased policing is. His team compared arrest data across neighborhoods of different socioeconomic levels. They found that crimes in wealthier areas resulted in more arrests, while crimes in poorer neighborhoods did not always have the same effect, revealing a gap in law enforcement.

Chattopadhyay said the findings help provide evidence for people who complain that law enforcement is ignoring poorer neighborhoods when violent or property crime spikes. “It allows you to quantify that,” he said. “To show the evidence.”

Arvind Narayanan, a professor of computer science at Princeton University, said the study’s press release and news articles about it did not focus enough on the study’s attempt to investigate bias in police enforcement and overstated the algorithm’s claims of accuracy.

“For predictive policing, a single figure of accuracy…is totally insufficient to assess whether a tool is useful or accurate,” he said. “Crime is rare, so most crime predictions are likely to be false positives.”
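Narayanan’s point about rare events is basic base-rate arithmetic. With made-up but plausible numbers, even a tool that is right 90 percent of the time on both crime and no-crime cases will be wrong most of the times it raises an alarm:

```python
# Hypothetical base-rate arithmetic behind the "most predictions are false
# positives" point. The numbers are illustrative, not from the study.
base_rate = 0.01        # assume 1% of tile-weeks actually see a serious crime
sensitivity = 0.90      # fraction of real crime windows the tool flags
specificity = 0.90      # fraction of quiet windows it correctly leaves alone

tiles = 100_000
crime = tiles * base_rate
quiet = tiles - crime

true_positives = crime * sensitivity            # 900
false_positives = quiet * (1 - specificity)     # 9,900

precision = true_positives / (true_positives + false_positives)
print(f"Of {true_positives + false_positives:.0f} alarms, only "
      f"{precision:.0%} point to a real crime")   # roughly 8%
```

Under these assumed numbers, more than nine out of ten alarms would be false, which is why a single headline accuracy figure says little about how useful the tool is in practice.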

Criminal justice scholars, law enforcement experts and technologists note that even if an algorithm is accurate, it can still be used by law enforcement to target people of color and those living in the poorest neighborhoods for unwarranted surveillance and control.

Andrew Papachristos, a sociology professor at Northwestern University, said when law enforcement uses algorithms to map and analyze crime, it often subjects people of color and low-income communities to more policing. When criticized for over-surveillance of certain neighborhoods, departments often use the data to justify those tactics, he said.

Papachristos said if community groups could use the tools instead to determine where to provide more social services, increase community engagement and address the root social causes of violence, that would be a better use of technology. However, he said, that’s unlikely to happen because the organizations doing the work are cash-strapped and skeptical about the use of the data.

“They have seen data misused against them in court. They have seen it used to profile individuals,” he said. “So if someone comes along like me and says, ‘Hey, we want to help you use the data,’ it’s not an immediate ‘Oh my God, thank you.’ It’s like, ‘What data are you using?’”

Hollywood of the RAND Corporation agreed. He said to truly reduce crime, police departments must work in tandem with social workers and community groups to address issues of education, housing and civic engagement.

“[Algorithms] are a shiny, shiny object,” he said. “These things tend to be distractions.”