Real-life Minority Report: Analytics assist police in detecting crime
The Los Angeles Police Department is trialling predictive analytics
computerworlduk.com | Nov 16, 2011
By Linda Rosencrance
Captain Sean Malinowski of the Los Angeles Police Department (LAPD) has just done something once unimaginable for a commanding officer: He’s given up control of deploying his beat officers to a computer.
Malinowski, commanding officer of the LAPD’s Foothill Community Police Station, is a pioneer in the field of “predictive policing.” That means using predictive analytics to analyse data, such as the times and locations of past crimes, to forecast where and when certain crimes are likely to happen in the future so police can stop them before they occur.
“We’re doing a rigorous examination, an experiment, for the next three months of predictive analytics and for the first time we’re going to rely 100 percent on the computer to forecast property crimes, which are the lion’s share of our crime,” he says. The experiment began 6 November.
Malinowski says he’s willing to make some sacrifices in terms of control if it means reducing crime in his jurisdiction.
“That’s unusual for me to do because, as a [commanding officer], I like to be in control of things, especially the mission,” he says. “But I’m going to give that up and I’m going to let the computer generate the geographic assignment of the missions.”
Across the country, police departments must fight crime in the face of shrinking budgets and manpower. But in Los Angeles and Santa Cruz, California, the police departments are turning to new technologies like predictive analytics, which save time and money by letting them deploy patrol officers more effectively to prevent crime.
The LAPD and the Santa Cruz Police Department are using a crime-fighting tool developed by researchers – social scientists and mathematicians – at the University of California Los Angeles (UCLA) to target property crimes such as home and business burglaries, as well as vehicle thefts and break-ins.
Like predicting earthquakes
The tool, which identifies criminal hotspots, is modeled on a mathematical algorithm used to predict earthquakes and their aftershocks. The researchers discovered that, just as aftershocks cluster close to the initial earthquake, criminals tend to commit crimes close to past crimes.
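The aftershock analogy can be illustrated with a toy "self-exciting" rate model: each past crime temporarily raises the predicted rate of crime nearby, on top of a constant background rate. This is a minimal sketch of the idea, not the UCLA team's actual algorithm; all parameter values here are made up for illustration.

```python
import math

def intensity(t, x, y, past_events, mu=0.1, theta=0.5, omega=1.0, sigma=0.05):
    """Toy conditional intensity: a background rate plus decaying
    'aftershock' boosts from past crimes (closer in time and space
    means a larger boost). Parameters are illustrative, not fitted."""
    rate = mu  # background rate of new, unrelated crimes
    for (ti, xi, yi) in past_events:
        if ti < t:
            dt = t - ti
            d2 = (x - xi) ** 2 + (y - yi) ** 2
            # exponential decay in time, Gaussian decay in space
            rate += (theta * omega * math.exp(-omega * dt)
                     * math.exp(-d2 / (2 * sigma ** 2))
                     / (2 * math.pi * sigma ** 2))
    return rate

# A burglary yesterday at (0, 0) raises today's predicted rate nearby,
# while a distant location stays near the background rate:
events = [(0.0, 0.0, 0.0)]
near = intensity(1.0, 0.01, 0.01, events)
far = intensity(1.0, 2.0, 2.0, events)
```

Ranking locations by this kind of intensity is what lets a program nominate the places most "at risk" on a given day.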
The technology grew out of a long-standing UCLA-based project looking at the mathematics of crime, says P. Jeffrey Brantingham, one of the UCLA researchers and an associate professor of anthropology at the school. For the first six years of the seven-year project, researchers focused on figuring out which models do a good job of explaining how and why crime patterns form the way they do. Now that they’ve developed those models, the researchers are putting them into practice.
“We’re now testing these predictive analytics in the field and we launched a controlled, randomized trial in LA earlier this month,” Brantingham says.
The theory is that predictive analytics might work better on property crimes because the targets are stationary and their nature doesn’t change much over time, he says, unlike crimes whose victims are mobile and change their behavior.
Criminologists find it’s easier to predict these types of crimes because there are patterns regarding where and when they occur. For example, burglaries tend to be clustered in terms of time and location and the individuals committing these crimes tend to have predictable patterns – usually they commit them somewhere near their homes or near familiar locations.
Additionally, property crimes are not easily displaced: if police departments target these crimes in particular areas, the criminals won’t simply move two miles to another location.
Zach Friend, a crime analyst at the Santa Cruz Police Department, says his department is the “operational test case agency” for the system, although Santa Cruz didn’t set its program up as a controlled experiment, as did the LAPD.
“What [the researchers] did before was just test crime data, but we were actually willing to test it in the field,” he says.
Friend says data from the department’s records management system is fed into the computer program on a daily basis, and then transferred to Microsoft Excel software where it’s cleaned, ordered and geocoded. Next, the data is combined with a master Excel database of all pertinent crimes for the past seven years and run through the UCLA algorithm.
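The daily workflow Friend describes (export, clean, geocode, merge into a multi-year master set, then run the algorithm) is a standard ETL loop. The following is a hedged sketch of such a pipeline in Python; the column names, the dummy coordinates, and the helper functions are hypothetical, and a real system would call an actual geocoding service rather than the placeholder here.

```python
import csv
from datetime import datetime

def load_and_clean(path):
    """Read a daily export from the records system, drop incomplete rows,
    and parse dates. Column names here are hypothetical."""
    rows = []
    with open(path, newline="") as f:
        for rec in csv.DictReader(f):
            if not (rec.get("crime_type") and rec.get("address")):
                continue  # skip records missing fields the model needs
            rec["date"] = datetime.strptime(rec["date"], "%Y-%m-%d")
            rows.append(rec)
    return rows

def geocode(rec):
    """Placeholder: a real pipeline would resolve rec['address'] to
    latitude/longitude via a geocoding service."""
    rec["lat"], rec["lon"] = 36.97, -122.03  # dummy Santa Cruz coordinates
    return rec

def update_master(master, daily):
    """Append today's cleaned, geocoded records to the multi-year master
    set that is then fed to the prediction algorithm."""
    master.extend(geocode(r) for r in daily)
    return master
```

Recalibrating daily, as Friend describes, just means re-running the algorithm against the updated master set each morning.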
Hotspots on Google Maps
“We recalibrate on a daily basis and the algorithm produces 10 Google hotspot maps every day of approximately 500 feet by 500 feet where burglary or vehicle theft is likely to occur in our city on that day,” he says.
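Producing "10 hotspot maps of roughly 500 feet by 500 feet" amounts to laying a grid over the city, scoring each cell, and keeping the top 10. Here is a minimal sketch of that selection step, assuming hypothetical risk scores have already been computed per cell:

```python
from collections import Counter

CELL_FT = 500  # side length of a hotspot cell, per the description above

def cell_of(x_ft, y_ft):
    """Map a point (in feet from a city origin) to its 500 ft grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def top_hotspots(scores, k=10):
    """Given a dict mapping (col, row) grid cells to predicted risk
    scores, return the k highest-risk cells -- the day's hotspot list."""
    return [cell for cell, _ in Counter(scores).most_common(k)]

# Three cells with made-up risk scores; the riskiest come first:
scores = {cell_of(100, 100): 0.9,
          cell_of(800, 100): 0.4,
          cell_of(100, 900): 0.7}
hotspots = top_hotspots(scores, k=2)
```

Each selected cell could then be rendered as a shaded square on a Google map for the morning roll call.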
Officers are given the hotspot maps at roll call. The officers check those areas during their “free” patrol time, when they’re not tied up on other calls, and they document their activities for tracking purposes. Because the city of Santa Cruz is only 13 square miles, the hotspot maps significantly reduce the area that officers need to patrol.
“Law enforcement in the past has taken a reactive approach to enforcement – if crime occurs in one place you need to go to that place,” Friend says. “This is breaking that mold. You don’t necessarily have to go there. Maybe it will send you to a separate location to prevent the next crime from occurring.”
The point of predictive policing is not to make arrests but to keep the targeted crimes from happening in the first place. And it seems to be working in Santa Cruz.
“In the first month, July 2011, the only variable we introduced was the application of this model and there was a 27 percent reduction year-over-year of the targeted crime types, because there was a police presence in the area where maybe there wouldn’t have been a police presence at all,” Friend says.
The SCPD just finished its three-month analysis of the algorithm and the department learned a couple of things: there isn’t enough crime in Santa Cruz to make a definitive statement about causality, but there was a correlation between the number of extra checks the officers ran in the hotspot areas and a reduction in the crime types the department was targeting.
“So for every extra 50 checks we ran in the city per week we found a two percentage-point decrease in the targeted crime types,” Friend says. “The predictions [based on the algorithm] where crimes will occur are 10 times more accurate than if you let an officer go where he wants to go.”
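Friend's figure is a linear relationship, which makes the arithmetic easy to check. A one-line helper makes the extrapolation explicit; keep in mind this is a reported correlation, not proven causation, and the linearity is an assumption for illustration:

```python
def expected_reduction(extra_checks_per_week, pp_per_50_checks=2.0):
    """Linear extrapolation of the Santa Cruz correlation: every 50 extra
    hotspot checks per week corresponded to roughly a 2 percentage-point
    drop in the targeted crime types (correlation, not causation)."""
    return extra_checks_per_week / 50.0 * pp_per_50_checks

# 150 extra checks per week would extrapolate to a 6-point drop:
drop = expected_reduction(150)
```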
Malinowski isn’t impressed with the Santa Cruz department’s methodology.
“Santa Cruz will have a difficult time making a scientific claim that the [computer] forecast contributed to a reduction in crime, because I think they had very little in the way of crime analysis before,” he says. “And they didn’t set it up as an experiment. It takes a little more time and effort to do it the way we’re going to do it.”
Malinowski, who explains that his station is the only one in the LAPD currently engaged in the experiment, wants to be able to tell his counterparts at other LAPD stations that he went strictly by the computer forecast and realized, say, an additional 2 percent, 3 percent or 4 percent reduction in property crimes.
“We’re experimenting and we’ll see how it goes and if it will answer the questions: ‘Does the forecast add value to the process of assigning missions for patrol?’ and ‘Will it give us some information on how many officers we need in a certain part of our jurisdiction and for how long?’ and ‘Will it make an impact on property crime in a certain very small geographic space like a block?’ We’re going to be collecting data as well so we’ll be able to track that,” he says.
Malinowski says LAPD Chief Charlie Beck and former LAPD Chief William Bratton both support using predictive analytics to inform the department’s decision-making in fighting crime because they know that it’s getting harder and harder to cut crime rates further.
Crime is down so dramatically in the Foothills “that we’re victims of our own success in some way,” he says. “Take burglary of a motor vehicle: [We’re] down 25 percent year-to-date, so what else can I do? I’ve pretty much exhausted my arsenal, so if I want to eke out a couple more percentage points, then it looks like I have to use the data to do that.”
Malinowski says at some point he may consider a commercial product, but for now the easiest thing to do is work with the UCLA researchers because they come with their own government funding. And unlike the vendors he’s talked to, who are in it for the profit, the researchers’ motives are “more pure.”
It’s a win-win, he says. “We give the researchers the data and we’re a real-world laboratory for the researchers [and it doesn’t cost us anything].”
But there may be a small downside. Malinowski acknowledges that the patrol officers who are assigned to do crime analysis worry that they’ll be replaced by the new system.
“It’s difficult for people to get their heads around the fact that the computer could generate these specific geographic locations where crimes are most likely to occur,” he says. “And it’s hard because they feel there’s a lot of special knowledge that they can bring to the forecast that the computer can’t.”
But the bottom line for Malinowski is to deny the criminal the opportunity to commit the crime he intended to commit. “He doesn’t get arrested and we don’t spend time booking him,” the commanding officer says, “and someone doesn’t get his laptop stolen out of his car.”