The AI Police
A new company makes artificial intelligence software that's in use at a handful of police departments. Can it make law enforcement more transparent?
There's a story Brett Goldstein '96 likes to tell. It starts on a Friday night in 2010 with him sitting in a darkened Crown Victoria on a Chicago street, poring over maps. Goldstein was a commander at the Chicago Police Department, in charge of a small unit using data analysis to predict where certain types of crimes were likely to occur at any time. Earlier that day, his computer models forecast a heightened probability of violence on a particular South Side block. Now that he and his partner were there, Goldstein was doubting himself.
"It didn't look like it should be a target for a shooting," he recalled. "The houses looked great. Everything was well manicured. You expect, if you're in this neighborhood, you're looking for abandoned buildings, you're looking for people selling dope. I saw none of that."
Still, they staked it out. Goldstein's wife had just given birth to their second child, and he was exhausted after a day in the office. He started to doze off. Goldstein's partner argued that the data must be wrong. At 11 p.m., they left.
Several hours later, Goldstein woke up to the sound of his BlackBerry buzzing. There had been a shooting, on the very block where he'd been camped out. "This sticks with me because we thought we shouldn't be there, but the computer thought we should be there," said Goldstein. He took the near-miss as vindication of his vision for the future of law enforcement. "I do believe in a policeman's gut. But I also believe in augmenting his or her gut," he said.
Seven years after his evening on the South Side, Goldstein threw on a gray suit and some aerodynamic sunglasses and headed out from his hotel in Midtown Manhattan into New Jersey. This spring, he founded CivicScape, a technology company that sells crime-predicting software to police departments. Nine cities are either using the software or in the process of implementing it, including four of the country's 35 largest cities by population. Departments pay from $30,000 a year in cities with fewer than 100,000 people to $155,000 a year in cities with populations exceeding 1 million. Goldstein wanted to check in on the two clients that were furthest along: the police departments in the New Jersey towns of Camden and Linden.
Goldstein likes to harp on his own lack of charisma, but he's well suited to be a pitchman for police departments. In Chicago, he rose from patrol officer to the city's chief data officer over a seven-year government career, and he regularly drops a few war stories from the streets into his conversations with cops. He's also peddling something that every department is after nowadays: technological sophistication. The criminal justice system produces reams of data, and new computing methods offer to turn any pool of numbers into something useful. Today, almost every major police department in the country is using or has used some form of commercial software that makes predictions about crime, whether to determine which blocks warrant a heightened police presence or even which people are most likely to be involved in it. Technology is transforming the craft of policing.
Not everyone is rubbing their hands in anticipation. Many police officers still see so-called predictive policing software as mumbo jumbo. Critics outside of law enforcement argue that it's actively destructive. The historical data these programs use to predict patterns of crime aren't a neutral recounting of objective fact; they're a reflection of socioeconomic disparities and the aggressive policing of black neighborhoods. Computer scientists have held up predictive policing as a poster child for how automated decision making can be misused. Others mock it as pseudoscience. "Systems that manufacture unexplained 'threat' assessments have no valid place in constitutional policing," wrote a coalition of civil rights and technology associations, including the ACLU, the Brennan Center for Justice, and the Center for Democracy & Technology, in a statement last summer.
A numbing progression of police shootings in the past several years serves as a reminder of what's at stake when police officers see certain communities as disproportionately threatening. Over the course of eight days in late June, juries failed to convict officers who killed black men in Minnesota, Ohio and Wisconsin. In each case, the officer's defense relied on his perception of danger. The worst-case scenario with predictive policing software is that it deploys officers to target areas with their guard already up, leading them to turn violent in what would otherwise be routine encounters.