Sensors Listen for Gunfire in Chicago. Does It Make a Difference?

In late November, the city of Chicago surpassed 600 homicides — the overwhelming majority from handguns — for the second year in a row, and for only the third time since 2003, according to data reported by The Chicago Tribune.

The homicides have come amid more than 3,400 shootings across the city so far this year.

Sadly, in their efforts to address this seemingly intractable epidemic of gun violence, police officials seem to be placing unfounded faith in technology to solve the problem. The Chicago Police Department (CPD), for example, has expanded its use not only of digital surveillance but also of predictive policing algorithms — even though many experts question the latter’s scientific validity, saying the approach does not reduce homicides, is inherently biased, and creates a new layer of profiling.

Now, the department is also expanding its use of acoustic gunfire detection and location technology, which uses audio sensors, GPS software, and machine learning algorithms to identify and locate gunfire as it happens. The system then feeds these gunfire alerts, along with approximate locations, to police commanders and officers in real time.


Officials claim these systems work because there have been fewer shootings since the technology was deployed in two police districts. But, as many of us learned in first-year statistics and science courses, correlation does not mean causation. The technology certainly detects gunfire, but there is no independent scientific research to demonstrate that it decreases gun violence in Chicago — or anywhere else.

The gunfire detection system used in Chicago was developed and marketed by ShotSpotter, a San Francisco Bay Area-based firm that has become the global leader since the technology was introduced about 20 years ago. ShotSpotter first arrived in Chicago in 2012 but was not fully operational until last year in two police districts — Englewood on the city’s South Side and Harrison on the West Side — which experience disproportionate levels of gun violence.

Since then, the technology has migrated to four other police districts. CPD recently announced that six more police districts would receive the technology by the end of next year, which will bring its deployment to about half of the city’s 22 patrol districts, according to local reports.

Gunfire detection employs an array of acoustic sensors — each about the size of a small toaster — placed on the highest local rooftops, utility poles, streetlights, and other structures to detect and locate the source of gunfire. The sensors are automatically triggered by the two unique audio characteristics of gunfire: the muzzle blast that follows the explosion inside the chamber and the sonic boom that occurs when the bullet travels at supersonic speed. The system pinpoints the location of the gunfire by triangulating the sound from three sensors: Machine learning algorithms factor in the speed of sound — about 1,087 feet per second at 32 degrees Fahrenheit — and the differences in the times at which the gunfire reaches three different sensors to compute where the shot was fired.
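
To make the arithmetic concrete, here is a bare-bones Python sketch of time-difference-of-arrival localization in two dimensions. It illustrates the general technique Showen describes, not ShotSpotter’s proprietary system; the sensor coordinates and arrival times are invented for the example, and a brute-force grid search stands in for the faster solvers a production system would use.

```python
# A bare-bones 2D illustration of time-difference-of-arrival (TDOA)
# localization, the general technique described above. It is NOT
# ShotSpotter's algorithm; the sensor positions and arrival times
# below are invented for the example.
import itertools
import math

SPEED_OF_SOUND = 1087.0  # feet per second at 32 degrees Fahrenheit

# Hypothetical rooftop sensor positions (feet) and the moments (seconds,
# on a shared clock) at which each detected the same muzzle blast.
# These times were generated for a shot fired at roughly (1200, 900).
sensors = [(0.0, 0.0), (2000.0, 0.0), (0.0, 2000.0)]
arrival_times = [1.3799, 1.1078, 1.4976]

def predicted_tdoa(point, sensor_a, sensor_b):
    """Arrival-time difference expected between two sensors
    if the shot were fired at `point`."""
    return (math.dist(point, sensor_a) - math.dist(point, sensor_b)) / SPEED_OF_SOUND

def residual(point):
    """Sum of squared mismatches between observed and predicted
    time differences across every pair of sensors."""
    error = 0.0
    for (i, sa), (j, sb) in itertools.combinations(enumerate(sensors), 2):
        observed = arrival_times[i] - arrival_times[j]
        error += (observed - predicted_tdoa(point, sa, sb)) ** 2
    return error

# Brute-force search over one square mile (5,280 ft per side) at 10-foot
# resolution; real systems solve the geometry analytically and much faster.
best = min(
    ((x, y) for x in range(0, 5281, 10) for y in range(0, 5281, 10)),
    key=residual,
)
print("Estimated shot location (feet):", best)  # lands on (1200, 900)
```

Run as written, the search recovers the invented shot location to within the 10-foot grid spacing.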

“We try to put [sensors] on the highest local building so that the sound goes to the horizon,” ShotSpotter co-founder and chief scientist Robert Showen said when we chatted over the phone in early November. “If four or more sensors respond, the system ‘votes’ on the three that have the most consistent arrival times” for the algorithm to use, explained Showen, who is a waveform physicist and acoustics expert.

Unlike the human ear, which is helpfully connected to the human brain, computers are not yet smart enough to quickly distinguish gunfire from other sounds, such as a car backfiring or fireworks, Showen said. Instead, the gunfire detection system is “trained” to recognize likely gunfire events, and the geocoded location and an audio snippet are automatically transmitted to ShotSpotter for human vetting. Once confirmed, the gunfire event is immediately relayed via push notifications and SMS alerts to police dispatch centers, shift commanders, and the smartphones of individual patrol officers. Gunfire can be detected, geocoded, vetted, and texted to police dispatchers in less than a minute from the time the trigger is pulled.
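
As a rough mental model of that flow (detection, automated classification, human vetting, notification), here is a schematic Python sketch. Every class and function name in it is hypothetical, invented for illustration; it mirrors the steps described above rather than reproducing ShotSpotter’s actual software.

```python
# A schematic sketch of the alert pipeline described above: automated
# classification, human vetting, then notification. Every name here is
# hypothetical and stands in for, rather than reproduces, the real software.
from dataclasses import dataclass
import time

@dataclass
class GunfireAlert:
    latitude: float
    longitude: float
    audio_clip: bytes   # short snippet surrounding the detected impulse
    detected_at: float  # epoch seconds when the sensors triggered
    confidence: float   # classifier's score that the sound is gunfire

def classify(audio_clip: bytes) -> float:
    """Stand-in for the trained model that separates gunfire from
    car backfires, fireworks, and other impulsive sounds."""
    return 0.97  # placeholder score for the example

def human_review(alert: GunfireAlert) -> bool:
    """Stand-in for the analyst who listens to the snippet and
    confirms or rejects the automated classification."""
    return alert.confidence > 0.9

def notify_police(alert: GunfireAlert) -> None:
    """Stand-in for the push-notification and SMS fan-out to dispatch
    centers, shift commanders, and patrol officers' phones."""
    elapsed = time.time() - alert.detected_at
    print(f"ALERT: probable gunfire near ({alert.latitude}, {alert.longitude}), "
          f"relayed {elapsed:.1f} seconds after detection")

def handle_detection(lat: float, lon: float, clip: bytes) -> None:
    alert = GunfireAlert(lat, lon, clip,
                         detected_at=time.time(),
                         confidence=classify(clip))
    if human_review(alert):  # only vetted events reach police
        notify_police(alert)

# Example: a detection at made-up coordinates near Englewood.
handle_detection(41.779, -87.644, b"...")
```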

The recognition of gunfire events — from detection to location to human vetting to the dispatched alerts — is similar to the facial recognition and artificial intelligence machine learning algorithms described in my previous “Convictions” columns.

A high-density urban environment such as Chicago requires up to two dozen strategically placed microphones per square mile. The system’s accuracy is generally within 65 feet in cities, as opposed to 10 feet in flat, open spaces, because acoustic waves “bend” around buildings, said Showen, who previously worked at SRI International and the Arecibo Observatory in Puerto Rico. ShotSpotter claims that cities that use it see illegal gunfire plummet by an average of 35 percent within two years.

Local media and some national outlets have reported on the expansion of gunfire detection technology in Chicago. But there is little in the way of independent research supporting ShotSpotter’s claims about decreasing gun violence — or any other positive outcomes for that matter. This may be why the Chicago Police Department ignored my email requests to interview the department’s chief technology officer to discuss ShotSpotter’s outcomes and its expansion in Chicago.

Jennifer Doleac is one of the very few academics who have studied ShotSpotter. She’s an assistant professor of public policy and economics at the University of Virginia who specializes in the economics of crime and how technology impacts public safety. “There is very good evidence that this technology detects gunfire,” she told me when I asked her about the Chicago Police Department’s claims that ShotSpotter was behind the decrease in shootings. “But does it have an impact on anything that we care about, such as saving lives by getting victims to the hospital faster, clearing more cases, reducing crimes, or decreasing gun violence? We have no evidence.”

More than 90 cities nationwide use ShotSpotter despite the lack of independent analysis. In fact, there are hints that the technology doesn’t deliver meaningful benefits. The Charlotte-Mecklenburg Police Department ended its four-year contract in February 2016 because it “wasn’t successful in identifying or prosecuting the people who fired the shots picked up by the system,” according to a memo obtained by The Charlotte Observer.

True evidence for or against ShotSpotter’s ability to decrease gun violence is lacking for one key reason: The company’s data is proprietary and closely guarded. Doleac, who also directs UVA’s Justice Tech Lab, was able to obtain ShotSpotter data in two cities — Washington, D.C. and Oakland, California — through the Freedom of Information Act to study the impact of juvenile curfew laws and reported versus actual levels of gunfire.

The data that Doleac obtained confirms what police and violence researchers have said for decades, namely, that only a fraction of gunfire is reported to 911. Only about 12 percent of all gunfire was reported to police in D.C. and Oakland, according to a study she conducted with Purdue economist Jillian Carr.

At the very least, one could expect that gunfire detection systems could decrease police response times, Doleac told me when we chatted in November. But it’s difficult to home in on this or any other specific outcome because ShotSpotter owns the gunshot data that its sensors collect in each city. Clients are prohibited from releasing data to the public, news media, or researchers.

Doleac says her two successful FOIA requests were outliers and flew under the radar.


This is particularly frustrating, Doleac said, because these data would be a goldmine for gun violence researchers. “One of the biggest challenges in that area of research is the poor quality of the data,” she said, referring to the federal government’s 20-year-plus ban on funding gun violence research. A cache like ShotSpotter’s, with precise timestamp and location information, could help determine the extent of underreported gun violence, pinpoint “hotspots,” identify some types of firearms being used, flag shootouts in progress, measure police and ambulance response times, and more. “We need better data to [determine] which policies reduce gun violence. These data are perfect, and taxpayers are already paying to collect it.”

Gunfire detection systems are not cheap. ShotSpotter costs Chicago about $1.5 million per police district, paid for by a combination of public and private funding, according to local reports. That’s $18 million for the 12 police districts where the technology is operational or targeted for installation. The city is also expanding its video surveillance network of more than 2,700 cameras. Targeting violence-prone communities with resources for digital surveillance and high-tech policing might seem like a good thing. But this embrace of big data and “Big Brother” doesn’t address the many structural barriers and socioeconomic disparities that contribute to the cycles of violence concentrated among younger African American men in these communities, such as poverty, massive levels of unemployment, deindustrialization, housing insecurity, mental health challenges, and racism.

“I have a huge concern with these technologies because corporations are benefitting and the community suffers,” said Lance Williams, a public health scientist and associate professor at Northeastern Illinois University’s Carruthers Center for Inner City Studies. “I don’t want to be negative toward any tools that can reduce violence. But those are resources — [millions] of dollars — that could have been directed to these communities and young African American men for education.”

Williams makes a point that has often been minimized or ignored: The body count in communities with heartbreaking rates of gun violence will keep growing until policymakers and law enforcement address some of the many socioeconomic challenges their residents face. Technologies like gunfire detection, predictive policing, video surveillance, and police body cameras are impressive tools, and they may be part of a solution. But their worth must be determined empirically, through research that tracks measurable benefits — not just to law enforcement, but also to the vitality of these troubled neighborhoods.


Rod McCullom is a Chicago-based science journalist and senior contributor to Undark who reports on the intersection of science, medicine, race, sexuality, and poverty. His work has been published by Scientific American, Nature, The Atlantic, The Nation, and MIT Technology Review, among other publications.