
Analyzing Acoustic Gunshot Detection System Data

Analyzing AGDS data serves at least four purposes:†

1. Understanding the dynamics of gunfire incidents

2. Optimizing response times to gunfire incidents

3. Determining whether the system is accurate

4. Understanding whether the system helps reduce underlying gunfire problems

† See Appendix B for further discussion of the technical issues associated with AGDS data.

Gunfire Incidents

AGDS data not only indicate the possibility that gunshots have occurred but, when thoroughly analyzed, can also reveal the number of shots, guns, locations, and shooters. Knowing that multiple shooters at different locations fired guns can indicate an active shooting incident, and this knowledge can raise responding officers’ situational awareness.

Figure 3 shows a variety of acoustic gunshot alert wave patterns. Pattern 1 shows multiple rounds fired from the same location and firearm; each spike in the wave pattern coincides with the initial muzzle blast, establishing a count for the total number of rounds fired. Pattern 2 begins in a similar fashion, but just past halfway, additional shooters join in and create a messier pattern of peaks. Pattern 3 shows two fully automatic bursts of gunfire, which are characterized by closely spaced peaks. Pattern 4, the reverse of Pattern 2, begins with multiple shooters firing together and ends with a three-round burst. Pattern 5 shows two shooters at different locations. It is a bit difficult to see, but the peaks at the beginning and end are one shooter, whereas the low peaks starting about halfway reveal another shooter at considerable distance from the first; incidents of this type are probably more likely to involve injuries because they point to an exchange of gunfire. Pattern 6 shows ambient noise interference with only two actual gunshots. Exploring both the sounds and wave patterns can provide increased situational awareness and may also assist in investigations. Both responders and investigators should be trained to recognize the patterns and their potential meaning. Analyzing the sounds can reveal the number of shooters and the number of rounds fired by each shooter. Such information can be crucial in verifying witness accounts and can lead to a more accurate accounting of victims and offenders. Knowing whether an incident involves one or multiple shooters can assist in locating all victims and evidence (e.g., casings).

Figure 3. Wave Patterns of Acoustic Alerts (Patterns 1–6)

Source: ShotSpotter.
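
As described above, each spike in a single-source wave pattern corresponds to a muzzle blast, so counting prominent peaks gives a rough count of rounds fired. The following is a minimal sketch of that idea in Python, assuming a mono audio clip exported from the system; the file name, prominence threshold, and 50-millisecond minimum spacing are illustrative assumptions that would need tuning against real recordings.

```python
# Sketch: counting probable muzzle-blast peaks in an exported gunfire audio clip.
# Assumes a mono waveform; the file name, prominence threshold, and 50 ms minimum
# spacing are illustrative values that would need tuning to real recordings.
import numpy as np
from scipy.io import wavfile
from scipy.signal import find_peaks

sample_rate, audio = wavfile.read("alert_clip.wav")  # hypothetical exported clip
envelope = np.abs(audio.astype(float))

# Require peaks to stand out from ambient noise and to be at least 50 ms apart,
# so echoes and reverberation are not double-counted as separate rounds.
peaks, _ = find_peaks(
    envelope,
    prominence=0.5 * envelope.max(),
    distance=int(0.05 * sample_rate),
)
print(f"Estimated rounds fired: {len(peaks)}")
```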

Response Times

One of the easiest things to do with AGDS data is to compare response times for AGDS alerts with those for community-reported gunfire incidents (i.e., “shots fired” calls for service).44 Ideally, the technology will decrease the time between gunfire detection and police dispatch, but because of its better geographic accuracy, it may also increase investigative time. Because response times in call-for-service data often have a high number of outliers, you should use median-based measures to test for significance.45 It is also important to determine which parts of the response (dispatch, travel, or investigation) are most affected in order to understand how AGDS might improve response times. Officer travel times are less likely to be affected by AGDS unless the response priority between the two types of calls differs substantially. The time officers take to investigate gunshot incidents, by contrast, should ideally increase because they typically have more information about the shooting than is available from a community call for service. Such measures can provide important feedback on how well your agency’s response procedures are implemented in practice. An important caveat is that most CAD systems do not record the time of the gunfire; they record the time the alert was received from the vendor or from residents.
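
One way to run this comparison is sketched below, assuming a CAD export with hypothetical source, dispatch_time, and arrival_time fields; Mood’s median test is used here as one median-based significance test, but other robust tests could serve equally well.

```python
# Sketch: comparing response times for AGDS alerts vs. "shots fired" calls for service.
# The file and column names (source, dispatch_time, arrival_time) are hypothetical;
# adjust them to your CAD export.
import pandas as pd
from scipy.stats import median_test

cad = pd.read_csv("gunfire_responses.csv", parse_dates=["dispatch_time", "arrival_time"])
cad["response_min"] = (cad["arrival_time"] - cad["dispatch_time"]).dt.total_seconds() / 60

agds = cad.loc[cad["source"] == "AGDS", "response_min"].dropna()
cfs = cad.loc[cad["source"] == "CFS", "response_min"].dropna()

# Medians are preferred because a few very long responses skew the average.
print("Median minutes (AGDS):", agds.median(), "| Median minutes (CFS):", cfs.median())

# Mood's median test checks whether the two groups share a common median.
stat, p_value, grand_median, table = median_test(agds, cfs)
print("p-value:", p_value)
```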

System Accuracy

AGDS data can also be used to determine how accurately the system reports gunfire. Although one study46 indicates that the majority of AGDS notifications in Chicago had no actionable results upon initial investigation, this conclusion confuses finding evidence of a crime with determining the accuracy of the AGDS. Calls for service data are not designed to determine criminal wrongdoing, and multiple data sources may need to be consulted for you to understand the investigative outcomes of acoustic alerts. False positives in AGDS data certainly exist; however, detecting and enumerating them with quantitative data alone is difficult, and officers would be required to thoroughly investigate each alert to find the source of the noise, which would be impractical.

Calculating false negatives, in contrast, is straightforward. Determining the number of gun violence incidents missed by AGDS is also more significant because false negatives leave police unaware of potentially dangerous gunfire; thus, false negatives provide more insight into the value of the system. As indicated above, AGDS typically miss around 20 percent of true gunfire cases.47 An agency can estimate its false negative rate by working backward from crime incident data and identifying reported outdoor aggravated assaults and homicides involving gunfire victims. Those incidents can then be matched to nearby acoustic gunshot detections. Reading the narratives of the incidents provides the greatest accuracy because some victims may have been at the scene for a long time or may have been shot elsewhere but collapsed at the scene, which can make it difficult to find a matching AGDS alert.
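
A minimal sketch of this backward-matching approach follows, assuming incident and alert files with hypothetical column names and illustrative 10-minute and 200-meter matching windows; as noted above, narrative review is still needed for incidents where the victim was moved or found long after the shooting.

```python
# Sketch: estimating the AGDS false negative rate by matching reported outdoor
# gunshot-injury incidents (aggravated assaults and homicides) to nearby acoustic alerts.
# File names, column names, and the 10-minute / 200-meter windows are illustrative.
import numpy as np
import pandas as pd

incidents = pd.read_csv("gun_injury_incidents.csv", parse_dates=["occurred_at"])
alerts = pd.read_csv("agds_alerts.csv", parse_dates=["alert_time"])

def has_matching_alert(row, max_minutes=10, max_meters=200):
    """Return True if any acoustic alert falls within the time and distance windows."""
    minutes = (alerts["alert_time"] - row["occurred_at"]).abs().dt.total_seconds() / 60
    # Rough planar distance in meters (adequate at city scale).
    dx = (alerts["lon"] - row["lon"]) * 111_320 * np.cos(np.radians(row["lat"]))
    dy = (alerts["lat"] - row["lat"]) * 110_540
    meters = np.sqrt(dx**2 + dy**2)
    return bool(((minutes <= max_minutes) & (meters <= max_meters)).any())

incidents["matched"] = incidents.apply(has_matching_alert, axis=1)
print(f"Estimated false negative rate: {1 - incidents['matched'].mean():.1%}")
```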

Underreporting of Gunfire

In addition to the immediate investigative use of acoustic gunshot data, the data can support police problem-solving. Problem-solving approaches rely on data analysis to identify hotspots of gunfire alerts. Such sites can be selected for additional preventive actions. Beyond additional patrols, these actions can also involve addressing nuisance properties and altering the immediate environment (adding lighting, removing trash, etc.). Few agencies currently use AGDS data in such a capacity, but a problem-oriented approach was part of East Palo Alto’s SPI project.48 The department responded to AGDS hotspots with two approaches: (1) additional patrols and searches and (2) community education and outreach. Results indicate that the project may have reduced shootings in the targeted areas by 52 percent (compared with a 41 percent reduction citywide), but the implementation was inconsistent, so the outcomes cannot be thoroughly assessed. AGDS data are not widely used in directing problem-solving approaches, partly because these data sit outside of normal police data systems, such as CAD or records management systems, and so are not routinely analyzed.
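
One simple way to nominate candidate hotspot locations is to bin alert coordinates into grid cells and rank the cells by alert volume, as in the sketch below; the roughly 150-meter cell size and the lon/lat column names are illustrative assumptions, and dedicated hotspot tools (e.g., kernel density estimation) may be preferable in practice.

```python
# Sketch: ranking grid cells by AGDS alert counts to nominate candidate hotspots.
# The ~150 m cell size and the lon/lat column names are illustrative choices.
import pandas as pd

alerts = pd.read_csv("agds_alerts.csv")

cell_deg = 0.0015  # roughly 150 m of latitude; adjust for your coordinate system
alerts["cell_x"] = (alerts["lon"] // cell_deg).astype(int)
alerts["cell_y"] = (alerts["lat"] // cell_deg).astype(int)

hotspots = (
    alerts.groupby(["cell_x", "cell_y"])
    .size()
    .sort_values(ascending=False)
    .head(10)  # ten cells with the highest alert volume
)
print(hotspots)
```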

Another metric that can be examined with AGDS data is the level of underreporting of gunfire by citizens. Underreporting can be assessed by determining what proportion of acoustic incidents are also reported by a citizen. This can be done expediently in a spatial software package (such as ArcGIS, using the “find space/time matches” function), but some arbitrary choices must be made about what constitutes a “good enough” match, which depends on the amount and spatial density of such calls for service. The implementation of AGDS has been associated with reductions in residential calls for “shots fired,” which suggests either that citizen reports of gunshots are preempted by an improved police response or that residents now rely on the system to bring police to the community.49 Exploring underreporting can be useful in comparing neighborhoods’ willingness or ability to report such offenses, and it may help your agency develop targeted publicity campaigns to encourage citizen reporting.
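
Once alerts have been matched to citizen calls, the underreporting comparison itself is a simple proportion, as in the sketch below; the matched export and its neighborhood and citizen_call fields are hypothetical.

```python
# Sketch: comparing citizen reporting rates of gunfire across neighborhoods.
# Assumes each alert has already been flagged (e.g., with space/time matching as above,
# or with ArcGIS's matching output) as having a nearby citizen call; fields are hypothetical.
import pandas as pd

alerts = pd.read_csv("agds_alerts_matched.csv")  # includes 'neighborhood' and 'citizen_call'

reporting_rate = (
    alerts.groupby("neighborhood")["citizen_call"]
    .mean()         # share of alerts that also drew a citizen report
    .sort_values()  # lowest-reporting neighborhoods first
)
print(reporting_rate.head(10))
```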

Accurate systems, however, are not necessarily effective ones. Increases in arrests or gun recoveries are only means toward the ultimate objective of reducing illegal gunfire and gun violence. Though the numbers of arrests and gun recoveries are typically easy to count, they are also driven by the amount of effort police put into achieving them, which can vary over time. Furthermore, arrests and gun recoveries are not necessarily the direct result of the AGDS and might result from other reporting and investigative methods.

Reductions in illegal gunfire are also difficult to measure because the implementation of AGDS may itself affect citizen reporting behavior. As a result, determining whether any reductions are real declines or just changes in reporting behavior is difficult. Reductions in gun violence events are the most reasonable benchmark for efficacy, but because such events are rare, reductions are hard to detect statistically.

Because most acoustic systems require a substantial amount of contiguous coverage, running controlled experiments on AGDS is difficult, but evaluations should, at a minimum, include a comparable area without AGDS. In addition, when AGDS is initially set up in high-gunfire areas during high-gunfire periods, subsequent crime and gunfire reductions might merely be returns to normal levels that would have occurred even in the absence of AGDS (referred to as “regression to the mean” in statistical language). All AGDS evaluations should be done with care and consideration of what the data represent and the context in which the police implemented the systems. Collaborating with academic partners versed in experimental analysis can help your agency reach valid conclusions about the efficacy and cost-effectiveness of AGDS. The SPI program at BJA has funded several AGDS-related projects that benefited from such partnerships.50 Even without such funding streams, it makes financial sense to rigorously evaluate the impacts of AGDS.
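
A bare-bones before-and-after comparison against a non-AGDS area might look like the sketch below; the monthly count file and its area and period labels are hypothetical, and a real evaluation would need longer baselines and proper statistical modeling to address trends and regression to the mean.

```python
# Sketch: a simple difference-in-differences comparison of monthly gun violence counts
# in the AGDS area and a comparable non-AGDS area, before vs. after implementation.
# The file and its 'area' and 'period' labels are hypothetical; a real evaluation should
# use longer baselines and statistical models to address trends and regression to the mean.
import pandas as pd

counts = pd.read_csv("monthly_gun_violence_counts.csv")  # columns: area, period, incidents
means = counts.groupby(["area", "period"])["incidents"].mean().unstack("period")

change_agds = means.loc["AGDS_area", "after"] - means.loc["AGDS_area", "before"]
change_comp = means.loc["comparison_area", "after"] - means.loc["comparison_area", "before"]

print("Difference-in-differences estimate:", change_agds - change_comp)
```

Even a rough comparison like this is more informative than a simple before-and-after count in the AGDS area alone, though it is no substitute for the partnered evaluations described above.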
