Appendix B: Using Acoustic Data
Some clarification on the nature of acoustic alert data is necessary, especially for those interested in conducting AGDS research. Acoustic alert data do not represent all gun violence, since only a fraction of cases end up being linked to serious assaults and homicides. Nor do AGDS notifications necessarily indicate criminal behavior, since they may capture justified uses of firearms, including by police. It is reasonable, however, to conclude that AGDS notification data represent outdoor firearm use. It is also true that AGDS data from multiple cities may not be comparable, because the context in which firearm discharges occur (such as population density), the local availability of firearms (and the legality of carrying them), and the opportunities for firearm use (such as the number of vacant properties that offer opportunities for target practice) may differ substantially. Data from different AGDS systems are also unlikely to be comparable, and careful contextualization is necessary to foster meaningful comparisons. For example, some systems produce an alert for each round, whereas others group all gunshots that occur without a significant pause between them (typically 5 or more seconds) into one alert. Even in the latter case, multiple alerts may result from one criminal incident.
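To illustrate, the sketch below shows how a pause-based grouping rule of that kind can turn individual shot timestamps into alerts. The 5-second threshold reflects the typical value noted above; the timestamps themselves are invented for illustration and do not come from any system.

```python
from datetime import datetime, timedelta

# Hypothetical shot timestamps (illustration only, not real sensor data).
shot_times = [
    datetime(2024, 1, 1, 23, 55, 20),
    datetime(2024, 1, 1, 23, 55, 21),
    datetime(2024, 1, 1, 23, 55, 23),
    datetime(2024, 1, 1, 23, 55, 35),  # more than 5 seconds after the previous shot
]

PAUSE = timedelta(seconds=5)  # assumed grouping threshold

alerts = []                   # each alert is a list of shot timestamps
current = [shot_times[0]]
for t in shot_times[1:]:
    if t - current[-1] > PAUSE:
        alerts.append(current)  # pause exceeded: close the current alert
        current = [t]
    else:
        current.append(t)
alerts.append(current)

print(f"{len(shot_times)} shots grouped into {len(alerts)} alerts")
# -> 4 shots grouped into 2 alerts
```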
Despite these caveats, police-operated AGDS almost invariably provide data that can be used within your agency. These data can typically be pulled directly from the system itself. Though some vendors “own” the data from their systems, their user policies allow police to use the information for investigative and internal research purposes. Restrictions on use generally apply only to sharing data outside your agency, so your agency must understand the terms of its contract with the vendor. It is also important to understand that once an alert becomes a logged call for service, the call-for-service data are not subject to the vendor terms and can be more easily shared with external parties.
Nonetheless, vendor and computer-aided dispatch (CAD) gunfire data differ significantly. In most cases, vendor data include the near-exact time the gunfire occurs and provide exact geospatial coordinates of the likely point of origin. Additional fields, such as weather conditions and round counts, may be attached as well. CAD data may mirror the time and location, but often less precisely.
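The sketch below contrasts the two record types. The field names are hypothetical rather than any vendor's or agency's actual schema, but they capture the general difference in precision described above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical field layouts; actual vendor and CAD schemas vary by system and agency.

@dataclass
class VendorAlert:
    alert_id: str
    fired_at: datetime                 # near-exact time of the gunfire itself
    latitude: float                    # estimated point of origin
    longitude: float
    round_count: Optional[int] = None  # may be attached by the vendor
    weather: Optional[str] = None      # likewise optional

@dataclass
class CadRecord:
    event_number: str
    received_at: datetime              # when the notification was entered into CAD
    dispatch_address: str              # nearest address, not exact coordinates
    call_type: str = "SHOTS FIRED"
```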
To give a more concrete example, consider the differences between how an AGDS generates data and how those data are typically coded in CAD. Suppose five gunshots ring out at 23:55:20 in the backyard of a house at 1432 Main Street. The AGDS finishes its review 48 seconds later (23:56:08) and pushes the notification to the dispatch center, where it is entered into the CAD system 90 seconds after that (23:57:38). Because this agency dispatches to addresses, and because the backyard of 1432 Main Street is deeper than that of the abutting neighbor, the closest address to the point of origin ends up being 1433 2nd Street, one block from Main Street. As a result, the CAD data do not match the exact time of the gunfire and may contain small spatial discrepancies. Relying on the vendor data is important for investigative purposes because these data provide the most exact time and location. General research (internal and external to an agency) can comfortably rely on CAD data, but users should be aware of their limitations.
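The offsets in this scenario can be computed directly. The short sketch below simply works out the lag between the actual gunfire, the pushed alert, and the CAD entry using the times given above.

```python
from datetime import datetime

# Timeline from the scenario above (dates are placeholders).
gunfire_time   = datetime(2024, 1, 1, 23, 55, 20)  # shots actually fired
alert_pushed   = datetime(2024, 1, 1, 23, 56, 8)   # AGDS review complete
cad_entry_time = datetime(2024, 1, 1, 23, 57, 38)  # entered into CAD

review_lag = (alert_pushed - gunfire_time).total_seconds()    # 48 seconds
entry_lag  = (cad_entry_time - alert_pushed).total_seconds()  # 90 seconds
total_lag  = (cad_entry_time - gunfire_time).total_seconds()  # 138 seconds

print(f"Review lag: {review_lag:.0f} s, CAD entry lag: {entry_lag:.0f} s, "
      f"total offset from actual gunfire: {total_lag:.0f} s")
```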
Some AGDS do not forward all captured noises to dispatch. An AGDS from one of the leading vendors, for example, will forward confirmed gunfire to dispatch, but it also captures two other classes of data that are not forwarded and thus do not appear in CAD records. First, gunfire detected outside coverage areas can be found in the AGDS data portal, but its locational accuracy is limited and the incidents are not reviewed, meaning they likely include false positives. Second, noises that initially trigger the AGDS but are classified by the machine or a reviewer as nongunfire can also be found in the AGDS portal. In rare instances, true gunfire can be found among these dismissed cases. For investigative purposes, reviewing both sources of data is essential, especially if gunfire is known to have occurred. For researchers, nongunfire data are likely of little value.
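For analysts pulling from a vendor portal, separating these classes is usually a simple filtering step. The sketch below assumes a portal export with hypothetical classification and coverage columns; actual field names and values will differ by vendor.

```python
import pandas as pd

# Hypothetical portal export; column names are assumptions, not a vendor schema.
portal = pd.DataFrame({
    "alert_id":       ["A1", "A2", "A3", "A4"],
    "classification": ["gunfire", "gunfire", "nongunfire", "gunfire"],
    "in_coverage":    [True, True, True, False],
})

forwarded   = portal[(portal["classification"] == "gunfire") & portal["in_coverage"]]
dismissed   = portal[portal["classification"] != "gunfire"]  # rarely contains true gunfire
out_of_area = portal[~portal["in_coverage"]]                 # unreviewed, less accurate

print(len(forwarded), "alerts would appear in CAD;",
      len(dismissed) + len(out_of_area), "would appear only in the portal")
```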
Anyone analyzing acoustic data must recognize that raw AGDS data are not necessarily incident-level data, but rather event-level data describing a discrete number of gunshots. For example, a homicide or assault in which multiple shooters fired from multiple locations can lead to multiple AGDS alerts. As mentioned above, different systems can either report each noise separately or, more typically, group them into defined time windows. Both approaches can generate a fair number of duplicate alerts for what are technically elements of the same incident, thereby inflating the perceived number of gunfire incidents. For example, a single homicide may be associated with three or four acoustic alerts. Many agencies label such cases as duplicates in their CAD system, but the thoroughness of that labeling can vary. For this reason, any aggregate analysis of AGDS data should include a procedure for handling duplicates.
There is no set guidance on how duplicates should be handled,85 since the process may depend on the nature of the coverage areas (density, grid layout, etc.), but it is reasonable to combine cases that occur within 5 minutes and 500 feet of one another. The easiest way to do so is to use spatial software such as ArcGIS (using the “find space and time matches” function) to identify duplicate cases. Duplicates can be discarded if one is strictly interested in the number of incidents, or they can be aggregated if one is interested in the average number of rounds per incident. Recognizing the limitations of the acoustic systems in place is critical. Because each system has unique features and is deployed in a unique policing context, assessing the relative reliability of the system and its data is important before undertaking thorough analytical assessments.
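For analysts working outside ArcGIS, a minimal space-time deduplication can also be scripted directly. The sketch below is one possible approach under stated assumptions: alerts carry a timestamp and projected x/y coordinates in feet (for example, a State Plane projection), the column names are illustrative, and the 5-minute, 500-foot rule of thumb from above is applied.

```python
import pandas as pd

# A minimal space-time deduplication sketch (an alternative to the ArcGIS tool).
# Assumes each alert has a timestamp and projected x/y coordinates in feet;
# column names ("alert_time", "x", "y", "round_count") are illustrative.
TIME_THRESHOLD = pd.Timedelta(minutes=5)
DIST_THRESHOLD = 500.0  # feet

def assign_incidents(alerts: pd.DataFrame) -> pd.DataFrame:
    """Add an incident_id column, merging alerts within 5 minutes and 500 feet."""
    alerts = alerts.sort_values("alert_time").reset_index(drop=True)
    incident_ids = []
    next_id = 0
    for i in range(len(alerts)):
        row = alerts.loc[i]
        matched = None
        # Walk backward through earlier alerts until the time gap grows too large.
        for j in range(i - 1, -1, -1):
            prev = alerts.loc[j]
            if row["alert_time"] - prev["alert_time"] > TIME_THRESHOLD:
                break
            dist = ((row["x"] - prev["x"]) ** 2 + (row["y"] - prev["y"]) ** 2) ** 0.5
            if dist <= DIST_THRESHOLD:
                matched = incident_ids[j]  # join the earlier alert's incident
                break
        if matched is None:
            matched = next_id              # no nearby earlier alert: new incident
            next_id += 1
        incident_ids.append(matched)
    alerts["incident_id"] = incident_ids
    return alerts

# Discard duplicates for incident counts, or aggregate for rounds per incident:
# deduped = assign_incidents(raw_alerts)
# incident_count = deduped["incident_id"].nunique()
# avg_rounds = deduped.groupby("incident_id")["round_count"].sum().mean()
```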