Center for Problem-Oriented Policing

POP Center Responses Gunshot Detection Page 5


Implementing Acoustic Gunshot Detection Systems

Implementing AGDS seems straightforward, but agencies should explore numerous elements before committing to a system.

Coverage Area

Your agency should determine whether the system is appropriate for its specific circumstances. Though vendors understandably would prefer to maximize the coverage area, doing so is not always in the best interest of the police or the public. Ideally, an agency would begin by analyzing existing gunfire and gun violence data to determine which areas might benefit most. The next step would be to find two equivalent (by rate and trend of gunfire) but geographically separate areas and install AGDS in one but not the other. Doing so would enable comparisons to determine whether the acoustic system is beneficial. Note that these systems do not necessarily reduce violence levels or improve case outcomes, so evaluating their actual benefits should be part of sound implementation practice.
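The matched-area comparison described above amounts to a simple pre/post analysis. The sketch below illustrates the logic with entirely hypothetical incident counts; the area labels, figures, and threshold of "equivalence" are placeholders, not real data or a prescribed method.

```python
# Hypothetical pre/post comparison of gunfire incidents in a matched pair of
# areas: one with AGDS installed ("treatment"), one without ("comparison").
# All figures are illustrative, not real data.

def percent_change(before, after):
    """Percent change in incident counts from the pre- to the post-period."""
    return 100.0 * (after - before) / before

# Annual verified gunfire incidents (hypothetical)
treatment_before, treatment_after = 240, 210    # AGDS area
comparison_before, comparison_after = 235, 228  # matched non-AGDS area

treatment_change = percent_change(treatment_before, treatment_after)
comparison_change = percent_change(comparison_before, comparison_after)

# A naive difference-in-differences estimate: the change in the AGDS area
# over and above the change in the matched comparison area.
net_effect = treatment_change - comparison_change
print(f"Treatment change: {treatment_change:.1f}%")    # -12.5%
print(f"Comparison change: {comparison_change:.1f}%")  # -3.0%
print(f"Net difference: {net_effect:.1f} points")
```

A real evaluation would, of course, need longer time series, tests of statistical significance, and careful matching of the two areas; this only shows the shape of the comparison.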

Costs

Costs vary depending on the specific system, configuration, and options, but the typical cost for a leased system that includes vendor review of gunfire incidents is around $70,000–85,000 per square mile, per year. For this price, an agency will receive access to response applications and maintenance service. Leased systems often have minimum coverage requirements (1–3 square miles). Systems wholly owned by the department can therefore be more cost-efficient if the area to be covered is smaller. Still, the sensors themselves can cost anywhere from a few hundred to tens of thousands of dollars depending on the accuracy and interoperability options. One additional advantage of owned systems is that there is no annual recurring fee. However, maintenance and repairs often require specialized knowledge.

Cost calculations of AGDS should also include personnel costs, although these are harder to calculate. Nonetheless, it is reasonable to expect that uncovering and responding to more gunfire incidents would increase the demand on personnel and vehicles. The demand on forensic ballistic analysis is also likely to increase substantially.
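The lease-versus-ownership tradeoff above can be sketched with back-of-the-envelope arithmetic. The lease rate and minimum-coverage figures below fall within the ranges stated in the text; the owned-system sensor cost, maintenance figure, and service life are purely hypothetical assumptions for illustration.

```python
# Back-of-the-envelope cost comparison between a leased AGDS and an
# agency-owned system. The lease rate and coverage minimum are drawn from
# the ranges in the text; the owned-system figures are hypothetical.

LEASE_RATE_PER_SQ_MI = 75_000  # $/sq mi/year, within the $70k-85k range
LEASE_MIN_SQ_MI = 2            # assumed contractual minimum coverage

def annual_lease_cost(coverage_sq_mi):
    """Leased systems bill at least the contractual minimum coverage."""
    billed = max(coverage_sq_mi, LEASE_MIN_SQ_MI)
    return billed * LEASE_RATE_PER_SQ_MI

def owned_cost_per_year(sensor_cost_total, maintenance_per_year, years):
    """Amortized annual cost of an owned system over its service life."""
    return sensor_cost_total / years + maintenance_per_year

# A small 0.5 sq mi hot spot: the lease still bills the 2 sq mi minimum.
print(annual_lease_cost(0.5))                   # 150000
print(owned_cost_per_year(120_000, 15_000, 5))  # 39000.0
```

Even this crude sketch shows why ownership can pencil out for small coverage areas once a lease's minimum billing kicks in; personnel and forensic workload costs would sit on top of either option.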

Personnel Needs

Installing an acoustic gunshot detection system can double or triple the volume of gunfire calls.72 Your agency will need to plan for having enough officers available to handle not just overall call volumes but also call surges at peak times. Acoustic alerts tend to peak later at night (10PM–2AM) than calls made by residents (see the example in Figure 6). During this peak time, fewer officers are typically available to respond to the alerts, potentially causing delays. Particularly in large agencies with a significant gunfire problem, multiple unique and near-simultaneous AGDS alerts may demand attention from a limited number of officers. Inadequate staffing during distinct peak times may undermine the system’s effectiveness. A recurrent staffing analysis should therefore be conducted to determine whether changes in personnel allocation are needed to accommodate the volume of calls.

Figure 6. Acoustic Alerts in St. Louis by Time of Day: ShotSpotter Data January 2017 through March

Source: St. Louis Metropolitan Police Department.
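The recurrent staffing analysis recommended above can start from something as simple as tabulating alert timestamps by hour of day, as in Figure 6. The timestamps and the staffing threshold in this sketch are hypothetical placeholders.

```python
# Hypothetical tabulation of AGDS alert timestamps by hour of day, the basic
# input to a staffing analysis. Timestamps are illustrative, not real data.
from collections import Counter
from datetime import datetime

alerts = [
    "2024-06-01 23:14", "2024-06-01 23:52", "2024-06-02 00:31",
    "2024-06-02 01:05", "2024-06-02 14:20", "2024-06-02 22:47",
]

hours = [datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for ts in alerts]
by_hour = Counter(hours)

# Flag hours whose alert count exceeds a (hypothetical) staffing threshold.
THRESHOLD = 1
peak_hours = sorted(h for h, n in by_hour.items() if n > THRESHOLD)
print(peak_hours)  # → [23]
```

In practice an analyst would run this over months of alert data and compare the resulting peak hours against shift schedules and unit availability.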

Gunshot detection data analysis is another important staffing consideration,73 and agency analysts should be involved in planning discussions.74 Agencies should also consider how the data generated by AGDS might be used more broadly for identifying and addressing crime hotspots.†

† See Problem-Solving Tools Guide No. 14, Understanding and Responding to Crime and Disorder Hot Spots, for further information.

Interoperability with Other Systems

Some AGDS can automatically send gunfire dispatch notifications to mobile data terminals (MDTs) and officers’ cell phones; others operate by notifying dispatch or a Real-Time Crime Center (RTCC). Paying for a system that can push notifications directly to officers makes sense only if the officers have the capability and willingness to receive these notifications, which may involve discussions with unions if the phones are not department-issued. In addition, if gunshot detections can be streamed to an RTCC, responding officers can draw on additional information from nearby cameras and automated license plate readers (ALPRs) to secure visual evidence.75 Some systems can automatically activate nearby cameras and ALPRs to pan to the gunfire location. In such cases, it is important to review the coverage and interoperability of such technology prior to implementation.

Training Requirements

An increase in gunfire responses likely increases the risk of dangerous interactions between citizens and police, both during emergency responses to the scene and during interactions at it. Officers might need refresher training in how to approach gunfire locations, search for ballistic evidence, and interact with community members.76 Training in how to apply initial trauma care may also be beneficial.‡ Investigators may need training in how to use the data for active investigations, although some vendors provide this training as part of their service. AGDS calls for service are substantially different from most calls for service, even shots-fired calls reported by residents. Officers are asked to respond to active gunfire without an expressed invitation from residents (since most alerts do not have a matching call for service from the public), potentially increasing adversarial encounters. Whereas gunfire calls for service by residents typically guide police to a street address, AGDS notifications often pinpoint gunfire in backyards, alleys, and other locations where responding officers are not necessarily welcome. Also, individuals near the scene of a shooting alert may not be involved in the incident; it is important to remember that AGDS identifies gunfire, not shooters.

‡ One example is “Stop the Bleed” training: https://www.stopthebleed.org/.

Operating Procedure and Policies

The unique aspects of responding to AGDS alerts should be addressed in a dedicated AGDS standard operating procedure (SOP) and other policies. These documents should embody best practices in the field, adapted to your agency’s unique circumstance. At a minimum, an AGDS policy should cover the following:

  • Response to a notification
    • Source of a notification to be used by an officer (e.g., dispatch, MDT, application)
    • Personnel involved (e.g., how many officers respond and the role of a supervisor)
  • Investigative procedures and collection of evidence
  • Follow-up procedures
  • Community interaction.

The Cincinnati Police Department, for example, adopted a comprehensive SOP that requires officers to sign in to the AGDS vendor’s console on their MDT.77 Each alert requires a minimum of two responding officers. Officers are also instructed to respond to the mapped incident location, not the address, thereby avoiding some of the discrepancies of an address-based response. Furthermore, officers are directed to search for evidence of gunfire in a 100-foot radius of the mapped location and attempt to contact residents in the eight nearest homes. Officers are also encouraged to request follow-up investigations if conditions precluded a thorough initial investigation.

Stakeholder Support

Conversations about AGDS among patrol officers, the community, and political leadership must start early. These conversations should include an explanation of how the system works and what steps the police department is taking to mitigate concerns about use of force and equity. Concern about surveillance technology is widespread, but acoustic systems do not continuously record audio. The brief audio snippets of loud events that do get recorded, in combination with the sensor placement above street level, make it unlikely that recordings capture conversations. Similarly, concerns about over-policing and targeting of communities of color may be raised by residents, making it crucial for the department to explain how the implementation and location of the technology are based on data detailing the concentration of gun violence (e.g., homicides and aggravated assaults with firearms). It is also important to remind stakeholders that AGDS is not a complete substitute for citizens notifying police of gunshots, and that successful prosecution of offenders is unlikely to rely on AGDS data alone. Securing buy-in also usually requires transparency in sharing research findings and making data accessible. Community meetings can reach residents and solicit input while explaining what the system does and does not do. Engaging with the media and providing data and visualizations in an open data portal can also enhance transparency once implementation has begun.

Community Considerations

Thus far, limited research has been done on residents’ views on or uses of acoustic technology. A survey conducted in Cincinnati neighborhoods outfitted with AGDS indicates that most respondents believe acoustic technology provides a deterrent and is linked to a lower number of shootings.78 Unfortunately, the survey results were drawn from a non-representative population composed primarily of older White residents. Similarly, community surveys in Wilmington, Delaware, indicate that 85 percent of polled residents reported that AGDS makes them feel safer, and only 5 percent conveyed concerns about privacy invasion.79 A survey of the general U.S. population reports that over 60 percent of Americans support acoustic technology, with only 11 percent indicating some opposition to its use.80

However, in recent years, public and political voices have become more skeptical of police technology in general, with acoustic technology often singled out as an example of poor police practices.81 Although initial concerns of civil rights groups, such as the ACLU, focused on the surveillance capacities of AGDS, attention has recently shifted to the potential of acoustic systems to lead to over-policing of communities of color.82 Such representations, however, might be rooted in a general lack of understanding of how acoustic technology works, as well as concerns over the perceived high number of false-positive alerts.83

Police must be transparent about the technology, including what it does and how data are collected and stored. Sharing data with the public should be part of this transparency. Such discussions must stay within the bounds of vendor user policies, however, and your agency should provide these data limitations to the community. Producing and sharing maps or dashboards and making them available to the public is one way toward the transparency objective. Transparency helps counter unfounded criticism, especially if the limitations of the data and police efforts to measure (and publicize) the efficacy of such systems are explained. As an example of data transparency, Figure 7 provides gunfire data from Minneapolis, Minnesota, displaying AGDS alerts (ShotSpotter) and resident-reported gunfire incidents (shootings and shots fired).

Residents are typically not given much of a voice in the implementation of acoustic technology, or indeed the implementation of surveillance cameras and ALPRs. Most deployments of acoustic technology are based on police analysis of historical gunfire data, since it makes sense to put technology where it will provide the most actionable intelligence. However, AGDS alerts will inevitably bring police into a greater number of high-stress situations, which may rankle some residents. Calls for gunfire are fundamentally different than many other calls patrol officers handle. Officers often must rush at high speed into an active gunfire situation with little time to think or reflect on the scene encountered. No hard data yet address the way that acoustic systems affect officer-involved shootings, and the evidence we have now is largely anecdotal. Community concerns about the police response to AGDS notifications must be addressed through policies, procedures, and training—all of which should be shared with the community.

An unintended consequence of installing AGDS in high-crime areas is that more AGDS will be installed in communities of color, which has the potential to heighten pre-existing perceptions of inequities and may raise questions of fairness, such as whether it is reasonable to subject communities of color to higher levels of surveillance.84 Soliciting community input regarding installation decisions, carefully considered policies, and proper officer training can help address such concerns.

Figure 7. Example of Data Transparency: Gunfire Data from Minneapolis, Minnesota

Source: City of Minneapolis Open Data.

Note: Current data can be retrieved at: https://www.minneapolismn.gov/resident-services/public-safety/police-public-safety/crime-maps-dashboards/shots-fired-map/.
