How is the concept of risky facilities different from hot spots and repeat victimization?

Risky facilities can show up as hot spots on a city's crime map. Indeed, specific hospitals, schools, and train stations are often well-known examples. But simply treating these facilities as hot spots misses an important analytical opportunity: comparing the risky facilities with other, similar facilities. Such a comparison can reveal differences between facilities that account for the differences in risk, thereby providing important pointers to preventive action.

In addition, risky facilities are sometimes treated as examples of repeat victimization. However, this can create confusion when it is not the facilities that are being victimized, but rather the people who use them. Thus, a tavern that repeatedly requests police assistance in dealing with fights is not itself being repeatedly victimized, unless it routinely suffers damage in the course of these fights or its staff members are regularly assaulted. Even those participating in the fights may not be repeat victims, as different patrons might be involved each time. Indeed, no one need be victimized at all, as would be the case if the calls concerned drugs, prostitution, or stolen property sales. Calling the tavern a repeat victim is more than just confusing, however: it can also divert attention from the role that mismanagement or poor design plays in causing the fights. By keeping the concepts of repeat victimization and risky facilities separate, it may be possible to determine whether repeat victimization is the cause of a risky facility and to design responses accordingly.

How can the concept of risky facilities assist problem-oriented policing projects?

The concept of risky facilities can be helpful in two types of policing projects. First, it can be useful in crime prevention projects that focus on a particular class of facilities, such as low-rent apartment complexes or downtown parking lots. In the scanning stage, the objective is to list the facilities involved along with the corresponding number of problem incidents in order to see which facilities experience the most problems and which the fewest. This might immediately suggest some contributing factors. For example, a study of car break-ins and thefts in downtown parking facilities in Charlotte, North Carolina, revealed that the number of offenses in each parking lot was not merely a function of size.14 Rather, it was discovered that some smaller facilities experienced large numbers of thefts because of some fairly obvious security deficiencies. This finding was explored in more depth in the analysis stage by computing theft rates for each facility based on its number of parking spaces. The analysis found that the risk of theft was far greater in surface lots than in parking garages, a fact that had not been known previously. Subsequent analysis compared security features between the multilevel and surface lots, and then within each category, in an effort to determine which aspects of security (e.g., attendants, lighting, security guards) explained the variation. This analysis guided the selection of measures to be introduced at the response stage. Had these been implemented as planned (which was not the case), the assessment stage would have examined not merely whether theft rates declined overall, but whether those at the previously riskiest facilities declined most. Obviously, this type of analysis can be conducted within any group of facilities.
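
As a minimal illustration of this sort of rate calculation (a Python sketch; the facility names and counts below are invented for illustration, not the actual Charlotte data), the following shows why raw incident counts and per-space rates can rank the same facilities very differently:

    # Hypothetical facilities: (name, parking spaces, annual thefts).
    # These numbers are invented; the Charlotte study is only the motivation.
    facilities = [
        ("Garage A", 940, 12),
        ("Garage B", 610, 9),
        ("Surface Lot C", 150, 25),
        ("Surface Lot D", 90, 18),
    ]

    # A rate per 100 spaces makes facilities of different sizes comparable.
    rates = [(name, thefts, 100 * thefts / spaces)
             for name, spaces, thefts in facilities]

    # Ranked by rate, the small surface lots come out far riskier than the
    # large garages, even though the raw counts are of a similar order.
    for name, thefts, rate in sorted(rates, key=lambda row: row[2], reverse=True):
        print(f"{name}: {thefts} thefts, {rate:.1f} per 100 spaces")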

Second, risky facilities analysis can be helpful to crime prevention efforts that focus on a particular troublesome facility. In this sort of analysis, the scanning stage consists of comparing the problems at a particular facility with those at similar nearby facilities. For example, in a project that won the Herman Goldstein Award for Excellence in Problem-Oriented Policing in 2003,15 police in Oakland, California, discovered that a particular motel experienced nearly 10 times as many criminal incidents as any other comparable motel in the area. Although in this case the analysis convinced Oakland police to address the problems at the motel in question, in other cases analysis might reveal that some other facilities have far greater problems than the one that was the initial focus of the project. Comparing the facility being addressed in the project with other group members can also be useful in the analysis, response, and assessment stages described above.

How can risk be measured?

Police reports and calls-for-service data are the most common sources of information about crime and disorder events. However, using these data can lead to errors if care is not taken to check for some of the following potential problems.†

† Many of these data problems are also encountered when studying hot spots and repeat victimization. For further information, see Deborah Weisel (2005), Analyzing Repeat Victimization, Problem-Solving Tools Series No. 4.

  1. Underreporting. Not all incidents of crime and disorder are reported to the police. In fact, reporting practices can vary considerably from facility to facility, which can seriously distort estimates of risk concentration. For example, a facility that always reports crimes to the police will appear to suffer more incidents of victimization than will a similar facility that experiences the same number of incidents but reports fewer to police. Such distortions can be difficult to discover, which is why it can be important to ask facility managers about their reporting policies or to ask beat officers whether the recorded crime rates match their own perceptions of the crime problems at the facilities in question. In some cases, administrative records kept by a regulatory agency or the facilities themselves might be more accurate. For example, records of vandalism repairs kept by schools or other public facilities might be more accurate than police records of vandalism. However, these administrative records can be difficult to compare among facilities. Sometimes, it might be feasible to survey facility managers to obtain estimates of the number of incidents and at the same time to gather information about management practices (see below). However, surveys can be expensive and difficult to conduct if they are to provide reliable information.
  2. Incomplete address matching. When using police records, it can sometimes be difficult to determine whether two different events occurred at the same facility. There are several reasons for this.
    • Precise address information is sometimes unavailable for large facilities, such as parks, parking lots, or sports venues.
    • Some facilities have multiple addresses, including different street addresses.
    • Police sometimes record offense locations as intersections or hundred-block addresses, which can make it difficult to determine whether an event occurred at a particular facility.
    • Police data sometimes fail to distinguish between residential and commercial addresses or fail to make important distinctions between types of residential properties, such as apartment blocks or single-family dwellings.

    Incident reporting forms and police records can be revised to improve geographical information gathering; moreover, the increased use of geocoding for crime reports will gradually help resolve some of these difficulties.

  3. Mixed-use locales. Sometimes, multiple facilities are situated at the same location. For example, some buildings with ground-floor retail establishments have apartments on the floors above; hotels contain not only guest rooms, but also bars and restaurants. In addition, use of the same place may vary by time of day or day of week. For example, a building that functions as a church on Sundays might house a daycare center or soup kitchen during the week. Although it can be difficult to determine which facility is responsible for which crime, such distinctions are crucial to determining which type of response to apply.
  4. Infrequent events. Where specific crime or disorder events are common, it is relatively easy to describe the distribution of crimes per facility. However, this can be more difficult for rarer events, such as homicide or rape, because short-period estimates are unlikely to show a crime distribution that is distinguishable from random variation. As a consequence, it may be necessary to analyze many years' worth of data before any meaningful patterns become apparent.
  5. Long time periods. Studying facilities over long time periods can produce results that are confounded by changes in the facilities themselves; for example, some may go out of business, others may come into being, and yet others may be altered, both physically and managerially.
  6. Facilities with no events. Facilities that experience none of the events in question may be invisible if police data are the sole source of information, because police data show only locations with one or more events. Excluding such facilities can distort the assessment of the 80/20 rule. If a regulatory authority licenses the facilities under study (for example, locations that serve alcohol), then data from the regulatory agency can be compared to police data to estimate the number of facilities that experience no events (a minimal sketch of this comparison follows this list). Remember, however, that it can be difficult to get accurate counts of facilities that are not required to register with some authority.
  7. Small numbers of facilities. Some facilities are more common than others. In a moderate-sized city, for example, there will be few hospitals. Given at least two facilities, one is likely to have more crime than the other. Although this can have some very practical consequences, the population may be too small to make any meaningful comparison. In such cases, analyzing data from a larger region may be more productive.
  8. Random variation. It is possible to find random concentrations of crime, although this is more likely to occur when only a few facilities with only a few incidents are being examined. In such cases, try checking the same facilities for a different time period. If the rank order of incidents is roughly the same in both periods, then it is probable that the variation is not random. Box 1 provides an example.
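
The comparison suggested in point 6 can be sketched in a few lines of Python (the premises names and event records below are invented for illustration; any licensing database and police event file would do):

    from collections import Counter

    # Hypothetical licensed premises from a regulatory agency's list.
    licensed = {"Red Lion", "Blue Boar", "Station Tap", "Corner Bar", "Dockside"}

    # Hypothetical police data: one entry per recorded event at a premises.
    events = ["Red Lion", "Red Lion", "Dockside", "Red Lion", "Dockside"]

    counts = Counter(events)
    for name in sorted(licensed):
        print(f"{name}: {counts.get(name, 0)} events")

    # Premises on the license list but absent from police data are the
    # "invisible" zero-event facilities that police records alone would miss.
    zero_event = licensed - set(counts)
    print(f"{len(zero_event)} of {len(licensed)} licensed premises had no recorded events")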

Box 1: Testing for random variation in risk

A study in England in 1964 found that absconding rates for residents of 17 training schools for delinquent boys ranged from 10 percent to 75 percent. To determine whether this variation was random, researchers reexamined the absconding rates two years later (1966) to see if the variation was much the same. They found that, by and large, the variation was consistent between the two years. For example, School 1 had the lowest absconding rate and School 17 the highest in both years (see the table below). In fact, the correlation† between the two years was 0.65. Because the variation was relatively stable, and because very few boys would have been residents in both years, researchers determined that the variation was probably due to differences in management practices rather than to differences in the student populations.

† Correlation coefficients can be calculated quite simply in an Excel spreadsheet.

Training School   Absconding Rate
                  1964     1966
 1                10%      10%
 2                13%      38%
 3                14%      14%
 4                21%      18%
 5                21%      23%
 6                22%      14%
 7                22%      21%
 8                24%      29%
 9                25%      33%
10                26%      37%
11                27%      25%
12                28%      47%
13                29%      45%
14                32%      43%
15                34%      26%
16                46%      27%
17                75%      50%

Adapted from: Clarke and Martin (1975).
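
The check described in Box 1 can be reproduced in a few lines of Python (a sketch written for this guide; the two rate lists are transcribed from the table above). Because the argument in the box turns on rank order, a rank (Spearman-style) correlation is computed alongside the ordinary Pearson coefficient:

    def pearson(x, y):
        """Ordinary (Pearson) correlation coefficient."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / (vx * vy) ** 0.5

    def ranks(values):
        """Ranks (1 = lowest), with tied values given their average rank."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        result = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            for k in range(i, j + 1):
                result[order[k]] = (i + j) / 2 + 1
            i = j + 1
        return result

    # Absconding rates (%) for the 17 schools, from the table above.
    rate_1964 = [10, 13, 14, 21, 21, 22, 22, 24, 25, 26, 27, 28, 29, 32, 34, 46, 75]
    rate_1966 = [10, 38, 14, 18, 23, 14, 21, 29, 33, 37, 25, 47, 45, 43, 26, 27, 50]

    print(f"Pearson: {pearson(rate_1964, rate_1966):.2f}")
    # A Spearman correlation is the Pearson correlation of the ranks; this
    # rank-order version comes out near the 0.65 reported in the box.
    print(f"Rank (Spearman): {pearson(ranks(rate_1964), ranks(rate_1966)):.2f}")

Stable ranks across the two periods, rather than any particular coefficient value, are what suggest the variation is not random.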