


How Can Risk Be Measured?

Police reports and calls for service data are the most common sources of information about crime and disorder events. However, using these data can lead to errors if care is not taken to check for some of the following potential problems.§

§ Many of these data problems are also encountered when studying hot spots and repeat victimization. For further information see Deborah Weisel (2005), Analyzing Repeat Victimization, Problem Solving Tools Series No. 4.

1. Underreporting. Not all incidents of crime and disorder are reported to the police. In fact, reporting practices can vary considerably from facility to facility, which can seriously distort estimates of risk concentration. For example, a facility that always reports crimes to the police will appear to suffer more incidents of victimization than will a similar facility that experiences the same number of incidents but reports fewer to police. Such distortions can be difficult to discover, which is why it can be important to ask facility managers about their reporting policies or to ask beat officers whether the recorded crime rates match their own perceptions of the crime problems at the facilities in question. In some cases, administrative records kept by a regulatory agency or the facilities themselves might be more accurate. For example, records of vandalism repairs kept by schools or other public facilities might be more accurate than police records of vandalism. However, these administrative records can be difficult to compare among facilities. Sometimes, it might be feasible to survey facility managers to obtain estimates of the number of incidents and at the same time to gather information about management practices (see below). However, surveys can be expensive and difficult to conduct if they are to provide reliable information.

2. Incomplete address matching. When using police records, it can sometimes be difficult to determine whether two different events occurred at the same facility. There are several reasons for this.

a) Precise address information is sometimes unavailable for large facilities, such as parks, parking lots, or sports venues.

b) Some facilities have multiple addresses, including different street addresses.

c) Police sometimes record offense locations as intersections or hundred-block addresses, which can make it difficult to determine whether an event occurred at a particular facility.

d) Police data sometimes fail to distinguish between residential and commercial addresses or fail to make important distinctions between types of residential properties, such as apartment blocks or single-family dwellings.

Incident reporting forms and police records can be revised to improve geographical information gathering; moreover, the increased use of geocoding for crime reports will gradually help resolve some of these difficulties.

3. Mixed-use locales. Sometimes, multiple facilities are situated at the same location. For example, some buildings with ground-floor retail establishments have apartments on the floors above, and hotels contain not only guest rooms but also bars and restaurants. In addition, the use of a single place may vary by time of day or day of week. For example, a building that functions as a church on Sundays might house a daycare center or soup kitchen during the week. Although it can be difficult to determine which facility is responsible for which crime, such distinctions are crucial to determining which type of response to apply.

4. Infrequent events. Where specific crime or disorder events are common, it is relatively easy to describe the distribution of crimes per facility. This is more difficult for rarer events, such as homicide or rape, because estimates based on short periods are unlikely to show a crime distribution that is distinguishable from random variation. As a consequence, it may be necessary to analyze many years' worth of data before any meaningful patterns become apparent.

5. Long time periods. Studying facilities over long time periods can produce results that are confounded by changes in the facilities themselves; for example, some may go out of business, others may open, and still others may change physically or managerially.

6. Facilities with no events. Facilities that experience none of the events in question may be invisible if police data are the sole source of information, because police data show only locations with one or more events. Excluding such facilities can distort the assessment of the 80/20 rule. If a regulatory authority licenses the facilities under study (for example, locations that serve alcohol), then data from the regulatory agency can be compared with police data to estimate the number of facilities that experience no events (a short sketch of this comparison follows this list). Remember, however, that it can be difficult to get accurate counts of facilities that are not required to register with some authority.

7. Small numbers of facilities. Some facilities are more common than others; in a moderate-sized city, for example, there will be few hospitals. With only a handful of facilities, one is bound to experience more crime than the others, and although this can have very practical consequences, the population may be too small for any meaningful comparison. In such cases, analyzing data from a larger region may be more productive.

8. Random variation. Apparent concentrations of crime can arise purely by chance, and this is more likely when only a few facilities with only a few incidents are being examined. In such cases, try checking the same facilities for a different time period. If the rank order of incidents is roughly the same in both periods, then the variation is probably not random. Box 1 provides an example, and a short calculation sketch follows it.
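
Before turning to Box 1, here is a rough illustration of the comparison mentioned in point 6; it is not taken from the original guide. The few lines of Python below merge a hypothetical licensing-agency list of facilities with a hypothetical police incident extract, so that facilities with no recorded events are included, and then check what share of all incidents the worst 20 percent of facilities account for. The file names and the facility_id column are placeholder assumptions; a spreadsheet would serve equally well.

    # Sketch only: combine a regulator's full facility list with police
    # incident counts so that zero-event facilities are not overlooked.
    # File names and column names below are hypothetical placeholders.
    import pandas as pd

    licensed = pd.read_csv("licensed_facilities.csv")    # one row per licensed facility
    incidents = pd.read_csv("police_incidents.csv")      # one row per recorded incident

    # Count incidents per facility, then attach the count to every licensed
    # facility; facilities absent from the police data get a count of zero.
    counts = incidents.groupby("facility_id").size().rename("incidents").reset_index()
    merged = licensed.merge(counts, on="facility_id", how="left")
    merged["incidents"] = merged["incidents"].fillna(0)

    # 80/20 check: what share of all incidents do the worst 20 percent of
    # facilities account for once zero-event facilities are included?
    merged = merged.sort_values("incidents", ascending=False)
    worst_20_percent = merged.head(max(1, int(len(merged) * 0.2)))
    share = worst_20_percent["incidents"].sum() / merged["incidents"].sum()
    print(f"The worst 20% of facilities account for {share:.0%} of all incidents")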

Box 1: Testing for random variation in risk

A study in England in 1964 found that absconding rates for residents in 17 training schools for delinquent boys ranged from 10 percent to 75 percent. To determine whether this variation was random, researchers reexamined the absconding rates two years later (1966) to see if the variation was much the same. They found that by and large the variation was consistent between the two years.§ For example, School 1 had the lowest absconding rate and School 17 the highest rate in both years (see the table below). In fact, the correlation was 0.65 between the two years. Because the variation was relatively stable and because very few boys would have been residents in both years, researchers determined that the variation was probably due to differences in management practices rather than to differences in the student populations.

§ Correlation coefficients can be calculated quite simply in an Excel spreadsheet (for example, with the CORREL function).

Training School    Absconding Rate
                   1964     1966
1                  10%      10%
2                  13%      38%
3                  14%      14%
4                  21%      18%
5                  21%      23%
6                  22%      14%
7                  22%      21%
8                  24%      29%
9                  25%      33%
10                 26%      37%
11                 27%      25%
12                 28%      47%
13                 29%      45%
14                 32%      43%
15                 34%      26%
16                 46%      27%
17                 75%      50%

Source: Clarke and Martin (1975).
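
As a rough sketch of the check described in point 8 and the footnote above (and not part of the original study), the correlation can also be computed from the Box 1 figures in a few lines of Python; the spreadsheet approach works just as well. Both the ordinary (Pearson) correlation of the rates and the rank-order (Spearman) correlation are shown, since the text emphasizes rank order; both should come out clearly positive, consistent with the stable ranking described above.

    # A minimal sketch (not from the original study): is the school-to-school
    # variation in Box 1 stable across the two years?
    from scipy.stats import pearsonr, spearmanr

    # Absconding rates (%) for the 17 training schools, from the table above.
    rates_1964 = [10, 13, 14, 21, 21, 22, 22, 24, 25, 26, 27, 28, 29, 32, 34, 46, 75]
    rates_1966 = [10, 38, 14, 18, 23, 14, 21, 29, 33, 37, 25, 47, 45, 43, 26, 27, 50]

    # Pearson compares the rates themselves; Spearman compares only the rank
    # order of the schools, which is what the text emphasizes.
    r_pearson, _ = pearsonr(rates_1964, rates_1966)
    r_spearman, _ = spearmanr(rates_1964, rates_1966)
    print(f"Pearson correlation:       {r_pearson:.2f}")
    print(f"Spearman rank correlation: {r_spearman:.2f}")

    # A clearly positive correlation in the second period suggests that the
    # variation between facilities is not just random fluctuation.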
