Data Driven Policing: Highlighting Some Risks Associated with Predicting Crime

The Human Rights, Big Data and Technology Project (the Project) recently made a submission to the UK Home Affairs Committee inquiry into ‘Policing for the future: changing demands and new challenges’. The submission summarised the opportunities and risks associated with the police’s adoption of ‘data driven technologies’, which aid the bulk collection, storage, and analysis of data about persons, places, and practices. Data driven technologies can improve the efficiency of police work by offering law enforcement agencies methods of analysing crime data and generating predictions about crime that has yet to take place. For example, police can use these predictions to decide how best to allocate scarce police resources. However, existing research identifies a series of risks that must be considered when adopting data driven technologies. This blog recaps the conclusions of our submission by highlighting three such risks: the quality of the input data, the potential for discriminatory decision-making, and privacy harms.

Data Driven Technologies and Predicting Crime

Data driven technologies are central to daily interactions with legal, corporate, and social institutions. For example, online retailers rely on data driven technologies to collect data on consumer habits, analyse this data to discover consumer trends, and use knowledge of these trends to make predictions about future consumer behaviour. Retailers capitalise on these predictions in the form of purchase recommendations and personalised ads.

When adopted by police, data driven technologies serve a similar function by collecting crime and other data, analysing this data to determine crime trends, and using knowledge of these trends to make predictions about future crimes, hence the term “predictive policing.” These predictions may be used to inform the allocation of police resources.

There are many types of predictive policing. For example, some predictive technologies attempt to predict victims (those most likely to be victims of crime), others predict offenders (those most likely to commit crime in the future), and others predict the locations (“hot spots”) where crime is likely to occur in the future. A contemporary example of the latter is the Los Angeles Police Department’s use of PredPol software, which uses three data points (time, place, and type of recent crimes) to identify hot spots.
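The general idea behind hot-spot identification from those three data points can be sketched in a few lines. This is a hypothetical illustration only, not PredPol’s proprietary algorithm: it simply counts recent recorded crimes per map grid cell and flags cells whose count meets a threshold. The grid cells, records, and threshold below are invented for illustration.

```python
from collections import Counter
from datetime import datetime

# Illustrative records only: (timestamp, grid cell, crime type).
recent_crimes = [
    (datetime(2018, 3, 1, 22), "cell_12", "burglary"),
    (datetime(2018, 3, 2, 23), "cell_12", "burglary"),
    (datetime(2018, 3, 3, 21), "cell_12", "vehicle theft"),
    (datetime(2018, 3, 3, 14), "cell_07", "assault"),
]

def hot_spots(records, since, threshold=2):
    """Flag grid cells whose recent recorded-crime count meets a threshold."""
    counts = Counter(cell for when, cell, _ in records if when >= since)
    return [cell for cell, n in counts.items() if n >= threshold]

print(hot_spots(recent_crimes, since=datetime(2018, 2, 25)))
```

Note that the sketch only sees *recorded* crime: any bias in what gets recorded flows directly into which cells are flagged, which is precisely the concern discussed below.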

PredPol is now used by several police forces in the UK, including Kent Police. Early studies suggest PredPol may have contributed to a 6% reduction in crime rates in North Kent. However, research also identifies significant risks that must be addressed when considering the adoption of predictive software.

Three Risks of Data Driven Policing

  • Data Quality: The effectiveness of predictive software depends on the quality of its input data. If input data is inaccurate, incomplete, or skewed, the quality of the predictive outputs will suffer. For example, deficient input data may produce false positives (no crime occurs in alleged “hot spots”) or false negatives (crimes occur in areas identified as low-risk). Data may be deficient for many reasons. Some crimes are consistently under-reported, meaning crime data is incomplete and cannot give algorithms an accurate representation of crime rates. Furthermore, police discretion introduces subjectivity into the production of crime data, which may affect how faithfully that data depicts crime rates. As a result, crime data may present a skewed portrayal of the prevalence of crime, reducing the accuracy with which hot spots can be predicted.
  • Discriminatory Capacities: The use of predictive software can result in discriminatory outcomes. Evidence suggests that some police activity disproportionately targets members of marginalised groups and impoverished neighbourhoods. As a result, crime data falsely suggests that crime rates are particularly high in impoverished neighbourhoods: not because crime is most common there, but because that is where police focus their patrols, while similar crimes in areas with limited police presence go unrecorded. Once this skewed data is fed into predictive software, the software will learn that most recorded crime takes place in impoverished neighbourhoods and predict that future crime will follow the same pattern. Such predictions funnel more police into already over-policed spaces, creating a self-fulfilling cycle: the criminality of impoverished spaces is given priority by police, and crime data continues to suggest a correlation between impoverished spaces and crime. Evidence of these issues can be found in the Human Rights Data Analysis Group’s study of algorithms trained on biased data, which found that such algorithms tend to make predictions directing police attention toward impoverished spaces.
  • Privacy Harms: The use of data driven technologies requires the collection of large quantities of data, raising questions about the police’s contribution to mass surveillance. Such surveillance poses significant risks, including violations of privacy rights. Without privacy, citizens are left without a retreat in which to communicate and express themselves unmonitored. This may produce a “chilling effect”, based on the premise that monitored citizens inhibit behaviour (related to sexuality, for example) and conceal information (related to one’s health, for example) that, if publicised, could result in social exclusion and the denial of opportunities.
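The self-fulfilling cycle described under “Discriminatory Capacities” can be illustrated with a toy simulation. This is a hypothetical sketch, not the Human Rights Data Analysis Group’s model: two areas have identical true crime rates, but crimes are only recorded where patrols are present, and patrols are reallocated in proportion to recorded crime, mimicking a predictive system trained on its own outputs.

```python
import random

random.seed(0)

TRUE_RATE = 0.3                   # same true crime probability in both areas
RESIDENTS = 1000                  # residents per area
patrols = {"A": 0.6, "B": 0.4}    # area A starts with more patrols
recorded = {"A": 0, "B": 0}

for step in range(20):
    for area in ("A", "B"):
        # Crimes occur at the same true rate in both areas...
        crimes = sum(random.random() < TRUE_RATE for _ in range(RESIDENTS))
        # ...but are only *recorded* in proportion to patrol presence.
        recorded[area] += int(crimes * patrols[area])
    # Patrols are then reallocated in proportion to recorded crime.
    total = recorded["A"] + recorded["B"]
    patrols = {a: recorded[a] / total for a in ("A", "B")}

share_a = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"Share of recorded crime attributed to area A: {share_a:.0%}")
```

Even though both areas have identical underlying crime rates, area A ends up with the majority of recorded crime, and the data appears to confirm the initial allocation of patrols.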

Regulation and Oversight

Addressing these socio-legal concerns will require the adoption of regulations facilitating independent assessment of crime data. It will also require police to create oversight mechanisms ensuring that the use of data driven technologies complies with human rights law and addresses concerns about the intrusive nature of mass surveillance. Accordingly, the Project recommends the creation of such regulations, with particular attention paid to whether and how police address the limitations of crime data, and the discriminatory capacities and privacy harms associated with the use of data driven technologies.

Disclaimer: The views expressed herein are those of the author(s) alone.