Analysing the implications of police data and technology on human rights
Predictive analytics and other data-driven policing innovations offer immense opportunities for violence prevention, crime reduction, and efficiencies. But basic human rights are not at the forefront of software and tech design.
Big data and law enforcement
Big data is transforming law enforcement. Predictive technologies can forecast crimes before they are committed. Crime-location data shapes where police patrol, and intelligence systems can highlight offenders responsible for a disproportionate number of offences.
In terms of violence prevention, the benefits are immense. For example, Colombia has seen an 82% decrease in murder rates since police started using big data: mapping crimes, compiling more reliable homicide figures, and gathering information on precisely how murders were committed. This large-scale knowledge allows for arguably well-informed predictions of future criminal acts, supporting the view that many can be stopped before they are carried out.
However, communities often have little or no idea of why and how digital and technological products are used to police and protect them. This raises concerns for the democratic oversight and authorisation of these applications.
The threat of human bias and discrimination
Some of the factors used in predictive policing may include prior convictions, income level, employment status, family background, neighbourhood, education level, and the behaviour of family and friends. Some of these elements, particularly family background and neighbourhood, act as proxies for race, so the data is racially encoded. This leaves it open to racial bias or discrimination, depending on how it is processed and by whom.
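To see how such proxy factors can encode bias, consider a minimal, purely hypothetical sketch of a risk score. The feature names and weights below are illustrative assumptions, not taken from any real policing system; the point is that a weighted sum over features like neighbourhood can penalise people for where they live rather than anything they have done.

```python
# Hypothetical sketch: how a predictive-policing risk score can encode bias.
# Feature names and weights are illustrative only, not from any real system.

FEATURE_WEIGHTS = {
    "prior_convictions": 3.0,
    "unemployed": 1.5,
    "neighbourhood_crime_rate": 2.0,   # proxy: correlates with demographics
    "family_member_convicted": 1.0,    # proxy: penalises family background
}

def risk_score(person: dict) -> float:
    """Weighted sum of a person's features (higher = treated as 'riskier')."""
    return sum(w * person.get(f, 0) for f, w in FEATURE_WEIGHTS.items())

# Two people with identical personal records but different neighbourhoods
# receive different scores purely because of where they live.
a = {"prior_convictions": 0, "unemployed": 0, "neighbourhood_crime_rate": 0.9}
b = {"prior_convictions": 0, "unemployed": 0, "neighbourhood_crime_rate": 0.1}
print(risk_score(a), risk_score(b))  # 1.8 0.2
```

Neither person has a conviction, yet the first is scored nine times "riskier" than the second, which is exactly the kind of encoded discrimination the text describes.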
The threat to privacy
Often communities undergoing big data ‘tests’ are unaware that they’re being surveilled. In one of the biggest, in New Orleans, police used sophisticated data-mining tools from the Silicon Valley firm Palantir to uncover ties to gang members, outline extensive criminal histories and even predict future gang membership. The city council was not told, and no consent was given, as the test was masked as a ‘philanthropic venture’. This violated the privacy of every citizen whose data was processed. The factors mentioned previously also contribute to potential privacy violations: for example, an innocent citizen may be called in for questioning because a cousin is a known drug dealer – information the police would have gained from big data – even though they have no criminal background or other reason to be a suspect.
Judges using big data for sentencing decisions
The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a form of artificial intelligence that purports to predict a defendant’s risk of committing another crime. COMPAS works through a proprietary algorithm – the same type of closed algorithm that search engines use to rank the most relevant results for a query. The system was used in the US in 2013 when a man, Eric Loomis, was found driving a car that had been used in a shooting. COMPAS identified him as at high risk of re-offending, and the judge, citing the score, sentenced him to six years. Loomis appealed on the grounds that the judge violated due process by considering the outcome of an algorithm whose inner workings were secret and could not be examined. The appeal went up to the Wisconsin Supreme Court, which ruled against Loomis, concluding that the sentence would have been the same without consulting COMPAS. However, the court urged caution and scepticism in the algorithm’s use.
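Because COMPAS is proprietary, its actual workings cannot be shown. But a generic recidivism risk tool of the kind it represents might look like the following sketch: a logistic model turns a handful of features into a probability, and fixed thresholds convert that probability into a risk band. Every weight and threshold here is a hypothetical assumption for illustration, which is precisely the due-process concern – the defendant cannot inspect or contest any of them.

```python
# Illustrative sketch of a generic recidivism risk tool. COMPAS's actual
# algorithm is proprietary; all weights and thresholds here are assumptions.
import math

def logistic(x: float) -> float:
    """Map a raw score to a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical feature weights, for illustration only.
WEIGHTS = {"prior_arrests": 0.4, "age_under_25": 0.8, "failed_appearances": 0.5}
BIAS = -2.0

def reoffending_probability(defendant: dict) -> float:
    z = BIAS + sum(WEIGHTS[k] * defendant.get(k, 0) for k in WEIGHTS)
    return logistic(z)

def risk_band(p: float) -> str:
    # Threshold choices are policy decisions, not statistical facts.
    if p >= 0.7:
        return "high"
    if p >= 0.4:
        return "medium"
    return "low"

d = {"prior_arrests": 6, "age_under_25": 1, "failed_appearances": 2}
print(risk_band(reoffending_probability(d)))  # high
```

Note that shifting a single threshold from 0.7 to 0.8, or changing one weight, could move a defendant from "high" to "medium" risk – a choice invisible to the sentencing court when the algorithm is closed.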
The best intelligence is human
It’s widely accepted that intelligence work is the most effective form of counter-terrorism, and that the best intelligence comes from community engagement, not coercion.
For example, the arrest in 2008 of Andrew Ibrahim for intent to commit terrorism followed tip-offs from Bristol’s Muslim community. Detective work plays the key role in identifying terrorists after attacks: despite the repeatedly broadcast surveillance footage of the 7/7 bombers at Luton station, it was forensic examination of the bodies and intelligence from the missing-persons helpline that identified them.
What public evidence there is on anti-terrorist investigations demonstrates the overwhelming importance of community tip-offs and informants. One of the most robust studies concluded that information from these sources initiates 76% of anti-terrorist investigations. This analysis of 225 individuals recruited or inspired by al-Qaeda found that “the contribution of NSA’s bulk surveillance programmes to these cases was minimal”, playing an identifiable role – on the most generous interpretation of the results – in just 1.8% of cases. The vital importance of traditional investigative and intelligence methods is undeniable.
Accountability and appropriate remedies
Andrew Guthrie Ferguson, the author of The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, believes that the way to maximise the benefits of the data revolution and minimise its costs is through a constant process of public accountability. He adds: “Big data technologies will improve the risk-identification capacities of police but will not offer clarity about appropriate remedies.” We want clarity around risk identification and potential human rights abuses so that appropriate remedies can be found and the software reformed to be used safely, without putting anyone’s human rights at risk. We also want accountability pushed onto those designing and using the software, and due diligence exercised at every stage of the algorithm creation process.
Through qualitative empirical research with police forces in the UK and USA, collaborative workshops, discussions and public engagement, we analyse the actual implications of police data and technology on human rights, and push for more human rights-compliant policing.