Investigating how facial recognition technology is used and its human rights implications
Facial recognition technology is typically deployed to secure rights, in particular by protecting against threats to public order and the rule of law. However, it may also threaten rights such as the rights to privacy, expression, association and assembly.
What is facial recognition technology?
Facial recognition technology is an artificial intelligence tool used to identify individuals within digital images or video footage, often in real time. It has been used by police to identify possible suspects or to catch jaywalkers. It is also used to provide travel advice in airports, and to unlock smartphones or computers.
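At a technical level, most such systems work in two steps: a detected face is converted into a numeric "embedding" vector, and that vector is compared against a database of known faces, with the closest match within a distance threshold reported as the identity. The sketch below illustrates only the matching step, using made-up toy data and a hypothetical `match_face` function; it is a simplification, not any vendor's actual pipeline.

```python
# Simplified illustration of watchlist matching (hypothetical data).
# Real systems derive embeddings from a neural network; here we just
# compare small hand-written vectors by Euclidean distance.
import math

def match_face(probe, watchlist, threshold=0.6):
    """Return the watchlist identity whose embedding is nearest to the
    probe, or None if no candidate falls within the distance threshold."""
    best_name, best_dist = None, threshold
    for name, ref in watchlist.items():
        dist = math.dist(probe, ref)  # Euclidean distance between vectors
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Toy 3-dimensional embeddings; real systems use hundreds of dimensions.
watchlist = {
    "person_a": [0.10, 0.90, 0.30],
    "person_b": [0.80, 0.20, 0.50],
}
print(match_face([0.12, 0.88, 0.31], watchlist))  # "person_a" (near match)
print(match_face([5.00, 5.00, 5.00], watchlist))  # None (no one close)
```

The choice of threshold matters: set it too loosely and the system produces false matches, too strictly and it misses genuine ones — a trade-off that underlies many of the accuracy and bias concerns discussed below.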
However, its use is becoming increasingly pervasive. Just think about how your smartphone identifies your friends in your photo library.
Facial recognition and human rights
There are significant human rights concerns associated with the technology, and this is increasingly being recognized by the public. Indeed, in 2018 Microsoft’s chief legal officer called for regulation in a blog post saying:
“Imagine a government tracking everywhere you walked over the past month without your permission or knowledge. Imagine a database of everyone who attended a political rally that constitutes the very essence of free speech.” – Facial recognition technology: The need for public regulation and corporate responsibility
Because facial recognition technology is so new, and because it can be deployed by so many actors across so many different contexts, urgent research into its human rights implications is required. A key danger is that facial recognition technology can be used to create a record of our movements, and this record can then be analysed to build a detailed profile of our lives. It could reveal who we meet with, and how often; where we work; where we socialise; whether we are members of any groups or organisations; or whether we go to protests. Such a profile can expose not only sensitive personal information (such as sexual orientation or health status) but also our political opinions. The potential impact on the balance of power between the State and its citizens is significant.
Right to a private life
Article 17 of the International Covenant on Civil and Political Rights codifies the right to privacy. Live facial recognition technology interferes with this right, as it involves the processing of images for the purpose of determining an individual’s identity. The use of live facial recognition technology by public authorities will typically give rise to two separate interferences with the right to privacy: the first relating to the initial processing of the image (i.e. image analysis using facial recognition technology), and the second to any subsequent retention or analysis of the data.
Prohibition of discrimination
The prohibition of discrimination is codified in Article 2 of the International Covenant on Civil and Political Rights, and Article 2 of the International Covenant on Economic, Social and Cultural Rights. This protects individuals and their rights from discrimination on any ground, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status.
A number of concerns exist regarding bias associated with facial recognition technology, whether arising from difficulties in conducting biometric matching or from inadequate training data. The risk is that this bias may translate into prohibited forms of discrimination, depending on how facial recognition technology is used.
The use of facial recognition technology may also bring into play other rights, such as the rights to freedom of expression, association, and assembly, or the right to liberty and security. Indeed, the potential uses of facial recognition technology are so broad that a number of rights may be affected. This is why research is necessary.
Members of the HRBDT project are currently engaged in extensive research into how facial recognition technology is used, and its human rights implications. For instance, Prof. Pete Fussey and Dr. Daragh Murray recently conducted an independent review of the London Metropolitan Police’s trial of facial recognition technology.