
FACIAL RECOGNITION TECHNOLOGIES IN THE UK: ONE STEP FORWARD TO PREDICTIVE POLICING

Updated: 25 Feb 2022

‘Predictive Policing’ might sound like movie fantasy at first sight; however, it should be considered one of the latest trends within the context of the so-called ‘Surveillance Society’.


With a focus on facial recognition technologies: if you believe they are merely used in commercial applications, such as unlocking your electronic device or passing automated passport checks when travelling abroad, you are probably underestimating the issue.


What is the state of the art in the United Kingdom?


First of all, let’s define the technologies involved.


Facial Recognition Technology (FRT): the process by which an individual can be identified or recognised from a digital facial image. Cameras capture these images and FRT software produces a biometric template; the individual is typically aware that the process is taking place (e.g. passport control).


Live Facial Recognition (LFR) or Live Automated Facial Recognition (LAFR): a different application, deployed in a similar way to traditional CCTV. It is directed at everyone in a particular area rather than at specific individuals. Data is collected in real time and potentially on a mass scale, and the individual usually has no awareness of, choice over, or control of the process (e.g. public surveillance for security purposes).
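
To make the matching step behind LFR more concrete, here is a minimal sketch of how a captured face template might be compared against a police watchlist. This is a hedged illustration only: the embedding representation, the cosine-similarity measure, and the threshold value are my assumptions for the example, not a description of any system actually deployed in the UK.

import numpy as np

MATCH_THRESHOLD = 0.85  # hypothetical similarity cut-off

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two biometric templates, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def scan_frame(captured_templates, watchlist):
    # captured_templates: one embedding vector per face detected in a frame.
    # watchlist: dict mapping a person ID to their stored template.
    # Returns (face_index, person_id, score) for every candidate match.
    alerts = []
    for i, face in enumerate(captured_templates):
        for person_id, stored in watchlist.items():
            score = cosine_similarity(face, stored)
            if score >= MATCH_THRESHOLD:
                alerts.append((i, person_id, score))
    return alerts

The key point for the legal analysis that follows: every passer-by is templated and compared, whether or not they appear on any list.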


The cultural and political implications of these technologies are beyond the scope of this article; nevertheless, I would like to raise awareness of the fundamental rights and civil liberties currently threatened by data-driven law enforcement practices.


Comparing jurisdictions on law enforcement and security policies is far from easy, since cultural, historical, and social factors are unique variables; I will therefore provide a snapshot of the current state of the art in the United Kingdom. Specifically, the abuse of live facial recognition for public security and surveillance purposes gives rise to serious concerns about the protection of fundamental rights, while progressively moving law enforcement towards a ‘Predictive Policing’ model.


Case study 1: A 14-year-old black schoolchild, victim of live facial recognition misidentification


In May 2019, a 14-year-old black schoolchild in his school uniform was walking down a London street on his way to class. He was wrongly identified by a facial recognition system installed for public safety and was immediately surrounded by four undercover police officers. He was questioned, his arms were held, he was asked for his phone, and he was even fingerprinted. After just ten minutes, the police had to release him, the report having matched a different person. The child was clearly shaken by the experience; nevertheless, the Metropolitan Police – even while admitting that FRT and LAFR raise significant issues of gender and racial bias – have continued to use them in daily operations.


Case study 2: Innocent people with mental health problems


This case again takes place in London, on Remembrance Sunday in November 2017. The Metropolitan Police took advantage of a live facial recognition project to match dataset records of ‘fixated individuals’ – in other words, people who frequently contact public figures and are highly likely to suffer from mental illness – despite their not being suspected of or wanted for any criminal activity. As a result, these individuals were ejected from the ceremony without reasonable suspicion of any unlawful behaviour, merely on the basis of data-driven discrimination.


Full insights into both case studies are available in Big Brother Watch’s report.


What is the current legal framework offered by the English Legal System?


If you try to find a clear legal basis that could justify these practices under the rule of law, well, you will be quite disappointed.


No doubt this is a complex scenario. A series of issues arise from the abuse of facial recognition technologies, such as the mass screening of individuals in public spaces, the violation of the right to privacy, and, most definitely, discriminatory bias between different groups of people.


And now, the first question from any lawyer would be: what legislation is currently available to justify such measures? The answer is clear: there is no straightforward legal basis for the police’s use of live facial recognition surveillance in the UK.


Note one minor but fundamental detail: a surveillance camera – as defined in the Protection of Freedoms Act 2012 – is one thing; Live Automated Facial Recognition (LAFR), to which that Act makes no reference whatsoever, is another. Nor is the technology mentioned in the Data Protection Act 2018, which, despite its extensive application, does not provide a basis in law for its use.


Let’s move step by step, running through some recent key developments.


Following a written question from Layla Moran MP to the Home Office about legislation supporting the use of facial recognition and biometric tracking, the Minister for Policing, Nick Hurd MP, responded in September 2017, and I quote: “There is no legislation regulating the use of CCTV cameras with facial recognition”. Subsequently, the police claimed that its use was warranted as an application of the Protection of Freedoms Act 2012 and the Data Protection Act 2018; however, the Information Commissioner’s Office (ICO) – the UK’s independent data protection authority – distanced itself from that position, providing a more detailed one in its Opinion of 31 October 2019, Reference: 2019/01, which stresses that the above legislation cannot be treated as blanket cover for any application of facial recognition technology. The Opinion also finds an urgent need for better governance policies on the use of individuals’ personal data, and for risk assessments of the technologies used, guided by the criteria of necessity and proportionality.


The Equality and Human Rights Commission also published a report in March 2020 on civil and political rights in Great Britain, flagging that the legal framework authorising and regulating the use of automated facial recognition is still insufficient. As just mentioned above, it appears to rest exclusively on common law powers with no express statutory basis. What is more, there is no national-level coordination, oversight, or regulatory governance to ensure these applications comply with data protection law, including the General Data Protection Regulation (GDPR) in its UK GDPR version, which remains a fully binding source of law in the UK despite Brexit.


A special reference to data protection implications


Since the human face is a form of biometric data that provides unique and, in some respects, permanent identification of individuals with a sufficient degree of accuracy, data protection and biometric data represent another relevant point in this discussion. More recently, in June 2021, the ICO released another Opinion on the use of live facial recognition technology in public places, which narrows down some key data protection issues requiring urgent safeguards:

  • The automatic collection of biometric data at speed and scale without clear justification

  • The lack of control for individuals and communities

  • A lack of transparency

  • The technical effectiveness and statistical accuracy of LAFR systems

  • The potential for bias and discrimination

  • The governance of watchlists and LFR escalation processes

  • The processing of children’s and vulnerable adults’ data
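
The concerns about statistical accuracy and the potential for bias can be made concrete with a short, purely illustrative sketch of how a false match rate would be measured per demographic group. The record format and group labels below are hypothetical assumptions for the example, not real deployment data.

from collections import defaultdict

def false_match_rate_by_group(records):
    # records: iterable of (group, alerted, on_watchlist) tuples,
    # one per person who walked past the camera.
    false_alerts = defaultdict(int)  # wrongful alerts per group
    innocents = defaultdict(int)     # people not on any watchlist, per group
    for group, alerted, on_watchlist in records:
        if not on_watchlist:
            innocents[group] += 1
            if alerted:
                false_alerts[group] += 1
    # Share of innocent passers-by wrongly flagged, per group.
    return {g: false_alerts[g] / innocents[g] for g in innocents if innocents[g]}

If such rates differ markedly between groups – as the misidentified schoolchild in Case study 1 suggests they can – the system discriminates by design, which is precisely what the public sector equality duty analysis in Bridges (discussed below) turns on.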


A milestone case


R (Bridges) v The Chief Constable of South Wales Police is one of the leading cases regarding the use of automated facial recognition in the UK. Mr Bridges won his appeal in August 2020, with the Court of Appeal concluding that there is no adequate legal framework for the use of facial recognition technology.


If at the beginning of 2020 the Metropolitan Police in London was ready to launch new trials of facial recognition technology, probably reassured by the police’s win before the High Court a few months earlier, the Court of Appeal judgment simply turned the tables.

The legal issue is whether the current UK legal regime is adequate to ensure the appropriate and non-arbitrary use of LAFR in a free and civilised society. The common core of the case was LAFR and its implications for privacy and data protection rights: a facial template can be considered biometric data of an ‘intrinsically private character’, with the capacity to identify an individual uniquely and precisely.


The Court of Appeal delivered a unanimous judgment, setting out three main grounds on which South Wales Police’s use of LAFR technology was against the law:

  • Breach of Article 8 of the European Convention on Human Rights (the right to respect for private life), as the use was not ‘in accordance with the law’. As reported in the judgment, there are ‘fundamental deficiencies’ in the legal framework, which leaves too broad a discretion to individual police officers as to how and where the technology may be used

  • Breach of the DPA 2018, because the data protection impact assessment (DPIA) required by s.64 DPA failed to account for the Article 8 implications of the use of LAFR. In a nutshell: the DPIA failed to properly assess the risks to the rights and freedoms of individuals, and lacked an adequate strategy to address the deficiencies in the legal framework

  • Breach of the public sector equality duty (PSED), as reasonable steps were never taken to verify whether the technology was biased by design on racial or gender grounds; the whole purpose of the duty is ‘to ensure that a public authority does not inadvertently overlook information which it should take into account’


At the same time, note that nothing in the Court of Appeal’s judgment indicates that the use of LAFR should be considered unlawful per se. Despite its multiple misuses and the potential for discrimination based on sex, age, gender, or ethnicity, the collection process in facial recognition – unlike the taking of fingerprints or DNA samples – is generally not physically intrusive, and may therefore be sufficiently justified for the ‘prevention and detection of crime’. In other words, until an appropriate legal framework is in place, supported by a data protection impact assessment of the use of LAFR on individuals, any deployment of these systems is unlawful and must be stopped immediately.


Are we witnessing a legal ‘paradigm shift’ towards a ‘Predictive Policing’ law enforcement model for the prevention and detection of crime?


All things considered, it is crucial to provide a logical and regulatory explanation of the ratio legis behind the pre-emptive collection of data aimed at preventing and detecting crime before it is committed. As Koops has shown, a fundamental change is now underway: criminal law, and law enforcement as a direct consequence, is shifting from a tool of last resort to a primary instrument of social control that could be called ‘criminal risk governance’.


Let’s take a simple example from criminal law: in the traditional paradigm, once a dead body was found, the police started looking for traces, collecting witness statements and other sources of evidence. Now, before any murder is committed, society is compelled to structure its systems so that, should a murder ever be committed, evidence is more likely to be immediately available.


We are facing a paradigm shift. Where the past paradigm treated technological tools for law enforcement as an instrument of social control of last resort (ultima ratio), that archetype is slowly being replaced. The balance between repression and prevention, for both individuals and large groups of people, sits on a continuum. In the newly evolving paradigm, criminal law is a first resort for controlling perceived social risks (e.g., intelligence-led policing methods). This shift is not yet complete, nor is it desirable; nevertheless, the old reactive model is being replaced by a preventive one, with significant implications for regulation. As we can see, this evolution seems to be transforming, step by step, the entire approach of criminal law enforcement into the ‘Predictive Policing’ model, as the sketch below illustrates.
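
To make the contrast concrete, here is a toy sketch of the two models: a reactive lookup that starts from a committed crime versus a predictive score computed over everyone in advance. Every field name, weight, and threshold is a hypothetical assumption for illustration, not a description of any real policing system.

def reactive_lookup(crime_report, records):
    # Old paradigm: a crime has occurred; search existing records for evidence.
    return [r for r in records if r["location"] == crime_report["location"]]

def predictive_score(person):
    # New paradigm: score a person as a 'risk' before any crime exists.
    # The weights are purely illustrative.
    score = 0.5 * person.get("prior_police_contacts", 0)
    score += 0.3 * person.get("watchlist_similarity", 0.0)
    return score

def flag_population(population, threshold=1.0):
    # Pre-emptively flag individuals above a hypothetical risk threshold –
    # the kind of data-driven pre-selection this article criticises.
    return [p["id"] for p in population if predictive_score(p) > threshold]

The regulatory problem is visible even in the toy: the reactive function cannot run without a crime report, while the predictive functions operate on the whole population with no triggering event at all.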


Going with the flow, the current use of LFR and LAFR in the UK for law enforcement purposes is justified as protecting society’s common good and ensuring the highest level of security for the community; a fuzzy combination of statutory law, common law principles, case law, and governance policies is tailoring such a framework, while still threatening fundamental rights at their core.


We will stay on the lookout for the next developments.


Edited by Marco Mendola


