
Police's Automated Facial Recognition Deployments Ruled Unlawful by the Court of Appeal

R. (Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058; [2020] 8 WLUK 64 is thought to be the first case in the world to consider the use of facial recognition technology by law enforcement agencies. In this short article, we explore the judgment and its implications for the deployment of these and similar technologies in future.


Background

In June 2017, South Wales Police (SWP) began a trial of cameras using Automated Facial Recognition (AFR) software in a variety of public spaces throughout Cardiff, including train stations, football stadiums and an arms fair.  

The pilot involved the deployment of surveillance cameras to capture digital images of members of the public. These digital images were then processed by a mathematical algorithm to produce a biometric template, which was compared with the biometric templates of people on watchlists compiled by the police. The watchlists were created from images held on databases maintained by the police, including custody photos. They were not limited to persons believed to be unlawfully at large or suspected of having committed crimes, but also included persons of “possible interest” to the police. If a person’s biometric data did not match a template on one of the watchlists (i.e. in the majority of cases), it would be automatically deleted from the AFR system. It is estimated that as many as 500,000 members of the public in the Cardiff area had their faces scanned by the AFR cameras between 2017 and 2018.
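To make that pipeline concrete, here is a minimal sketch of the match-and-delete logic described above. It is purely illustrative: the template function, the similarity score and the threshold are invented assumptions for the sketch, not details of SWP’s actual (proprietary) system.

```python
# Illustrative sketch of the AFR pipeline described above. NOT SWP's
# implementation: face_template, similarity and MATCH_THRESHOLD are
# placeholder assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    template: list[float]  # biometric template derived from e.g. a custody photo

def face_template(image_pixels: list[float]) -> list[float]:
    """Placeholder for the proprietary algorithm that maps a captured face
    image to a numerical biometric template."""
    return list(image_pixels)

def similarity(a: list[float], b: list[float]) -> float:
    """Toy similarity score between two templates (higher means closer)."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

MATCH_THRESHOLD = -0.1  # assumed operator-set threshold

def process_frame(image_pixels: list[float], watchlist: list[Candidate]) -> str | None:
    """Compare one captured face against the watchlist. If nothing matches,
    the biometric template is discarded immediately, mirroring the automatic
    deletion that the Court later treated as an important safeguard."""
    if not watchlist:
        return None
    template = face_template(image_pixels)
    best = max(watchlist, key=lambda c: similarity(template, c.template))
    if similarity(template, best.template) >= MATCH_THRESHOLD:
        return best.name  # possible match, flagged for a human operator to review
    del template  # no match: biometric data deleted from the AFR system
    return None

watchlist = [Candidate("wanted on warrant", [0.9, 0.1]),
             Candidate("possible interest", [0.2, 0.8])]
print(process_frame([0.88, 0.12], watchlist))  # "wanted on warrant"
print(process_frame([0.5, 0.5], watchlist))    # None: no match, data deleted
```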

The Claimant contended that he was present and caught on camera on two occasions in 2017 when SWP deployed the AFR software. He challenged the use of AFR on those occasions and generally. He contended, amongst other things: (1) that the use of AFR interfered with his privacy rights under Article 8 of the European Convention on Human Rights and that the interference had not been “in accordance with the law” and/or had been disproportionate; (2) that the defendant’s processing of his data through the use of AFR was contrary to his rights to the protection of his personal data under the Data Protection Acts 1998 and 2018; and (3) that the use of AFR was in breach of the police’s public-sector equality duty (“PSED”) under s.149 of the Equality Act 2010.


The High Court

The Divisional Court unanimously held that the use of AFR was lawful. Specifically, the Court found that, although Article 8 was engaged, SWP’s use of AFR was proportionate and in accordance with the law, as set out in the Data Protection Act 2018, the Surveillance Camera Code of Practice, and SWP’s own policies concerning the use of AFR. The Court considered that, together, these documents amounted to a “clear framework” governing when and how AFR may be used, sufficient to satisfy the requirements of foreseeability, predictability, and legality under Article 8(2) of the ECHR. The Court also found that the use of AFR complied with the Data Protection Acts 1998 and 2018 and with the public sector equality duty.


Court of Appeal

The Claimant appealed to the Court of Appeal.  The appeal was heard by the Master of the Rolls, the President of the Queen’s Bench Division, and Singh LJ (President of the Investigatory Powers Tribunal).  The Court of Appeal held that the police’s development and use of AFR was unlawful on three grounds. 


Sufficiency of the legal regime

The Claimant won primarily on the basis that the overall legal framework was not sufficient to satisfy the ‘in accordance with the law’ requirement in Article 8(2). The Court concentrated on the local deployment of AFR in the South Wales area and the legal framework which applied to the deployments in question.

The legal principles governing the requirement under Article 8 for the state to put in place a “sufficient legal framework” were not in dispute, following the decisions of R. (Catt) v Association of Chief Police Officers of England, Wales and Northern Ireland [2015] A.C. 1065 and Re Gallagher [2020] A.C. 185.   

In relation to the relative importance the Court placed on the use of AFR compared with other instances of the collection and retention of information by the police, it is noteworthy that the Court considered the use of AFR by SWP to be ‘far removed’ from, for example, the indiscriminate and blanket retention of DNA profiles and fingerprints of individuals not convicted of any offence (S v UK), but less intrusive than the police’s retention of personal information on a national domestic database of people attending but not participating in public protests (Catt). The Court rejected SWP’s argument that the use of AFR was analogous to the use of CCTV or the taking of photographs by the police. This was because the technology was ‘novel’; it captured and automatically processed digital information and images of a large number of members of the public; and the information was ‘sensitive’ personal information for the purposes of the DPA 2018.

The Court went on to identify two “fundamental deficiencies” in the current legal framework. The first related to who is being targeted by the technology; the second to where the technology was to be deployed. The Court stated that, on both issues, “too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed” [91]. In particular, the Court stated that the automatic and instantaneous deletion of the data of anyone who does not match a person on the watchlist should be a legal requirement of an adequate legal framework [93].

The Court accepted that the DPA 2018, the Surveillance Camera Code of Practice, and SWP’s local policies were all part of the legal framework but, after a detailed analysis of these documents, the Court was not convinced that any of these sources provided sufficient detail as to who should be placed on a watchlist or the locations at which AFR should be deployed. In particular, the Court rejected the argument that SWP’s Privacy Impact Assessment answered the ‘who’ question when it stated that the watchlist could include ‘persons wanted on suspicion for an offence, wanted on warrant, vulnerable persons and other persons where intelligence is required’. The Court stated that the final category was not objective and ‘could cover anyone who is of interest to the police…that leaves too broad a discretion vested in the individual police officer to decide who should go onto the watchlist’ [123].

SWP have said they will not appeal the judgment, indicating that they are prepared to adapt their policies to address it. This is likely to mean further particularisation of the types of people and contexts ‘where intelligence is required’, and of the kind of ‘intelligence’ being sought, such that a person would reasonably be able to predict whether they might be placed on a watchlist. It is noteworthy that the Court was also not impressed by the local policy which stated, generally, that “watchlists will be both proportionate and necessary for each deployment with the rationale for inclusion detailed pre-event in the AFR Locate deployment report”, as this “does not govern the question of who can be put on a watchlist in the first place” [129]. The Court said less about the ‘where’ question, commenting only that there was no real guidance on this issue in the documents, but also that:-

"[96] It might even be that, once the “who” question can be satisfactorily resolved, that will give clear guidance as to the “where” question. This is because it will often, perhaps always, be the case that the location will be determined by whether the police have reason to believe that people on the watchlist are going to be at that location."


Validity of the police’s data protection impact assessment

The Court of Appeal also overturned the High Court’s finding that the data protection impact assessment required by s.64 of the Data Protection Act 2018 was adequate. Specifically, the impact assessment failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to mitigate the risks arising from the deficiencies identified. This was an inevitable finding once the Court had found a breach of Article 8.


Public sector equality duty

The Court of Appeal found that the police had never investigated whether AFR had an unacceptable bias on grounds of race or gender, as the PSED required. The Court of Appeal stated that the potential for racial bias in AFR software, with a risk of disproportionate false identifications of people from black, Asian and other minority ethnic backgrounds, raised a “serious issue of public concern” which should have been considered properly by the police. The fact that the technology was being piloted made no difference to the PSED. The Court emphasised that the introduction of safeguards and analysis after the introduction of AFR was not a suitable replacement for doing “everything reasonable which could be done…in order to make sure that the software used does not have a racial or gender bias” before deploying the technology. The Court of Appeal therefore found that the force had not done all it reasonably could to fulfil the PSED.
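As a rough illustration of the kind of pre-deployment analysis the Court had in mind, the sketch below compares false-match rates across demographic groups. The data structure and the numbers are entirely fabricated for illustration; the point is only that the disparity question is straightforward to frame once the relevant performance data is obtained, which, the Court found, SWP never did.

```python
# Sketch of a per-group false-match rate comparison, the kind of bias
# testing the PSED called for before deployment. All data is fabricated.
from collections import defaultdict

def false_match_rates(trials: list[dict]) -> dict[str, float]:
    """Each trial records a subject's demographic group and whether the
    system wrongly flagged them as matching a watchlist entry."""
    counts = defaultdict(lambda: [0, 0])  # group -> [false matches, total trials]
    for t in trials:
        counts[t["group"]][0] += t["false_match"]
        counts[t["group"]][1] += 1
    return {group: fm / total for group, (fm, total) in counts.items()}

# Fabricated example: group B is falsely matched more often than group A,
# the sort of disparity that would have to be investigated and justified.
trials = ([{"group": "A", "false_match": 1}] * 2
          + [{"group": "A", "false_match": 0}] * 98
          + [{"group": "B", "false_match": 1}] * 5
          + [{"group": "B", "false_match": 0}] * 95)
print(false_match_rates(trials))  # {'A': 0.02, 'B': 0.05}
```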

The judgment is important for the emphasis it places on the PSED, saying that, as well as leading to better informed decisions, its application “helps to reassure members of the public, whatever their race or sex, that their interests have been properly taken into account before policies are formulated or brought into effect” [176]. The Court traced the origin of the PSED back to the Stephen Lawrence Inquiry Report in 1999 and the new section 71 inserted into the Race Relations Act 1976, which required the police, amongst others, to have “due regard” to the need “to eliminate unlawful racial discrimination” and “to promote equality of opportunity and good relations between persons of different racial groups”, to give effect to the Inquiry’s findings of “institutional racism”. Summarising on this issue, the Court commented:-

"[179] Public concern about the relationship between the police and BAME communities has not diminished in the years since the Stephen Lawrence Inquiry Report. The reason why the PSED is so important is that it requires a public authority to give thought to the potential impact of a new policy which may appear to it to be neutral but which may turn out in fact to have a disproportionate impact on certain sections of the population."


Conclusion

The Court’s decision is not the wholesale criticism of AFR that some hoped for and, indeed, the Court did not disturb the Divisional Court’s finding that the particular deployments of AFR were proportionate for the purposes of Article 8(2). It seems clear that the Court is content for AFR to be added to the armoury of digital technology available to the police. But this will only be the case where local policies make clear the criteria used to decide why a person has been placed on a watchlist, and why AFR has been deployed at a particular location.

But the judgment leaves an unsatisfactory situation in which there are no national statutory provisions or guidance on these issues, and each of the 43 police forces that intends to use AFR will have to develop its own framework to comply with Article 8.

Tags

article 8 echr, article 8, police, actions against the police, facial recognition, information security