English Court of Appeal rules automated facial recognition technology an unlawful interference with ECHR rights
A civil liberties campaigner has succeeded on three of his five grounds of appeal against a decision of the Divisional Court (DC) that the use of automated facial recognition (AFR) technology by South Wales Police (SWP) was compatible with Article 8 of the ECHR.
The appellant, Edward Bridges, who was supported by the civil liberties organisation Liberty, argued that the DC had erred in concluding that SWP’s use of the technology was a proportionate interference with ECHR privacy rights. The appeal was opposed by the Chief Constable of South Wales Police, with the Home Secretary as an interested party and three interveners.
The appeal was heard in the Court of Appeal of England and Wales by Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh.
Compared against a database
AFR technology takes a digital image of a person’s face and compares it against a compiled database of images to determine whether the two images depict the same person. SWP held a licence to use proprietary AFR software, which they deployed to search for various categories of person, including persons wanted on warrants, individuals unlawfully at large, missing persons, and individuals whose presence at a particular event would be a cause for concern.
From May 2017 to April 2019 SWP trialled a deployment of the technology known as AFR Locate at a variety of public events. These deployments were publicised via social media and advertised on-site by both large posters and individual flyers. Although the deployments were not covert, it was reasonable to suppose that large numbers of people may have had their biometrics captured and processed by AFR without their knowledge.
The appellant complained about two particular occasions on which he believed he was caught on camera by the technology: once in a busy shopping area in Cardiff and once at a defence industry exhibition at the Motorpoint Arena, where he participated in a protest. On both occasions he stated that he was unaware AFR was in use until he was in close proximity to AFR-equipped vans.
The Divisional Court held that Article 8 was engaged by the use of the technology, but that the interference was justified and met the conditions in Article 8(2).
On appeal, it was argued that the DC had erred in concluding that the interference was in accordance with the law for the purposes of Article 8(2). It was also argued that SWP’s Data Protection Impact Assessment (DPIA) on the use of the technology was deficient, as it proceeded on the basis that Article 8 was not engaged, and that the DC was wrong to hold that SWP had complied with the Public Sector Equality Duty, as its Equality Impact Assessment was based on an error of law.
Fundamental deficiencies
The court accepted a large part of the DC’s analysis of AFR technology, but held that on further analysis there were deficiencies in the legal framework. The court said: “The fundamental deficiencies, as we see it, in the legal framework currently in place relate to two areas of concern. The first is what was called the ‘who question’ at the hearing before us. The second is the ‘where question’. In relation to both of those questions too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed.”
On the categories of person who could be placed on a watchlist, it said: “In effect it could cover anyone who is of interest to the police. In our judgment, that leaves too broad a discretion vested in the individual police officer to decide who should go onto the watchlist.”
On the locations in which the guidelines allowed the technology to be deployed, it said: “Our attention [was drawn] to page 21 of the DPIA, where it is said: ‘As we are testing the technology South Wales Police have deployed in all event types ranging from high volume music and sporting events to indoor arenas.’ That simply underlines the concern that we have in this context. First, it is a descriptive statement and does not lay down any normative requirement as to where deployment can properly take place. Secondly, the range is very broad and without apparent limits. It is not said, for example, that the location must be one at which it is thought on reasonable grounds that people on the watchlist will be present.”
On compliance with data protection legislation, it said: “This Ground of Appeal is correct insofar as it states that the DPIA proceeds on the basis that Article 8 is not engaged or, more accurately, is not infringed. We have found, when considering Ground 1 above, that AFR Locate fails to satisfy the requirements of Article 8(2), and in particular the ‘in accordance with the law’ requirement, because it involves two impermissibly wide areas of discretion: the selection of those on watchlists, especially the ‘persons where intelligence is required’ category, and the locations where AFR may be deployed.”
On the Public Sector Equality Duty, it said: “SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex. There is evidence […] that programs for AFR can sometimes have such a bias.”
It concluded: “We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”
For these reasons, the appeal was allowed in respect of those three grounds.