R (Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin) is apparently the world’s first reported court case on the use of Automated Facial Recognition (‘AFR’) technology. It is unlikely to be the last, particularly as the Claimant has been granted permission to appeal to the Court of Appeal, but also given growing public unease about the potential for AFR to discriminate against protected groups in the UK.

Bridges concerns the use of AFR in a pilot project by the South Wales Police (‘SWP’) force at large public events, where it captured images of passers-by and checked their biometric data against police watchlists of suspects.

The case discusses the delicate balance between use of new technologies to prevent crime and the protection of individual rights. It ultimately concludes that the SWP’s use of AFR in the pilot was lawful and that the current legal regime is adequate to regulate it.

The AFR pilot

During the 2017 UEFA Champions League Final at Principality Stadium in Cardiff, the SWP deployed AFR in a busy high street using a marked van from 8am until 4pm. The aim was to identify and locate wanted offenders and suspects of varying degrees of priority on several local watchlists. Out of 10 matches recorded during that time, 2 were errors. As a result of the deployment, 2 arrests were made (including one of a wanted domestic violence offender).

The SWP also deployed AFR at an arms exhibition at the Motorpoint Arena, an event known to attract disruption. A similar method was applied, but no arrests were made.

The judicial review application

Mr Bridges claimed that he was present on both occasions and that his image was captured by the SWP. He alleged that the use of AFR on these occasions:

  1. amounted to disproportionate breaches of his protected right to private life under Article 8 of the European Convention on Human Rights (‘ECHR’);
  2. breached the first data protection principle in sections 34 and 35 of the Data Protection Act 2018 (as well as its predecessor in section 4(4) of the Data Protection Act 1998) and was done without an adequate data protection impact assessment as required by section 64 of the DPA 2018;
  3. failed to adhere to the public sector equality duty in section 149 of the Equality Act 2010. He argued that the police impact assessment document did not consider that the AFR software would disproportionately misidentify women and persons from minority ethnic groups, putting those persons at a disadvantage when compared to others.

What is AFR?

Anyone interested in the procedure used by the technology itself should read § 24 of the judgment. In brief:

A CCTV camera simply captures digital video recordings. AFR technology uses that digital information to isolate pictures of individual faces, extract information about facial features from those pictures, compare that information with the watchlist information, and indicate matches between faces captured through the CCTV recording and those held on the watchlist.

When the software identifies a possible match between a face captured on the recording and an image on the watchlist, a police officer is required to review the images and determine if they believe that the match is correct.

If a person who is not on the watchlists is caught by the AFR camera, their facial biometric data processed on the AFR system is immediately deleted.
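
For readers who want to see the moving parts, the pipeline described above can be sketched in a few lines of Python. This is purely illustrative and is not the actual AFR Locate implementation: the helper names (detect_faces, extract_embedding), the use of cosine similarity and the 0.8 threshold are all assumptions made for the example.

```python
# Illustrative sketch only: a minimal face-matching loop mirroring the steps
# described above. detect_faces and extract_embedding are hypothetical helpers
# standing in for whatever face-detection and feature-extraction components a
# real AFR system would use; the similarity threshold is an arbitrary example.
from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class WatchlistEntry:
    name: str
    embedding: Sequence[float]  # pre-computed facial feature vector


def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def process_frame(
    frame,
    watchlist: List[WatchlistEntry],
    detect_faces: Callable,       # isolates pictures of individual faces
    extract_embedding: Callable,  # extracts facial feature information
    threshold: float = 0.8,
):
    """Compare each face in a CCTV frame against the watchlist and return
    possible matches for human review; non-matching biometric data is not
    retained, mirroring the immediate-deletion step described above."""
    alerts = []
    if not watchlist:
        return alerts
    for face in detect_faces(frame):
        embedding = extract_embedding(face)
        best = max(watchlist, key=lambda w: cosine_similarity(embedding, w.embedding))
        score = cosine_similarity(embedding, best.embedding)
        if score >= threshold:
            # Possible match: a police officer must still review the images and
            # decide whether they believe the match is correct before acting.
            alerts.append((best.name, score))
        # Faces below the threshold are simply discarded at this point.
    return alerts
```

The legally significant features tracked by this sketch are the separation between automated matching and the mandatory human review, and the non-retention of passers-by’s biometric data.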

Photo by Lianhao Qu on Unsplash

In all, the deployments scanned an astonishing 500,000 or so faces between 2017 and 2018.

The Article 8 claim

A breach of Article 8 ECHR will occur in this context if the measure in question (i) interferes with Article 8 rights and the Defendant cannot show that the measure is both (ii) in accordance with the law and (iii) a necessary and proportionate means of preventing crime and disorder.

Interference with Article 8

The Claimant argued that his Article 8 ECHR right to private life had been breached by having his facial biometric data obtained without his consent. The Defendant denied there was proof that the Claimant’s image had been captured (the data having been deleted in any event) and argued that, even if it had been, there is no general right to privacy in a public place and the processing of the Claimant’s data was not so serious an intrusion into his privacy as to breach Article 8.

The Court found that AFR in this instance went much further than the simple capturing of a person’s image. Drawing on the analysis of retained DNA samples and fingerprint records by the European Court of Human Rights in the case of S v United Kingdom (2009) 48 EHRR 50, the High Court held that the capture, storage and processing of a person’s biometric facial data by AFR triggers the protections in Article 8 of the ECHR.

In accordance with the law

The Claimant’s argument was two-pronged. Firstly, he argued that, in the absence of a statutory power specific to the use of AFR, the SWP’s use of the technology was outside its powers. The SWP relied on its general common law powers to detect and prevent crime. The Court’s view was that no specific statutory framework was needed because the SWP’s common law powers to prevent and detect crime were ample to cover its use of AFR.

Secondly, the Claimant contended that the SWP’s use of AFR was not regulated by law and that the technology had outstripped the legal frameworks in the Data Protection Acts of 1998 and 2018, rendering it an unlawful interference with his Article 8 rights. Regulators intervening in this case, namely the Information Commissioner’s Office (‘ICO’) and the Surveillance Camera Commissioner, argued that AFR needed further limits and specifications in law.

On this point, the Court concluded that the combination of primary legislation, secondary legislation and SWP policies meant that ‘there is a clear and sufficient legal framework governing whether, when and how AFR Locate may be used’ (§ 84).

There was no need for the creation of a brand-new legal framework to govern AFR. The data protection principles in sections 35 to 42 of the DPA 2018 laid down sufficient safeguards. These were supplemented by the 12 guiding principles in the statutory guidance (the Surveillance Camera Code of Practice) and the SWP’s own policy document produced in advance of the pilot. Together, these were sufficient to avoid arbitrary interference with Article 8 rights, though there will be room for re-evaluation and improvement over time (§ 97).

Necessary and proportionate

As to whether the use of AFR was necessary and proportionate, the Court concluded that the SWP had struck a fair balance between the interests of the community and individual rights. AFR was being used overtly and transparently (publicised on Twitter, Facebook and via leaflets and signs). It was deployed for a limited time and purpose at specific locations. It had led to the arrests of two suspects and no wrongful arrests. In the circumstances, the effect of the AFR pilot on the Claimant’s privacy and data rights was minimal:

101. …The interference would be limited to the near instantaneous algorithmic processing and discarding of the Claimant’s biometric data.

The Court rejected the Claimant’s arguments that the scope and purpose of the AFR deployments were excessive, finding that the process was well targeted and justified to detect and prevent crime. Since 2017, some 37 cases had led to arrests or disposals and the SWP had saved on the time and resources usually spent searching for individuals. The Court noted, however, that there had to be good reasons to put persons on watchlists in the first place:

105. … The inclusion of any person on any watchlist and the consequent processing of that person’s personal data without sufficient reason would most likely amount to an unlawful interference with their own Article 8 rights.  

The judgment emphasises that proportionality tends to be fact sensitive and any future use of AFR would need to be tested on its own merits and possibly with the regulatory involvement of the ICO and Surveillance Camera Commissioner.

The Data Protection claims

Are facial biometrics ‘personal data’?

The first question was whether the Claimant’s facial biometrics were ‘personal data’, that is, data from which he was identified or identifiable, as defined in section 1 of the DPA 1998 (now section 3 of the DPA 2018). The SWP argued that the answer was ‘no’, on the basis that it only held identifiable data for those on the watchlists and those who matched persons on the watchlists; other images of members of the public were deleted immediately after the algorithm had checked them. Unsurprisingly, the Court rejected this argument.

The Court was not prepared to find that the Claimant could be indirectly identified by reference to his facial biometrics combined with further information, even on the expansive approach to ‘personal data’ taken by the CJEU in Breyer v Bundesrepublik Deutschland (Case C-582/14), in which dynamic IP addresses were found to constitute ‘personal data’. That possibility was too speculative.


However, the Court referred to the interim decision in Vidal-Hall v Google Inc. [2016] QB 1003, a case concerning whether a person could be identified directly via browser-generated information about their internet usage. There, the High Court concluded that such data singled out individual users, making them directly identifiable, and that it therefore satisfied the definition of ‘personal data’. In Bridges, the Claimant’s facial biometric data was likewise sufficient to single him out from others and identify him directly.

Breach of the first data protection principle in section 4(4) DPA 1998

The Claimant argued that the SWP had breached the first data protection principle which requires that personal data must be processed lawfully and fairly. However, in light of the Court’s overlapping conclusions under Article 8, the Court was satisfied that this principle had been complied with.

Breach of first data protection principle in section 35 of the DPA 2018

The Court had to decide whether the Defendant had breached the principles imposed on law enforcement data processing by sections 34 and 35 of the DPA 2018.

The first issue was whether the use of AFR involved the processing of biometric data of members of the public for the purpose of uniquely identifying an individual. If it did, then it would be classed as ‘sensitive processing’, with additional legal requirements. The Defendant argued that the purpose of processing biometric data of members of the public was not to identify them but to identify individuals on the watchlist. Again, unsurprisingly, this argument was rejected. The definition of sensitive processing in section 35(8) was wide enough to capture both the processing of biometric data of members of the public and suspects on the watchlist. Article 9(1) of the GDPR, which the DPA 2018 implements domestically, supports this conclusion by referring broadly to ‘biometric data [processed] for the purpose of uniquely identifying a natural person’.

The second issue was whether the Defendant had complied with the three requirements in section 35 of the DPA 2018 (i.e. the processing must be strictly necessary for law enforcement purposes, must meet one of the conditions in Schedule 8 justifying such processing, and there must be an appropriate policy document in place). The first and second requirements were covered by the discussion of legality and proportionality under Article 8, and for the same reasons the Court was satisfied that they had been complied with. As for the appropriate policy document, section 42 of the DPA 2018 requires the data controller to explain its compliance with the data protection principles and its policies on the retention and deletion of personal data. The Court was doubtful that the SWP’s policy document was fit for purpose, but it did not wish to interfere with the policy, leaving it to the ICO to provide further guidance.


Thirdly, the Claimant alleged that the SWP had failed in its obligation to conduct an adequate data protection impact assessment (‘DPIA’). On the appropriate standard of the Court’s review of a DPIA under section 64 of the DPA 2018, it observed:

146. …The notion of an assessment brings with it a requirement to exercise reasonable judgement based on reasonable enquiry and consideration… when conscientious assessment has been brought to bear, any attempt by a court to second-guess that assessment will overstep the mark.

Judged in this way, the SWP’s DPIA met the core requirements of section 64 of the DPA 2018 (§ 148) as it considered the risk to privacy rights and how to address the risks with appropriate safeguards.

The Equality Act 2010 claim

Lastly, the Court considered the Claimant’s allegation that the SWP had failed to discharge its public sector equality duty in section 149 of the Equality Act 2010. The SWP had carried out an Equality Impact Assessment (‘EIA’) early in the pilot. The Claimant’s criticism was that it failed to consider the potential that AFR may indirectly discriminate on grounds of sex and/or race by producing more false matches for women or ethnic minorities.

The Court dismissed this argument. The evidence that the AFR software produced discriminatory outcomes rested on one expert witness, Dr Anil Jain. His report observed that if the SWP did not know the contents of the dataset used to train the algorithm, then the SWP could struggle to confirm whether the technology itself was biased.

The SWP’s witness evidence suggested that a higher proportion of women than men were falsely matched. The explanation offered was that two of the female faces on the watchlists had ‘significant generic features’ (known in the software industry as ‘lambs’). On ethnic bias, there was no evidence to suggest that the system was discriminatory.

The Court concluded that, given the limitations of investigating the point without full access to the original datasets, the SWP may wish to re-evaluate the AFR software for discriminatory bias. The SWP could not be criticised for proceeding with the pilot without considering indirect discrimination on grounds of sex and/or race, in the absence of any evidence at the time that this was a problem. Importantly, human intervention helped to reduce the risk of false positives. In the context of a pilot, the public sector equality duty requires continual review of equality impacts in light of how the process develops (§ 158).
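
As an aside for operators, the kind of ongoing review the Court has in mind could be as simple as tracking, per deployment, how often human reviewers reject the system’s suggested matches for different groups. The sketch below is a hypothetical illustration only; the field names and sample figures are invented and do not come from the judgment.

```python
# Hypothetical monitoring sketch: compute the rate at which suggested matches
# were rejected on human review, broken down by group, to flag possible
# indirect discrimination for further investigation. All data is invented.
from collections import defaultdict


def false_match_rates(alerts):
    """alerts: iterable of dicts like {"group": "female", "confirmed": False},
    where 'confirmed' records whether the reviewing officer upheld the match."""
    totals = defaultdict(int)
    rejected = defaultdict(int)
    for alert in alerts:
        totals[alert["group"]] += 1
        if not alert["confirmed"]:
            rejected[alert["group"]] += 1
    return {group: rejected[group] / totals[group] for group in totals}


sample = [
    {"group": "female", "confirmed": False},
    {"group": "female", "confirmed": True},
    {"group": "male", "confirmed": True},
    {"group": "male", "confirmed": True},
]
print(false_match_rates(sample))  # {'female': 0.5, 'male': 0.0}
```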

Conclusions

As reported in the press, the unsuccessful Claimant is taking his legal challenge to the Court of Appeal and therefore the stability of the High Court’s findings and guidance is uncertain. All the while, the use of AFR seems likely to grow.

There are a number of points here for the operators of AFR to consider carefully, such as:

  1. the need to justify the inclusion of any individual on a watchlist of faces;
  2. the importance of human intervention in potentially correcting discriminatory biases in the algorithm;
  3. the need for fully transparent studies of the original dataset used to train the algorithm before it may be said that the system is unlikely to discriminate on protected grounds; and
  4. that ongoing evaluation of policies and guidance is essential to keep in step with an evolving and multi-layered legal and regulatory framework.

It will be interesting to see what the Court of Appeal make of the largely fact-specific conclusions on Article 8 and the decisions to dismiss the data protection and Equality Act claims. The outcome is likely to have a significant impact on the use of AFR by public and private bodies in the UK, an issue that is already being hotly contested by campaign groups such as Liberty, who act for Mr Bridges.

If you require expert advice on technology and equality law, please contact me.

Posted by Ben Amunwa

Founder and editor of Lawmostly.com. Ben is a commercial and public law barrister with The 36 Group. He gives expert legal advice on employment, public law and commercial disputes to a wide range of clients.
