Global Freedom of Expression

R v. the Chief Constable of South Wales Police

Case Status: Closed
Decision Direction: Expands Expression

Key Details

  • Mode of Expression
    Non-verbal Expression, Public Assembly
  • Date of Decision
    August 11, 2020
  • Outcome
    Motion Granted, Law or Action Overturned or Deemed Unconstitutional
  • Case Number
    C1/2019/2670
  • Region & Country
    United Kingdom, Europe and Central Asia
  • Judicial Body
    Appellate Court
  • Type of Law
    Civil Law
  • Themes
    Digital Rights, Privacy, Data Protection and Retention
  • Tags
    Data Protection and Retention, Facial Recognition


Case Analysis

Case Summary and Outcome

The Court of Appeal Civil Division unanimously held that the South Wales Police’s use of Automatic Facial Recognition Locate technology, both in general and specifically on December 21, 2017, and March 27, 2018, violated the right to privacy under Article 8 of the European Convention on Human Rights, breached the United Kingdom Data Protection Act, and failed to comply with the Public Sector Equality Duty. The petitioner, Mr. Edward Bridges, appealed the judgment of the Divisional Court of the Queen’s Bench Division on the grounds that the use of Automatic Facial Recognition Locate technology violated the Article 8 ECHR right to privacy and had the potential to produce discriminatory effects and to deter freedom of expression and the right of assembly. For its part, the South Wales Police argued that the use of the technology was within its powers, that it was lawful, and that notice of its use was given to members of the public attending public events. The Court of Appeal Civil Division held that the use of Automatic Facial Recognition technology did not meet the legality requirement of the right to privacy in Article 8 of the European Convention on Human Rights because it left excessive discretion to the South Wales Police. Furthermore, the Court held that the South Wales Police could not demonstrate that the use of such technology would not have a detrimental effect on other rights of the public or that it did not have the potential to produce discriminatory effects.


Facts

On October 3, 2018, the petitioner, Mr. Edward Bridges, sued the Chief Constable of the South Wales Police in the United Kingdom over the use of Automatic Facial Recognition technology by the South Wales Police Force, in particular the deployment of a system called “Automatic Facial Recognition Locate”. The petitioner challenged the use of the Automatic Facial Recognition Locate system in “general” and also its use on two “specific occasions” on which he alleged that his facial biometric data was captured by the cameras: (i) on December 21, 2017, in Queen Street, a commercial area of Cardiff; and (ii) on March 27, 2018, at the Defense Procurement, Research, Technology, and Exportability Exhibition held at the Motorpoint Arena. Regarding the latter event, the petitioner noted that he had attended a protest against the exhibition in front of the Motorpoint Arena. Furthermore, Mr. Edward Bridges argued that the South Wales Police did not give notice that they were going to deploy Automatic Facial Recognition Locate technology on either occasion.

Additionally, the petitioner argued that the use of Automatic Facial Recognition technology violated the right to privacy under Article 8, the right to freedom of expression under Article 10, and the right to assembly under Article 11 of the European Convention on Human Rights (ECHR). Mr. Bridges further argued that the technology was contrary to the Data Protection Act because it captured the biometric data of individuals without their consent. In addition, he claimed that the use of Automatic Facial Recognition Locate had discriminatory effects and violated the Public Sector Equality Duty. The petitioner also sought compensation for the damages suffered.

The respondent, the Chief Constable of South Wales Police (Heddlu De Cymru), explained that the police hold a license to use Automatic Facial Recognition software called “NeoFace Watch”, developed by North Gate Public Services Ltd of the United Kingdom. He also stated that this technology can detect persons who have absconded from lawful custody, persons suspected of having committed crimes, persons who may be in need of protection, persons whose attendance at an event may be of particular concern, persons who are of interest to the police intelligence service, and vulnerable persons. The respondent further stated that the Automatic Facial Recognition Locate system had been used on fifty occasions between May 2017 and April 2019 at public events and that members of the public were informed about its use in the areas where it was deployed. He also asserted that it was not possible for the South Wales Police to verify whether the Automatic Facial Recognition Locate software captured biometric images of the petitioner on December 21, 2017, or March 27, 2018. In addition, the respondent argued that the technology had not been shown to have discriminatory effects.

On September 4, 2019, the Divisional Court of the Queen’s Bench Division dismissed Edward Bridges’ claim. The Divisional Court held that the case involved a legitimate limitation on the right to privacy under Article 8 of the European Convention on Human Rights, characterizing facial biometric data as inherently private. However, the Divisional Court determined that the use of Automatic Facial Recognition Locate technology by the South Wales Police was lawful because it was regulated under the Surveillance Camera Code of Practice and the Data Protection Act. Furthermore, the Divisional Court held that it was neither necessary nor practical for legislation to define the precise circumstances in which Automatic Facial Recognition Locate may be used. In addition, the Divisional Court held that the use of Automatic Facial Recognition Locate technology was within the powers of the South Wales Police to carry out its policing functions.

Next, the Divisional Court held that the use of Automatic Facial Recognition Locate technology was proportionate and that its deployments on December 21, 2017 (Queen Street) and March 27, 2018 (Motorpoint Arena) were justified. On this point, the Divisional Court held that the technology only involves the processing of “personal data” of individuals on “watch lists” compiled by the police, because they are the only ones whose name and personal identification can be recognized by the software. Accordingly, the Divisional Court determined that the use of the technology was strictly necessary for the crime prevention purposes entrusted to the South Wales Police.

Finally, the Divisional Court rejected the petitioner’s argument that the use of Automatic Facial Recognition Locate was discriminatory, holding that there was no evidence that the technology was operated in a discriminatory manner.

The petitioner, Mr. Edward Bridges, appealed the September 4, 2019 judgment of the Divisional Court of the Queen’s Bench Division. Mr. Bridges argued that the Divisional Court’s judgment validated an Automatic Facial Recognition system that violated the human right to privacy under Article 8 of the ECHR. He also argued that the system violated the Data Protection Act and the Public Sector Equality Duty under the Equality Act by failing to prevent possible discriminatory effects.


Decision Overview

The Court of Appeal Civil Division had to decide whether the South Wales Police’s use of Automatic Facial Recognition Locate technology, both generally and specifically on December 21, 2017, and March 27, 2018, violated the right to privacy under Article 8 of the European Convention on Human Rights, the Data Protection Act, and the Public Sector Equality Duty. The Court unanimously held that the respondent violated Article 8 of the Convention and the Data Protection Act, and failed to comply with the Public Sector Equality Duty.

The petitioner, Mr. Edward Bridges, appealed the judgment of the Divisional Court of the Queen’s Bench Division on the grounds that the use of Automatic Facial Recognition Locate technology did not constitute a legitimate limitation on the right to privacy under Article 8 of the European Convention on Human Rights. The petitioner argued that the Divisional Court erred in finding that the use of such technology was legitimate and proportionate because it failed to consider the cumulative interference with the privacy of all individuals whose biometric data is captured without their consent. Mr. Bridges also complained that the Divisional Court did not assess the risk of indirect discrimination associated with this type of technology.

First, the Court of Appeal Civil Division explained what the Automatic Facial Recognition system consists of. According to the Court, Automatic Facial Recognition technology makes it possible to verify whether two different images belong to the same person on the basis of biometric data (i.e., measurements of facial features) extracted from a digital photograph of a face and compared with facial biometric data from images stored in a database.

The Court explained that such technology has a function called “Automatic Facial Recognition Locate” that captures images of the face of any person passing in front of video surveillance cameras installed in police vehicles or on poles on public roads. The Court added that the digital images of citizens’ faces are captured and processed in real time to extract facial biometric information, which is then compared with the biometric information on a “watch list” prepared by the South Wales Police. In addition, the Court mentioned that the watch list includes biometric information of persons wanted under warrants; persons who have absconded from lawful custody; persons suspected of having committed crimes; persons who may be in need of protection (e.g. missing persons); persons whose attendance at an event may be of particular concern; persons who are of interest to the police intelligence service; and vulnerable persons.

The Court specified that a “biometric template” is obtained from the watch list images and used to make algorithmic comparisons with the facial biometric data obtained from the faces of citizens attending public events. The Court remarked that if the software detects a possible match between an image captured by the video surveillance cameras and an image on the watch list, the police officers in the area are notified so that they can take the measures they deem necessary (e.g. arrest the person or simply question them). Furthermore, the Court clarified that, if there is no match between the images, the Automatic Facial Recognition Locate software does not retain the facial biometric data or the image of the persons whose faces are scanned.

Furthermore, the Court acknowledged that the South Wales Police publicizes that it will use Automatic Facial Recognition technology in areas where events are taking place and surveillance cameras are installed. However, the Court held that although the deployment of the technology is not “covert”, it is reasonable to assume that many people are not aware that their facial biometric data is captured and processed by the Automatic Facial Recognition Locate software. The Court remarked that the main feature of this technology is that it “enables facial biometrics to be procured without requiring the co-operation or knowledge of the subject or the use of force, and can be obtained on a mass scale” [para. 23].

Second, the Court found that on December 21, 2017, and March 27, 2018, the South Wales Police did indeed use Automatic Facial Recognition technology. The Court noted that, as a result of the deployment of the technology, the police were able to arrest two persons on December 21, 2017, through matches verified by the software. The Court also stated that on March 27, 2018, the Defense Exhibition was held at the Motorpoint Arena in Cardiff, an event which in previous years had attracted disorder and protests that in some cases involved crimes (such as, for example, two hoax calls alleging the existence of a “bomb” in order to disrupt the event). However, the Court noted that no arrests were made on March 27, 2018.

Third, the Court of Appeal Civil Division had to examine whether the use of Automatic Facial Recognition Locate technology by the South Wales Police had a sufficient legal framework, i.e. whether it was lawful under the European Convention on Human Rights. On this issue, the Court held that the implementation of the technology did not have a sufficient legal framework as required by Article 8 of the European Convention on Human Rights. The Court affirmed that, according to the ECtHR cases Sunday Times v. United Kingdom (1979), Silver v. United Kingdom (1983) and Malone v. United Kingdom (1984), for the implementation of the technology to be legal it must have some basis in domestic law and be compatible with the rule of law. Under these premises, the Court remarked that the legal framework suffered from two main shortcomings: the first deficiency is “what was called the ‘who question’ at the hearing before us. The second is the ‘where question’. In relation to both of those questions, too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where [Automatic Facial Recognition Locate] can be deployed” [para. 91].

The Court also noted that Automatic Facial Recognition Locate is a novel technology that involves the automated capture and processing of biometric data from a large number of members of the public in circumstances where most of that information will be of no interest to the police. At the same time, the Court held that the United Kingdom legal framework does not clearly specify who may be included in the watch lists drawn up by the police for the use of Automatic Facial Recognition Locate, nor does it establish where such technology may be deployed. For these reasons, the Court held that “the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law” [para. 94].

Fourth, in relation to proportionality, the Court held that “strictly speaking, it is unnecessary for this Court to consider Ground 2 in this appeal, which relates to the question of proportionality, since, if (as we have held) the interference with the Appellant’s Article 8 rights was not in accordance with the law, one never reaches the stage of asking whether that interference was proportionate” [para. 131]. In any event, the Court held that the use on two occasions of Automatic Facial Recognition Locate technology by the police did not constitute a disproportionate interference with the privacy of Mr. Edward Bridges and rejected this argument of the petitioner.

Fifth, the Court examined whether the use of Automatic Facial Recognition Locate technology constituted a violation of the Data Protection Act. The Court held that the Data Protection Act requires a “data protection impact assessment” to be carried out before any policy involving the “processing of personal data” is implemented. The petitioner had indicated that the data protection impact assessment conducted by the South Wales Police did not address the possible interference of Automatic Facial Recognition Locate with Article 10 (freedom of expression) and Article 11 (freedom of assembly) of the European Convention on Human Rights.

The Court reiterated that the use of Automatic Facial Recognition Locate technology involves two impermissibly broad areas of discretion: the selection of the individuals included in the watch lists and the locations where it can be deployed. Under this premise, the Court held that “the inevitable consequence of those deficiencies is that, notwithstanding the attempt of the [data protection impact assessment] to grapple with the Article 8 issues, the [data protection impact assessment] failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found” [para. 153].

Sixth, the Court analyzed the scope of the Public Sector Equality Duty under section 149(1) of the Equality Act, which provides that United Kingdom authorities must eliminate discriminatory conduct and promote equal treatment of all persons. The Court noted that the protected characteristics include sex and race. The Court also held that, on the evidence provided, the South Wales Police was not in a position to assess the possible discriminatory impact of the Automatic Facial Recognition Locate technology. In this regard, the Court concluded that because Automatic Facial Recognition “is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias” [para. 201].

For all of the above reasons, the Court of Appeal Civil Division concluded that the use of Automatic Facial Recognition Locate technology on December 21, 2017, and March 27, 2018, and on an ongoing basis, violated the right to privacy under Article 8 of the ECHR. In addition, the Court held that the continued use of such technology violated the Data Protection Act and failed to comply with the Public Sector Equality Duty under the Equality Act.


Decision Direction


Decision Direction indicates whether the decision expands or contracts expression based on an analysis of the case.

Expands Expression

The judgment of the Court of Appeal Civil Division expands freedom of expression by overturning a judgment that had validated the South Wales Police’s use of Automatic Facial Recognition Locate technology, which captured citizens’ biometric data in violation of the right to privacy under Article 8 of the European Convention on Human Rights and had potential adverse and deterrent effects on freedom of expression and the right of assembly. The Court of Appeal Civil Division held that the legal framework regulating the use of Automatic Facial Recognition Locate technology allowed the South Wales Police excessive discretion as to “where” the technology could be deployed and “whom” it could be used to identify. The use of this type of surveillance and mass data collection technology without consent could have a disproportionate impact on the rights to freedom of expression and assembly because of its potential deterrent effect. By overturning the judgment of the Divisional Court of the Queen’s Bench Division, the Court of Appeal Civil Division protected the right to privacy of individuals and indirectly reinforced the rights to freedom of expression and assembly, as citizens will be able to express themselves freely without their biometric data being captured in the context of social protests, as occurred on the facts of this case.

Global Perspective


Global Perspective demonstrates how the court’s decision was influenced by standards from one or many regions.

Table of Authorities

Related International and/or regional laws

  • ECHR, art. 8
  • ECtHR, The Sunday Times v. United Kingdom, App. No. 6538/74 (1979)
  • ECtHR, Malone v. United Kingdom, App. No. 8691/79 (1984)
  • ECtHR, Silver and Others v. The United Kingdom, App. No(s). 5947/72, 6205/73, 7052/75, 7061/75, 7107/75, 7113/75, 7136/75 (1983)

Other national standards, law or jurisprudence

  • United Kingdom, Data Protection Act
  • United Kingdom, Equality Act, sec. 149(1) (Public Sector Equality Duty)

Case Significance


Case significance refers to how influential the case is and how its significance changes over time.

Decision (including concurring or dissenting opinions) establishes influential or persuasive precedent outside its jurisdiction.

Official Case Documents
