Facial recognition test case raises serious equality and engagement issues

Trial deployments of facial recognition technology by South Wales Police have been ruled unlawful by the Court of Appeal. Consultation Institute Associate, Anna Collins, explores the role that public opinion could play in smoothing the way for future use of the technology.

Much has been written about the pros and cons of automated facial recognition (AFR) technology and the use of biometrics in preventing and detecting crime. Society is divided about its relative merits, depending upon personal values and circumstances. Much of this division is created through lack of education and awareness about how the systems and data sets work, and as a consequence, there is suspicion and lack of trust in policing.

This was demonstrated recently in a case heard by the Court of Appeal between Edward Bridges, a civil liberties campaigner from Cardiff, and South Wales Police (SWP). It was a complex hearing which considered Article 8 of the European Convention on Human Rights and the Public Sector Equality Duty (PSED) prescribed in the Equality Act 2010. The interesting element, from the point of view of the Consultation Institute, is the extent to which communities are involved in policymaking and how they can influence public perceptions.

At issue was the use of live, automated facial recognition technology by SWP in an ongoing trial of a system called AFR Locate. The force is the national lead on testing and trialling AFR technologies. Did it act lawfully?

AFR Locate had been used quite openly in Cardiff rather than as a form of covert surveillance, and local awareness-raising communication campaigns had been put in place. Even so, on the two dates in question when the system was deployed in Cardiff, the claimant maintained that he was ‘caught on camera’ and that this was in breach of the Public Sector Equality Duty, as set out in section 149(1) of the Equality Act 2010.

The two protected characteristics relevant in this case are race and sex. It was submitted that SWP was in breach of the PSED because it did not have ‘due regard’ to the need to eliminate discrimination on those two grounds. The claimant cited evidence that facial recognition software can be biased, creating a greater risk of false identification for people from black, Asian and other minority ethnic (BAME) backgrounds and for women. It is essential to be clear that it was not alleged that the software used by SWP has that effect.

Judgments

In this case, Lord Justice Singh found that the use of live, automated facial recognition technology, which engaged Article 8(1) of the European Convention on Human Rights, was not in accordance with the law for the purposes of Article 8(2). Further, the force’s Data Protection Impact Assessment did not comply with section 64(3)(b) and (c) of the Data Protection Act 2018.

He also found that the respondent did not comply with the Public Sector Equality Duty (under section 149 of the Equality Act 2010) prior to, or in the course of, the technology’s use on 21 December 2017 and 27 March 2018, and on an ongoing basis, because there was insufficient evidence that SWP had had ‘due regard’ to these issues.

The judgment has important implications: police forces must do more to consciously consider the potential impact on protected groups. The pivotal sentence is in paragraph 197: “The fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on the grounds of race or sex.” The judge stressed that this requirement is, as much as anything, designed to create public confidence.

It is well understood that a best practice Equality Impact Assessment is dependent on the context. It requires the assessor to take reasonable steps to make enquiries about what may not yet be known. The potential impact or unintended consequences of a proposed decision or policy on people with the relevant characteristics should be considered.

The Divisional Court had already found that there was no evidence to think that the technology used in this case had any bias on racial or gender grounds. But the point of the process is to ensure that all relevant information is taken into account, including public opinion. Fostering good relations is, after all, a duty within the Act.

The submission maintained that the initial Equality Impact Assessment dated 13 April 2017 was erroneous in law because consideration was given only to the possibility that AFR might be directly discriminatory and no consideration was given as to whether it might operate in an indirectly discriminatory manner.

The use of the word “initial” did not mean that the assessment was only provisional or that there would be a further or fuller assessment in due course. It meant only that the initial assessment had not led to any concerns which were thought to require further investigation. The focus was on the point that the PSED is a continuing duty.

What should Best Practice look like?

What is interesting about this case and its outcome is the role of the public voice in the Public Sector Equality Duty and the role of the Police and Crime Commissioner in holding the police to account.

An assessment of the resulting media reports and associated website content does not reveal any public involvement in setting up the trial. Almost certainly, the force could have been more transparent about any gathering of public opinion before the trial was introduced.

A good stakeholder mapping exercise would have revealed local and national campaign groups. Liberty, Big Brother Watch and others would likely have fallen in the top right-hand quadrant of the assessment, as having high levels of interest and influence, and early conversations would have been held with them to hear their views and seek to allay their concerns.

Moreover, the police force’s Independent Advisory Group should have been engaged about any potential impact or unintended consequences. A good engagement team would have worked with them to understand and mitigate any risks. Similarly, the local community and voluntary sector’s opinion should have been sought, particularly with sex, gender and BAME support groups.

The Consultation Institute suggests that more could be done to invest in forums, panels and focus groups to understand people’s fears and concerns so that they can be mitigated. This would help equality impact assessments reflect which applications of the technology are most and least welcome, and inform communication strategies to address those concerns.

The extent to which data modelling contains bias is a real question, and much could be learned from a long-term opinion-tracking panel survey to measure and monitor local acceptance of the technology.

The good news is that this has been acknowledged by South Wales Police. In a statement on its website, Chief Constable Matt Jukes said: “Questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching of [sic] our duties around equality. In 2019 we commissioned an academic analysis of this question, and although the current pandemic has disrupted its progress, this work has restarted and will inform our response to the Court of Appeal’s conclusions.”

The role of the Police & Crime Commissioner

This prompts the question of whether there is a role for the PCC in making sure that police forces give sufficient ‘due regard’ to the public sector equality duty in the formation of policies such as this. It is, of course, possible that the PCC in this case had significant involvement; we cannot be sure. However, there is little evidence that the impact on protected groups was considered at this level and, as we know, this is a duty which cannot be delegated, as firmly stated by McCombe LJ in the well-known case of R (Bracking) v Secretary of State for Work and Pensions in 2013.

In this most recent case, the judge stated:

“The PSED is a duty of process and not outcome. This is for at least two reasons. First, good processes are more likely to lead to better informed, and therefore better, decisions. Secondly, whatever the outcome, good processes help to make public authorities accountable to the public. We would add, in the particular context of the PSED, that the duty helps to reassure members of the public, whatever their race or sex, that their interests have been properly taken into account before policies are formulated or brought into effect. That background is to be found in the Stephen Lawrence Inquiry Report in 1999.”

This is why Police and Crime Commissioners must be involved in issues of such potential sensitivity.

Police forces may argue that the use of biometrics is an operational matter, but if PCCs truly are the voice of local communities, holding Chief Constables to account on their behalf, shouldn’t they take a more proactive lead in creating and cultivating a more favourable climate of opinion between policing and the public?

PCCs’ legitimacy and public profile are at risk if they are not visibly engaging with communities. So the Consultation Institute would be interested to learn what South Wales Police and other forces are doing to measure and monitor levels of public support, and how those opinions will be used to inform ongoing Equality Impact Assessments in their decision-making. Please get in touch here.

The Court of Appeal’s full judgment can be read at:

https://www.judiciary.uk/wp-content/uploads/2020/08/R-Bridges-v-CC-South-Wales-ors-Judgment.pdf

This article was written by Institute Associate Anna Collins and Institute Director &amp; Associate Rhion Jones.