Appeal judges last month ruled that the use of automatic facial recognition technology by South Wales Police was in breach of the European Convention on Human Rights (here). Overturning the ruling of the High Court in September 2019, the Court of Appeal found that the legal controls around the use of the technology allowed for too much individual discretion for police officers, particularly around who it could be used to search for and where it could be deployed.
Automatic facial recognition technology (AFR) is a controversial technology that scans the faces of all passers-by, detects and analyses their biometric data, and then compares it with a ‘watchlist’ prepared by police. The vast majority of these people will not be suspected of any wrongdoing whatsoever. When a match is identified, an officer reviews it and decides whether to take action. When no match is identified, police say the data is deleted.
The technology has been met with widespread concern and opposition, not least from the Parliamentary Committee on Science and Technology (see here), as well as civil liberties groups such as Liberty and Big Brother Watch. It is an interference with our general freedom to go about our lawful private business the likes of which has not been seen before. As with all mass privacy-threatening powers and technologies, it has a damaging, chilling effect on our freedom to speak freely in protest and in displays of solidarity. It effectively turns a public space into an automated police line-up, and us passers-by into walking ID cards, liable to be stopped and asked to prove our innocence.
The claim was brought by campaigner Ed Bridges, who has said he is ‘delighted that the Court has agreed that facial recognition clearly threatens our rights’. Unfortunately, however, the Court of Appeal did not quite say this. In fact, it found that the interference with Bridges’ privacy rights was small and far outweighed by the perceived benefit to the public of AFR’s use in policing.
Instead, the appeal judges found the use of AFR unlawful only in a very limited way: they found the policy around its use by SWP lacking, but effectively green-lighted the use of the underlying technology with a few tweaks to procedure. Much has been made of the finding that South Wales Police was in breach of equality law, but here again it is a matter of form over substance; no actual discrimination was found, just a failure to include the potential for it in impact assessments.
Crucially, the court had no qualms about the indiscriminate collection of the sensitive biometric data of hundreds of thousands of people going about their lawful daily business, often without their consent or knowledge. It took no issue with the controversial deployment of AFR at a peaceful anti-arms demonstration, scanning the faces of all those exercising their right to protest in search of a small number of people who had been arrested the previous year (arrests which resulted in only five convictions). This was unprecedented, intrusive mass surveillance at a lawful protest, deployed primarily to find a number of people countable on one hand. The proportionality of the exercise was justified by the inclusion, for good measure, of all those on warrants or wanted for crimes in the area; the larger watchlist was seen to vindicate the sledgehammer approach to policing the nut of lawful protest. No explanation was given as to why these others were expected to be in attendance. No arrests were made.
The police and the Home Office will see this judgment as a victory. As quoted in a Forbes article, Ray Walsh of ProPrivacy explains that ‘the court’s finding is likely to result in stricter guidelines and policies that ultimately give the police what they need to continue scanning the general public’. They and other police forces can simply go away, take out the offending few words, include a requirement that issues of possible discrimination be considered in earnest, and then it’s pretty much business as usual. South Wales’ Chief Constable Matt Jukes said it all:
‘The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development. I am confident this is a judgment that we can work with… The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require attention.’
This is worrying when you look at what police forces and private companies have had planned for AFR for a while. The Metropolitan Police have already used it at Notting Hill Carnival in 2017, indiscriminately scanning unknown swathes of innocent revellers for a single supposed match, and even at a Remembrance Sunday commemoration. The Met will no doubt have big plans for AFR following this judgment, which approves the technology in principle. Lincolnshire Police recently grabbed headlines with plans to roll out AFR that records not only your biometric data as you walk down the street, but your mood as well.
This is nothing new in the trend of Article 8 (right to privacy and family life) cases. In 2018, sections of the media rejoiced at the striking down of GCHQ’s bulk data collection practices by the Strasbourg court, declaring it a victory for privacy and freedom of speech. But again, the case was ‘won’ only on the grounds that the legal and policy framework lacked specificity. The Strasbourg court in that case gave a ringing endorsement to the efficacy and proportionality, in principle, of collecting this inordinate mass of metadata. While the case was still in court, the law in question (the Regulation of Investigatory Powers Act, or RIPA) was superseded by the Investigatory Powers Act, and GCHQ could carry on employing these techniques, albeit with certain extra limits in place.
This problem embodies the limitations of Article 8 in the age of mass indiscriminate digital surveillance. The Article is too focused on the individual, and not on the crucial overarching rights to privacy and freedom of speech of the people as a whole. These are general freedoms that only have true meaning if all can enjoy them.
Without going into too much detail, it’s useful to understand what protections Article 8 does and doesn’t provide, at least as interpreted by the courts. There are effectively two main requirements that the state has to comply with if it wants to lawfully interfere with our privacy rights:
Firstly, the state has to show that its interference is ‘in accordance with the law’. This means that there has to be some legal basis for what it is doing, and that the powers granted have sufficient controls. This is important, as it should operate to ensure that there are limits on state powers. It is on this ground that these surveillance techniques were found unlawful; the courts found the law around them was not limiting enough. In the case of AFR, the court found that SWP needed to be more specific about who it is looking for and where it will be looking; in particular, that looking for ‘persons wanted on suspicion for an offence, wanted on warrant, vulnerable persons and other persons where intelligence is required’ was unacceptably broad. This is not insignificant, as it means that SWP cannot, in theory, put whoever it wants on a watchlist. However, it says nothing about the indiscriminate scanning of our biometric data and the threat to privacy rights in general, and it is easily rectified without any real obstacle to the rolling out of AFR technology.
The second requirement is supposed to be more substantive. It says that the interference has to be proportionate to a legitimate aim of the state. In other words, the apparent benefit to the people has to outweigh the interference with the rights of the individual whose privacy has been invaded. Herein lies the problem. Because the ECHR is drafted and interpreted around the idea of the rights of the particular individual victim bringing the claim, the interference appears dwarfed by the public benefits claimed by the police, as happened in this case.
The Court of Appeal rejected the argument that this particular interference needed to be seen in the light of all the others: SWP’s scanning and processing of the biometric data of thousands upon thousands of people going about their business, shopping, walking, meeting friends or exercising their right to protest, and the potential ramifications for the public nationwide. It dismissed this argument on the ground that proportionality was not a mathematical exercise; that the interference with Mr Bridges’ right could not be multiplied by the others who were not involved in the case. But how is this to be reconciled with the position concerning the anti-arms trade protest above? If adding general wanted persons unlikely to be present to the tiny watchlist for a peaceful demonstration tipped the scales there, why does the mathematical approach not apply here?
This is a perfect example of the law being too slow to react. The past 20 years have presented an unprecedented threat to the privacy rights of individuals in the United Kingdom, leading to our being dubbed the ‘most surveilled democracy in the world’. Much of this has come not in the form of directed intelligence gathering about particular suspects, but in the indiscriminate collection of our images and data without our knowledge or consent. The ECHR was not designed for this. Article 8 was drafted at a time when this kind of technology simply did not exist, and it is not up to the job of protecting our privacy rights against it. If the overall effect of the surveillance tool and the sheer numbers involved are not taken into account, then it offers no protection whatsoever against the greatest threat to privacy and freedom of expression we have ever faced.
It is understandable that Bridges and Liberty would consider the overturning of the controversial High Court judgment cause for celebration. All the above should be read as a criticism of the legal tools available to them, not of their significant efforts and principles. They too undoubtedly recognise that this is no substitute for greater legal protection from such invasions of privacy. They would likely agree that the only way to protect our privacy is to legislate against the use of these technologies. In the words of Liberty lawyer Megan Goulding:
‘It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.’
It is not enough to strike down particular uses because the guidelines are too vague, effectively handing the state a road map showing how it can employ these technologies against us without further challenge. There needs to be a legally enforceable statutory duty on the state to protect privacy rights as a whole, and not just a remedy for some individuals in certain cases where the court deems the effect on that one person specifically to be sufficient. A human right has to be general, all-pervasive. It has to consider the right in and of itself to have any real meaning at all.
In the 2018 case, the European Court summed up the proportionality problem perfectly in saying that ‘it would be wrong automatically to assume that bulk interception constitutes a greater intrusion into the private life of an individual than targeted interception, which by its very nature is more likely to result in the acquisition and examination of a large volume of his or her communications’. This is missing the point entirely.