UK biometric surveillance exists in ‘legal grey area’

The UK’s patchwork approach to regulating biometric surveillance technologies is “inadequate”, placing fundamental rights at risk and ultimately undermining public trust, says the Ada Lovelace Institute (ALI).

As UK public and private organisations rapidly expand their use of various biometric surveillance technologies, an analysis by the ALI has found that “significant gaps and fragmentation” in the existing governance frameworks for these tools means people’s rights are not being protected.

While the Ada Lovelace Institute’s analysis focused primarily on deficiencies in UK policing’s use of live facial recognition (LFR) technology – which it identified as the most prominent and highly governed biometric surveillance use case – it noted there is a need for legal clarity and effective governance for “biometric mass surveillance technologies” across the board.

This includes other forms of biometrics, such as fingerprints for cashless payments in schools, or systems that claim to remotely infer people’s emotions or truthfulness, as well as other deployment scenarios, such as when supermarkets use LFR to identify shoplifters or age verification systems for alcohol purchases.

The urgent need for comprehensive regulation also extends to other forms of facial recognition, including retrospective or operator-initiated, which have so far received less public attention and regulatory oversight than LFR despite their “active use” by UK police.

“Our legal analysis shows that the UK’s current ‘diffused’ governance model falls short of ensuring the proportional, accountable and ethical use of biometrics. Police use of LFR is the most highly governed biometrics application in the UK and yet does not meet the bar of a ‘sufficient legal framework’,” said ALI, adding that the clear inadequacy of the current approach means other use cases, particularly on the private sector side, have even less effective governance and oversight.

“The range of impacts presented by this growing use cannot be managed effectively without a centralised governance model: a clear, coherent legal framework and a dedicated, independent regulatory body, responsible for managing current technologies and anticipating future developments.

“Importantly, this framework must be comprehensive across use cases – covering police use, private sector surveillance and inferential biometric systems – to meaningfully mitigate risks around privacy and discrimination, and ensure accountability.”

The ALI said a piecemeal approach has developed in the wake of Bridges v South Wales Police in August 2020 – the UK’s only case law on LFR to date – in which the Court of Appeal ruled the force’s use of the technology to be unlawful. The court specifically highlighted “fundamental deficiencies” in the legal framework, noting that “too much discretion” was given to individual officers, and that criteria about where to deploy and who to include in watchlists were unclear.

The ALI said while “a fragmented patchwork of voluntary guidance, principles, standards and other frameworks has been developed by regulators, policing bodies and government” to control police LFR use since the ruling, the documents taken together largely fail to address the concerns of the court.

Specifically, the ALI highlighted an ongoing lack of clarity around the necessity and proportionality of any given LFR deployment, as well as a lack of clear criteria for developing and deploying watchlists.

The ALI added that although “high-level police guidance and local police policies” address different concerns, the “diffused” governance model has created gaps that are not being filled. “Responsibility for developing and implementing each policy or piece of guidance is delegated across a range of relevant actors, including regulators, policing bodies and central government, and no formal incentives to monitor or report on implementation and compliance exist,” it said.

Widening the discussion

However, the ALI was clear that while much of the public discussion has so far focused on police LFR, the risks posed by biometric surveillance technologies are not limited to a single sector or use case, and instead arise from the combined effects of increasingly widespread deployment, opaque processes, and the growing use of sensitive data to make inferences, predictions and decisions about people by a range of actors.

“Only a comprehensive approach can close loopholes, keep pace with technological developments, and safeguard public trust in how biometric systems are used,” it said, specifically noting that any biometrics regulation limited to police LFR alone could create “perverse incentives for outsourcing law enforcement responsibilities” to companies with fewer governance obligations and incentives to uphold rights.

“There is no specific law providing a clear basis for the use of live facial recognition and other biometric technologies that otherwise pose risks to people and society. In theory, guidance, principles and standards could help govern their use, but our analysis shows that the UK’s ad hoc and piecemeal approach is not meeting the bar set by the courts, with implications for both those subject to the technology, and those looking to use it,” said Nuala Polo, UK public policy lead at the ALI.

She added that while police and other organisations claim their deployments are lawful under existing duties, these claims are almost impossible to assess without retrospective court cases.

“It is not credible to say that there is a sufficient legal framework in place. This means the rapid roll-out of these technologies exists in a legal grey area, undermining accountability, transparency and public trust – while inhibiting deployers alive to the risks from understanding how they can deploy safely,” said Polo. “Legislation is urgently needed to establish clear rules for deployers and meaningful safeguards for people and society.”

To alleviate the issues identified, the ALI outlined a number of policy recommendations for the UK government, including risk-based legislation analogous to the European Union’s Artificial Intelligence Act, which would tier legal obligations depending on the risk level associated with a given biometric system, and specify new safeguards in law, such as transparency requirements and mandated technical standards.

It also recommended adopting a definition of biometrics as “data relating to the physical, physiological or behavioural characteristics of a natural person” so that both existing and new forms of biometric surveillance are captured, and establishing an independent regulatory body to oversee and enforce any new legal framework.

“Task the independent regulatory body, in collaboration with policing bodies, to co-develop a code of practice that outlines deployment criteria for police use of facial recognition technologies, including live, operator-initiated and retrospective use,” it said. “This code of practice should address the ambiguity and subjectivity of criteria within current guidance and ensure consistency across police deployments.”

Computer Weekly contacted the Home Office about the ALI report, as well as the repeated calls for biometric regulation from Parliament and civil society groups over the years.

“Facial recognition is an important tool in modern policing that can identify offenders more quickly and accurately, with many serious criminals having been brought to justice through its use,” said a Home Office spokesperson.

“All police forces using this technology are required to comply with existing legislation on how and where it is used, and we will continue to engage publicly about the use of this technology and the safeguards that accompany it. We will set out our plans for the future use of facial recognition technology, alongside broader policing reforms, in the coming months.”

Previous calls for biometric regulation

Both Parliament and civil society have made repeated calls for new legal frameworks to govern UK law enforcement’s use of biometrics.

This includes three separate inquiries by the Lords Justice and Home Affairs Committee (JHAC) into shoplifting, police algorithms and police facial recognition; two of the UK’s former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; the UK’s Equalities and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.

However, while most of these focused purely on police biometrics, the Ryder review in particular also took into account private sector uses of biometric data and technologies, such as in public-private partnerships and for workplace monitoring.

During his time in office before resigning in October 2023, Sampson also highlighted a lack of clarity about the scale and extent of public space surveillance, as well as concerns around the general “culture of retention” in UK policing around biometric data.

Throughout the JHAC inquiry – which described the police use of algorithmic technologies as a “new Wild West” characterised by a lack of strategy, accountability and transparency from the top down – Lords heard from expert witnesses that UK police are introducing new technologies with very little scrutiny or training, continuing to deploy them without clear evidence about their efficacy or impacts, and have conflicting interests with their own tech suppliers.

In a short follow-up inquiry, this time looking exclusively at facial recognition, the JHAC found that police are expanding their use of LFR without proper scrutiny or accountability, despite lacking a clear legal basis for their deployments. The committee also specifically called into question whether LFR is even legal.

The committee added that, looking to the future, there is a real possibility of networked facial recognition cameras capable of trawling entire regions of the UK being introduced, and that there is nothing in place to regulate this potential development.

Despite myriad calls for a new legislative framework from different quarters, government ministers have claimed on multiple occasions that there is a sound legal basis for LFR in the UK, and that “a comprehensive network of checks and balances” already exists.
