More than half of the British public are worried about the sharing of biometric data, such as that used in facial recognition, between police and the private sector, according to research from the Alan Turing Institute (ATI), with many expressing concern that a lack of transparency will lead to abuses.
The research, conducted alongside the Centre for Emerging Technology and Security (CETaS), revealed that 57% of the UK public are uncomfortable with biometric data sharing schemes between police forces and the private sector to prevent crimes like shoplifting.
The ATI said that while some members of the public believed they would be more comfortable with the data sharing if appropriate transparency, oversight and accountability mechanisms were in place, others said they would only feel comfortable if data sharing were a one-way process from commercial entities to the police – and not the other way round.
Others said they were completely opposed to any data sharing, arguing it opened up too much risk of abuse and invasion of privacy.
However, the research also revealed that members of the public are much more likely to trust the use of biometric systems by public sector organisations, with 79% comfortable with their use by the police and 66% with their use by the NHS.
Beyond facial recognition, the research also examined a wider array of emerging biometric technologies, such as age estimation and emotion recognition systems.
It found that while respondents were generally more supportive of biometric systems used for identification, such as live facial recognition, they were more concerned by systems that classify people into groups, such as age estimation, or that infer behaviour, such as polygraphs and emotion recognition.
Sam Stockwell, lead author and research associate at the ATI, said: “Our research shows that people are marginally optimistic about the benefits of biometric systems for reducing crime, but there’s also a clear acknowledgement that those using them need to provide the general public with greater confidence that appropriate safeguards are in place.”
Conditional trust
Though the research’s survey sample is nationally representative, the authors acknowledged it is skewed towards demographic majority groups, and so does not necessarily capture the concerns of specific minority demographics.
The researchers also acknowledged the political implications of categorising individuals into demographic groups such as race and gender, as well as the problematic nature of inferring the emotions of neurodivergent individuals.
Among public sector organisations, trust in the use of biometric systems varied depending on the purpose. While 85% of people were comfortable with police using facial recognition systems to verify identities at the UK border, just over 60% felt comfortable with these systems being used to identify criminal suspects in a crowd.
Inferential systems used by the police saw a further fall in trust, with fewer than a third of respondents comfortable with the police using biometric data, through means such as polygraphs, to determine whether someone might be telling the truth.
Moreover, trust also varied between regions in the UK – only 28% of respondents in Scotland and 11% in Northern Ireland were comfortable with the police sharing information with the private sector, compared with 36% in England and 48% in Wales. The study noted that attitudes towards the police may impact these figures, but that the findings nevertheless highlight that public attitudes to biometrics vary between nations.
In most cases, those opposed to biometric systems called for explicit regulation rather than an outright ban. However, more than half of respondents believed that the use of biometric systems to assess performance in job interviews (63%) or to track student and employee engagement (60%) should be banned.
Overall, responses to biometrics were positive – more than half of respondents (53%) believed the benefits of biometrics would outweigh the concerns, while nearly a quarter (24%) believed the opposite.
Tim Watson, science and innovation director for defence and national security at the ATI, said: “There’s a growing demand to find new ways to protect our personal data due to increasingly sophisticated cyber security threats and identity fraud techniques, and biometrics is likely to play a crucial role.
“We hope that this research will help policymakers to understand where the gaps are and plan accordingly.”
Legal responsibilities
The research comes just two months after a House of Lords committee questioned the legality of live facial recognition (LFR) technology being used by the UK police without proper scrutiny or accountability.
Writing to the home secretary on 27 January 2024, the Lords Justice and Home Affairs Committee (JHAC) revealed the findings of its investigation into the use of LFR by UK police, noting the lack of rigorous standards or systems of regulation in place to control police use of the technology.
Calls for new legal frameworks to govern law enforcement’s use of biometrics have come from both Parliament and civil society, including the UK’s former biometrics commissioner, Paul Wiles; an independent legal review by Matthew Ryder QC; the UK’s Equality and Human Rights Commission; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as far back as July 2019.
In an exclusive interview with Computer Weekly, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, also highlighted a number of issues with how UK police had approached deploying their facial recognition capabilities, and warned that the future oversight of police tech is at risk as a result of the government’s proposed data reforms.