Police Scotland receive formal notice about cloud system


The Scottish biometrics commissioner has served Police Scotland with an information notice, requiring the force to demonstrate that its deployment of a cloud-based digital evidence system complies with the UK’s law enforcement-specific data protection rules.

At the start of April 2023, Computer Weekly revealed that the Scottish government’s Digital Evidence Sharing Capability (DESC) service – contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure – is being piloted despite major data protection concerns raised by watchdogs that the use of Azure “would not be legal”.

According to a Data Protection Impact Assessment (DPIA) by the Scottish Police Authority (SPA) – which notes the system will be processing genetic and biometric information – the risks to data subjects’ rights include US government access via the Cloud Act, which effectively gives the US government access to any data held in the cloud by US corporations, wherever it is stored; Microsoft’s use of generic, rather than specific, contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.

There is also a concern that transferring personal data to the US, a jurisdiction with demonstrably lower data protection standards, could in turn negatively impact people’s data rights to rectification, to erasure and to not be subject to automated decision-making.

While the SPA DPIA judged the risk of US government access via the Cloud Act to be “unlikely”, it warned that “the fallout would be cataclysmic”.

Off the back of Computer Weekly’s reporting on the DESC service, Scottish biometrics commissioner Brian Plastow served Police Scotland (the lead data controller for the system) with an information notice on 22 April 2023, which gives the force until mid-June to provide information about its data protection compliance.

The information notice itself directly references Computer Weekly’s DESC coverage. “I am now sufficiently concerned about the potential implications of DESC that in accordance with the provisions of section 16 of the Scottish Biometrics Commissioner Act 2020, I must now require Police Scotland to provide me with information so that I can determine whether Police Scotland are complying with the data protection elements of my statutory Code of Practice,” he wrote in the formal notice.

Plastow also outlined specific information he would like to receive, including whether biometric data transfers have taken place; what types have been transferred; in what volumes; and which country the data is being hosted in.

“If biometric data has been exchanged as part of DESC, please confirm whether Police Scotland is complying fully with Part 3 of the UK Data Protection Act 2018 relevant to law enforcement processing, and with Principle 10 of the Scottish Biometrics Commissioner’s Code of Practice,” he said, referring to a statutory code which took effect in Scotland on 16 November 2022 following approval by the Scottish government.

Principle 10 of the code specifically relates to the promotion of privacy-enhancing technologies, and notes that the way in which biometric data is acquired, retained, used and destroyed must ensure the data is protected from unauthorised access or disclosure.

“To ensure compliance with the Code of Practice, Police Scotland needs to demonstrate that any use of hyperscale cloud infrastructure which involves biometric data is compliant with law enforcement-specific data protection rules,” said Plastow. “The best way to achieve this would be to have a hosting platform that is entirely located in the UK, and which meets all the requirements of Part 3 of the Data Protection Act 2018 on processing for law enforcement purposes.

“If this is not the case with DESC, then to ensure that public confidence and trust is maintained, Police Scotland needs to explain to citizens what the use of the cloud means for their personal data. This means being open with citizens about what country their data will be stored in and, if the answer to that question is not the UK, to explain the obvious risks of that extremely sensitive data then being accessed either judicially or maliciously.”

Responding to the notice, a Police Scotland spokesperson said: “Police Scotland takes data management and security very seriously, and is working alongside criminal justice partners to ensure robust, effective and secure processes are in place to support the development of the DESC system.

“All digital evidence on the DESC system in Dundee is held securely and is only accessible to approved personnel, such as police officers, [Crown Office and Procurator Fiscal Service] COPFS and defence agents. Access to this information is fully audited and monitored, and processes are in place to ensure any data risks are quickly identified, assessed and mitigated. We will continue to engage with the Biometrics Commissioner to provide the required assurance regarding data protection and security as the pilot in Dundee progresses.”

Lack of regulatory approval

Under the notice, Plastow is also seeking information on what discussion took place with the Information Commissioner’s Office (ICO) on questions of international transfers and digital sovereignty, and for Police Scotland to confirm whether all the issues were resolved to the ICO’s satisfaction.

Computer Weekly previously asked the ICO about the prevalence of US cloud providers throughout the UK criminal justice sector, and whether their use is compatible with UK data protection rules, as part of its coverage of the DESC system. The ICO press office was unable to answer, and referred Computer Weekly’s questions to the FOI team for further responses.

On 24 April, the ICO FOI team responded that while it has obtained legal advice on the issue, the matter is ongoing and it has not yet reached a formal position. The advice itself was withheld, however, as it is subject to legal professional privilege.

The ICO also confirmed it has “never given formal regulatory approval for the use of these systems in a law enforcement context”.

However, the SPA’s correspondence with the ICO – also disclosed under FOI – revealed the regulator largely agreed with its assessments of the risks, noting that technical support from the US or US government access via the Cloud Act would constitute an international data transfer.

“These transfers would be unlikely to meet the conditions for a compliant transfer,” it said. “To avoid a potential infringement of data protection law, we strongly recommend ensuring that personal data remains in the UK by seeking out UK-based tech support.”

Prior consultation

In separate correspondence with Police Scotland (again disclosed under FOI), the ICO noted: “If you have a remaining residual high risk in your DPIA that cannot be mitigated, prior consultation with the ICO is required under section 65 DPA 2018. You cannot go ahead with the processing until you have consulted us.”

While Plastow welcomed the strategic objectives of DESC to digitally transform how the Scottish justice system manages evidence, he confirmed that his office was not engaged by either the Scottish government or Police Scotland until a meeting held on 29 November 2022.

At this meeting – which Plastow himself requested after becoming aware that biometric data might be shared through the system – the commissioner’s professional advisory group sought assurances from Police Scotland on questions of data security and data sovereignty.

After a presentation from the force, members of the advisory group requested that the slides regarding DESC be circulated afterwards. However, the superintendent delivering the presentation indicated that he would need to consider this request, as some of the slides might contain commercially sensitive information. The slide pack “was never received”.

A UK-wide issue

The release of the SPA DPIA also brings into question the lawfulness of cloud deployments by policing and criminal justice bodies throughout England and Wales, as a range of other DPIAs seen by Computer Weekly do not assess the risks outlined by the SPA around US cloud providers, despite being governed by the same data protection rules.

In December 2020, for example, a Computer Weekly investigation revealed that UK police forces were unlawfully processing more than one million people’s personal data – including biometrics – on the hyperscale public cloud service Microsoft 365, after failing to comply with key contractual and processing requirements within Part 3 of the Data Protection Act 2018, such as restrictions placed on international transfers.

In particular, the DPIAs disclosed to Computer Weekly via Freedom of Information requests showed that the risks of sending sensitive personal data to a US-based company, which is subject to the US government’s intrusive surveillance regime, were not properly considered. 

Other uses of US cloud providers throughout the UK criminal justice sector include the integration of the Ident1 fingerprint database with Amazon Web Services (AWS) under the Police Digital Service’s (PDS) Xchange cloud platform; and HM Courts and Tribunals Service’s cloud video platform, which is partly hosted on Azure and processes biometric information in the form of audio and video recordings of court proceedings.

In mid-April 2023, the biometrics commissioner for England and Wales, Fraser Sampson, told Computer Weekly that UK policing and justice bodies must be able to prove that their increasing use of public cloud infrastructure is compliant with law enforcement-specific data protection rules.

Speaking specifically about the use of hyperscale public cloud providers to store and process sensitive biometric data, Sampson said the “burden of proof is on police as [data] controllers, not just to provide the information and assurances, but also to demonstrate that their processing complies with all the relevant [data protection] requirements”. He added that the burden of proof was not just a matter of law, but of governance, accountability and building public trust in how the police are using new technologies.

During an appearance before Parliament’s Joint Committee on Human Rights in February 2023, Sampson noted there was a “non-deletion culture” in UK policing when it came to the retention of biometric information.


