MPs and peers have warned the government that Russia, China, North Korea and other hostile states could attempt to interfere in the 4 July general election.
A joint parliamentary committee of MPs and peers has warned the prime minister, Rishi Sunak, that the general election is an “attractive target” for malicious actors trying to destabilise the UK.
The UK could face cyber-attacks, including ransomware targeting UK institutions, the spread of AI-generated disinformation, and targeted phishing attacks aimed at disrupting the election.
Margaret Beckett, Labour MP and chair of the influential parliamentary national security committee, warned Sunak in a letter on 24 May that hostile states such as Russia and China “can reach the British public far more easily than ever before”.
Technological developments mean that hostile actors “have the ability to influence the information landscape by creating harmful deepfake videos and audios” – which use artificially generated images and sounds to spread false information on the internet, she said.
A government spokesman acknowledged the threat from foreign states, saying: “Security is paramount and we are well prepared to ensure the integrity of the election with robust systems in place to protect against interference.”
The government has established a Joint Election Security Preparations Unit ahead of the 4 July election to provide a coordinated effort across Whitehall, the police and the security and intelligence agencies to tackle foreign interference.
Parliamentarians and election candidates have also been given access to an “enhanced cyber security offer” developed by GCHQ’s National Cyber Security Centre (NCSC), to protect against phishing attacks and foreign influence operations.
Deepfakes could influence voters
Beckett, chair of the Joint Committee on the National Security Strategy told Sunak: “clear public communication about the risks posed by mis- and disinformation and other forms of foreign interference is essential”.
“There is nothing new in hostile actors seeking to interfere in elections. Today however, these actors can reach the British public far more easily than before,” she added.
“As a consequence of technological advances, hostile actors – both foreign and domestic – have the ability to influence the information landscape by creating harmful deepfake videos and audios that rapidly spread,” she warned the prime minister.
These tactics have already been used to sow disinformation in the US, where deepfake audio of President Biden discouraged voters from going to the polls in the primaries.
In the UK an AI-generated deepfake audio of the London Mayor, Sadiq Khan, making inflammatory remarks, provoked a spike in online criticism of the mayor after it spread on the social media services TikTok and Instagram.
“Since 2019, there have been significant changes in the geopolitical situation,” said Beckett. “These changes have made the prospect of influencing political discourse in democracies ‘ever more attractive for state actors’”.
UK must prepare for interference in 4 July election
“Conflict in the Middle East and Russia’s renewed illegal invasion of Ukraine has contributed to a sense that the world is becoming more dangerous than at any time since the cold war,” she added.
“Despite public government statements on the threat from hostile foreign actors such as China, Russia, Iran and North Korea, it is not clear if members of the public fully understand how these threats will manifest and what this means for the UK, its democracy and for them as individuals.”
“The UK must be prepared for the possibility of foreign interference during the General Election that will take place on 4 July 2024,” she told the prime minister.
The letter points out that the UK’s National Cyber Security Centre reported in 2023 that “the UK government assesses that it is almost certain that Russian actors sought to interfere in the 2019 general election”.
The hacking group Star Blizzard, also known as Callisto, ColdRiver, Tag-53, TA446 and BlueCharlie, targeted high-profile individuals, including politicians in the UK, and was identified as a Russian FSB operation by Computer Weekly in 2023.
Victims include the former head of MI6, Richard Dearlove, whose emails were hacked and leaked from an encrypted email service in 2022, and left-wing freelance journalist Paul Mason, who has frequently criticised Putin’s war against Ukraine.
The group was also responsible for the leak of UK-US trade documents ahead of the 2019 general election, and in 2018 compromised the Institute of Statecraft, a UK think tank whose work included initiatives to defend democracy against disinformation.
A Chinese state-affiliated hacking group was responsible for infiltrating the IT systems of the UK Electoral Commission between 2021 and 2023, and for attempted phishing attacks on members of parliament.
The NCSC warned in 2023 that it is “no secret that Russia seeks to weaken and divide their adversaries by interfering in elections using mis- and disinformation, cyber-attacks, and other methods”.
Large language models will almost certainly be used to create fabricated content and hyper-realistic AI-created bots will make the spread of disinformation easier, it said.
Guidance needed as matter of priority
Beckett warns in the letter that there is nothing to stop a person with few technical skills using generative AI tools to create and disseminate convincing political disinformation.
She called on the Electoral Commission to issue guidance to the public “as a matter of priority” on how to spot deepfakes and other types of disinformation and misinformation now that the election is underway.
Delays in implementing provisions of the Online Safety Act mean that important protections against disinformation in an election will not be in force before the general election, she said.
Tech companies creating echo chambers
Computer Weekly previously reported Beckett’s concerns that big tech companies, including X, Snapchat, TikTok, Meta, YouTube and Microsoft, were not co-ordinating plans to tackle threats to the election.
The companies use recommendation algorithms that create harmful “echo chambers”, limiting the content and information users are likely to see and rely on to inform their judgements during an election.
It is yet to be seen whether Ofcom’s new enforcement powers under the Online Safety Act 2023 will ensure that social media companies follow their own policies on misinformation and disinformation and take down illegal content swiftly, Beckett told Number 10.
Urgent action needed to protect UK democracy
The letter urges Sunak to use the last few days of Parliament to bring Government, political parties and electoral and security agencies together to identify actions that can be taken to protect the UK against threats to the election.
The Home Office set up the Defending Democracy Taskforce in 2022, chaired by Security Minister Tom Tugendhat, to bring together all levels of government to tackle threats to elections.
A government spokesperson said: “Since its formation, the taskforce has established a new election security unit, rolled out an enhanced cyber security offer for MPs and peers, and announced £31m to protect our democratic processes and institutions.
“The National Security Act has additionally delivered a range of measures to strengthen the UK’s efforts to detect, deter and disrupt state threats,” the spokesperson added.