A Lords committee is calling on the government to make market competition in artificial intelligence (AI) “an explicit policy objective” while criticising its “inadequate and deteriorating” position on the use of copyrighted material in large language models (LLMs).
Following the release of a government response to the Communications and Digital Committee’s report on LLMs and generative AI (GenAI), committee chair Baroness Stowell has written to digital secretary Michelle Donelan thanking her for the engagement, while also warning about “significant areas where we believe the government needs to go beyond its current position”.
In particular, Stowell cited the government’s lack of action to uphold competition in AI markets and guard against regulatory capture in key public bodies, as well as its reluctance to take meaningful action to protect creatives’ copyright, as major problems.
Report and government response
Released in February 2024, the committee’s report warned about a lack of competition in the UK’s AI markets; the risks of regulatory capture in the Department for Science, Innovation and Technology (DSIT) and the AI Safety Institute (AISI); and the detrimental effects of allowing AI developers to run roughshod over copyright laws.
In a formal response published on 2 May 2024, the government said the Digital Markets, Competition and Consumers Bill (DMCC) will give the Competition and Markets Authority (CMA) the tools it needs to identify and address significant competition issues in a variety of digital markets, including AI, noting the regulator has already published its initial review into the competition implications of AI foundation models.
On regulatory capture, it added: “In line with DSIT’s conflicts of interest policy, AISI requires all individuals joining [its] Research Unit to declare any conflicts of interest. These conflicts are mitigated in line with the conflicts process agreed by the DSIT permanent secretary.”
While the government noted the AISI “is dedicated to building new infrastructure to conduct necessary testing and evaluations of advanced AI”, Politico revealed in April 2024 that the institute had not yet been able to carry out extensive pre-deployment testing of new models, despite agreements made with leading AI companies to open their models for this purpose at the AI Safety Summit in November 2023.
Regarding AI and intellectual property, the government said it was committed to ensuring the continuation of the UK’s “robust” copyright framework: “The basic position under copyright law is that making copies of protected material will infringe copyright unless it is licensed, or an exception applies. However, this is a complex and challenging area, and the interpretation of copyright law and its application to AI models is disputed, both in the UK and internationally.”
The government added it is actively engaging with the relevant “stakeholders to understand broader perspectives in relation to transparency about the purposes of web crawlers”, and reiterated the commitment made in its AI whitepaper to progress work on the transparency of AI models’ inputs and outputs.
While it noted there are several ongoing legal cases over the use of copyrighted material in training AI models, the government said “it would not be appropriate for the government to comment on ongoing court cases. These cases are for the courts to decide on and must be allowed to conclude independently”.
The government also reiterated its commitment not to legislate on AI until it has a full understanding of the evidence on risks and their potential mitigations.
Baroness Stowell letter
Published on the same day as the formal government response, Baroness Stowell’s letter provides details about the committee’s ongoing concerns with the UK’s approach to GenAI and LLMs.
Describing the government’s record on copyright as “inadequate and deteriorating”, Stowell said while the committee appreciates the technical and political complexities involved, “we are not persuaded the government is investing enough creativity, resources and senior political heft to address the problem”.
She added: “The contrast with other issues, notably AI safety, is stark. The government has allocated circa £400m to a new AI Safety Institute with high-level attention from the prime minister. On copyright, the government has set up and subsequently disbanded a failed series of roundtables led by the Intellectual Property Office. The commitment to ministerial engagement is helpful but the next steps have been left unclear. While well-intentioned, this is simply not enough.”
Stowell said the government’s response “declines to provide a clear view” of whether it supports applying copyright principles to LLMs, and whether it is prepared to bring forward legislation to settle the matter in law.
“Indeed, it suggests that the government does not wish to comment in order to avoid prejudicing the outcome of ongoing legal cases. This contention is misguided and unconvincing,” she wrote, adding that setting out an intention to address legal uncertainty would not breach any ‘sub judice’ conventions preventing MPs from commenting on ongoing court cases: “It is therefore difficult to escape the conclusion that the government is avoiding taking sides on a contentious topic.”
Stowell concluded that the government’s reluctance to take meaningful action amounts to a de facto endorsement of tech firms’ practices.
“That,” she said, “reflects poorly on this government’s commitment to British businesses, fair play and the equal application of the law. Copyright catalyses, protects and monetises innovation – as evidenced by the £100bn success of the UK’s creative industries. There is a major opportunity to establish a compelling legacy on supporting responsible AI. We urge you to take it.”
Regarding the issues of market competition and regulatory capture, Stowell said there is a clear trend towards consolidation at the cutting edge of AI markets, and that explicitly pro-competition policy objectives “should be embedded within the design and review process for new policies and standards, and subject to structured internal and external critique”.
She added: “We were disappointed that the government has not yet made a public commitment to strengthening governance measures to guard against regulatory capture. This needs to go beyond declaring interests.
“As we warned in our report, there is a clear trend towards greater reliance on external technical expertise to inform decisions on standards and policy frameworks. This will bring valuable industry engagement. But the unintended risks of entrenching incumbent advantages are real and growing.”
Stowell noted that even an unfounded perception of close relationships between AI policy and technology leaders risks lasting damage to public trust, and that the government should therefore make more explicit commitments around enhanced governance measures.
Commenting on Stowell’s letter, a DSIT spokesperson said: “The UK is a world leader in AI innovation and has a creative industries sector which generates more than £124bn a year. We are supporting artists and advocating an approach which allows them to work in partnership with AI innovators to harness the opportunities this technology provides, while engaging closely with relevant stakeholders on issues including copyright.
“We have already outlined a regulatory approach to AI earlier this year that will swiftly address global challenges, ensure safe advancement and encourage an open, competitive market in AI. That’s on top of delivering funding for AI from our record £20bn R&D budget in a fair and responsible way.”