UK government seeks public views on impacts of AI-generated porn


The UK government is seeking views on the impact of emerging technologies such as artificial intelligence (AI) and virtual reality on the adult entertainment industry as part of its Pornography Review.

Officially launched in July 2023, the review is concerned with the overall prevalence and impact of child sexual abuse material and other illegal pornographic content online. While the review is intended to be a wide-ranging examination of the porn industry, it will also explore the issue of AI being used to generate sexually explicit images of people without their consent.

The government has said the review will build on the work of the Online Safety Act, which requires online services to establish the age of their users via digital age verification and estimation tools to prevent children from accessing legal pornographic content.

As part of the review, the government is now calling on pornographic content creators, parents and law enforcement bodies to provide input, which will then be used to develop recommendations for government action.

Respondents to the call for evidence will be asked for their thoughts on how AI and virtual reality have changed how pornography is made, accessed and viewed, as well as how these changes impact viewers and the wider adult entertainment industry.

“Throughout this review, it is essential we engage directly with those who are most involved in the pornography industry and accurately establish what the general public thinks of the current rules governing pornography,” said review lead Gabrielle Bertin. “We want to hear from a wide range of views, whether it be a worried parent, those enforcing the laws to stop exploitation or someone directing or performing in pornography themselves, to speak up and support our review.”

She added that the review would help “future proof the law” in the face of rapid technological change. The call for evidence will run until 7 March 2024.

In October 2023, Wired reported on independent research showing that the number of non-consensual AI-generated videos and images on the 35 most popular deepfake porn sites had increased by 54% in the first nine months of 2023 compared with the whole of 2022.

In 2019, separate research by AI firm Deeptrace found that non-consensual pornography accounted for 96% of all deepfake videos online.

Speaking with the BBC, Sonia Livingstone, director of the London School of Economics’ Digital Futures for Children research centre, highlighted the lack of emphasis on academic research in the review, and said it should look at the evidence researchers had already collated. She argued the review must also examine the porn industry’s business models and use of algorithms to push content, but added “it’s not clear what is being proposed here that’s new”.

Under the Online Safety Act, services are already required to identify and remove illegal content, which includes child sexual abuse material and revenge porn.

In the government’s consultation document, the questions listed for respondents include whether there are any gaps in the regulation of online pornography, and whether current regulations on porn are effective.

“As the way we consume media and access content rapidly changes, the review will investigate gaps in UK regulation which allow exploitation and abuse to take place online, as well as identifying barriers to enforcing criminal law,” said the review consultation.

“While the criminal law has been updated in recent years to tackle the presence of extreme and revenge pornography, there are currently different regimes that address the publication and distribution of commercial pornographic material offline, such as videos, and online. The government wants to ensure any pornography legislation and regulation operates consistently for all pornographic content.”


