Poland Calls For EU Investigation Of TikTok Over AI-Generated Disinformation Campaign

Poland’s Ministry of Digital Affairs submitted a formal request to the European Commission this week, demanding an investigation of TikTok for allegedly failing to moderate a large-scale disinformation campaign that used AI-generated content to urge Poland to leave the European Union. The authorities claim the platform violated its obligations as a Very Large Online Platform under the Digital Services Act.

Secretary of State Dariusz Standerski warned that the synthetic audiovisual materials pose threats to public order, information security, and the integrity of democratic processes in Poland and across the European Union.

Some of the observed videos feature young women advocating for “Polexit,” likely targeting younger audiences. European analytics collective Res Futura identified one such TikTok account, “Prawilne Polki,” which published content showing women wearing T-shirts bearing Polish flags and patriotic symbols.

AI-generated “Polexit” videos (Source: Res Futura X account)

A character in one of the videos says: “I want Polexit because I want freedom of choice, even if it will be more expensive. I don’t remember Poland before the European Union, but I feel it was more Polish then.” (machine translated)

The content published in the Polish-language segment of TikTok exhibits the characteristics of a “coordinated disinformation campaign,” Standerski said. The nature of the narratives, the distribution methods, and the use of synthetic materials indicate that TikTok failed to implement adequate mechanisms for moderating AI-generated content or to ensure effective transparency regarding the origin of the material.

Four-Point Action Request

Poland’s formal request to Executive Vice President for Tech Sovereignty, Security and Democracy Henna Virkkunen proposes that the European Commission initiate investigative proceedings concerning suspected breaches of Digital Services Act provisions relating to systemic risk management and content moderation.


The ministry demands that TikTok submit a detailed report on the scale and nature of the disclosed content, its reach, and the actions taken to remove it and prevent further dissemination. Poland also requests that the Commission consider applying interim measures aimed at limiting the continued spread of AI-generated content encouraging Poland’s withdrawal from the EU.

The fourth request asks for coordination with Poland’s Digital Services Coordinator, UKE, and for notification of the relevant national authorities regarding the outcome of the proceedings.

Letter sent by Secretary of State Dariusz Standerski to the EU Commission. (Source: X)

Systemic Risk Management Failures

Available information suggests TikTok has not implemented adequate mechanisms for moderating AI-generated content, Standerski said. The platform’s alleged failure to ensure effective transparency regarding the origin of synthetic material undermines the Digital Services Act’s objectives concerning disinformation prevention and user protection.

The scale of this phenomenon, its potential consequences for political stability, and the use of generative technologies to undermine democratic foundations require immediate response from European Union institutions, the letter stressed.

As a Very Large Online Platform under DSA regulations, TikTok faces enhanced obligations including systemic risk assessments, independent audits, and transparency reporting. The platform must identify and mitigate risks relating to dissemination of illegal content and negative effects on civic discourse and electoral processes.

Growing Concerns Over AI-Generated Disinformation

The Polish complaint represents one of the first formal DSA enforcement requests specifically targeting AI-generated disinformation campaigns on major social media platforms. The case highlights growing concerns among EU member states about synthetic media being weaponized to manipulate public opinion and undermine democratic institutions.

The Digital Services Act, which came into full effect in February 2024, grants the European Commission powers to investigate very large platforms and impose fines up to 6% of global annual revenue for violations. The law requires platforms to assess and mitigate systemic risks including manipulation of services affecting democratic processes and public security.

TikTok has already been under scrutiny from the EU Commission over the Digital Services Act. In February last year, the Commission opened formal proceedings against the social media giant over suspected DSA violations in areas linked to the protection of minors, advertising transparency, data access for researchers, and risk management of addictive design and harmful content.

Also read: U.S. Government Sues TikTok for COPPA Violations, Exposing Millions of Children’s Data
