A Controversial Plan to Scan Private Messages for Child Abuse Meets Fresh Scandal


Danny Mekić, an Amsterdam-based PhD researcher, was studying a proposed European law meant to combat child sexual abuse when he made a rather odd discovery. All of a sudden, he started seeing ads on X, formerly Twitter, that featured young girls and sinister-looking men against a dark background, set to an eerie soundtrack. The advertisements, which displayed statistics from a survey about child sexual abuse and online privacy, were paid for by the European Commission.

Mekić thought the videos were unusual for a governmental organization and decided to delve deeper. The survey findings highlighted in the videos suggested that a majority of EU citizens would support the scanning of all their digital communications. On closer inspection, he found that these findings appeared biased and otherwise flawed. The survey results were gathered by misleading the participants, he claims, which in turn may have misled the recipients of the ads: the conclusion that EU citizens were fine with greater surveillance couldn't be drawn from the survey, and its findings clashed with those of independent polls.

The micro-targeting ad campaign categorized recipients based on religious beliefs and political orientation, both considered sensitive information under EU data protection laws, and also appeared to violate X's terms of service. Mekić found that the ads were meant to be seen by select targets, such as top ministry officials, while they were concealed from people interested in Julian Assange, Brexit, EU corruption, Eurosceptic politicians (Marine Le Pen, Nigel Farage, Viktor Orban, Giorgia Meloni), the German right-wing populist party AfD, and “anti-Christians.”

Mekić then found out that the ads, which have garnered at least 4 million views, were only displayed in seven EU countries: the Netherlands, Sweden, Belgium, Finland, Slovenia, Portugal, and the Czech Republic.

At first, Mekić could not figure out the country selection, he tells WIRED, until he realized that neither the timing nor the purpose of the campaign was accidental. The Commission's campaign was launched a day after the EU Council met without securing sufficient support for the proposed legislation Mekić had been studying, and the targeted countries were those that did not support the draft.

The legislation in question is a controversial proposal by the EU Commission known as Chat Control or the CSA Regulation (CSAR), which would obligate digital platforms, including Signal, WhatsApp, and other messaging apps, to detect and report any trace of child sexual abuse material on their systems and in their users' private chats. Digital rights activists, privacy regulators, and national governments have strongly criticized the proposal, which the European Data Protection Supervisor (EDPS), Wojciech Wiewiorowski, said would amount to “crossing the Rubicon” in terms of mass surveillance of EU citizens.
