The European Commission has launched a formal DSA child protection investigation into Snapchat, examining whether the platform is meeting its obligations to ensure a high level of safety, privacy, and security for minors.
The move comes under the framework of the Digital Services Act (DSA), which sets strict standards for online platforms operating in the European Union and under which the Commission can impose fines of up to 6% of a company's global annual turnover for non-compliance.
Age Assurance Under Digital Services Act Scrutiny
At the center of the DSA child protection investigation is Snapchat’s approach to age assurance. According to its terms, users must be at least 13 years old to access the platform. However, the Commission suspects that Snapchat’s reliance on self-declaration is insufficient.
The Commission is concerned that this method neither prevents children under 13 from accessing the service nor reliably identifies which users are under 17, information the platform needs in order to provide age-appropriate experiences.
There are also concerns that tools to report underage users may not be easily accessible within the app.
The investigation also focuses on the risk of minors being exposed to grooming attempts and recruitment for criminal purposes. The Commission suspects that Snapchat may not be doing enough to prevent users with harmful intent from contacting children, particularly in cases where individuals misrepresent their age or manipulate their profiles.
This includes concerns around exposure to harmful content, conduct, and contact that could place minors at risk.
Default Settings And Privacy Concerns
Another key area under the DSA child protection investigation is Snapchat’s default account settings. The Commission believes that the platform may not provide sufficient privacy, safety, and security protections for minors by default.
Features under scrutiny include the “Find Friends” system, which recommends other users as potential contacts, and push notifications that remain enabled by default.
The Commission also notes that users may not receive adequate guidance during account creation on how to manage privacy and safety settings, or how to adjust them effectively.
Illegal Content And Reporting Mechanisms Under Review
The investigation further examines whether Snapchat is effectively preventing the dissemination of illegal content, including information related to the sale of drugs and age-restricted products such as alcohol and vapes.
Under the DSA, platforms are required to mitigate systemic risks arising from their services. The Commission suspects that current content moderation measures may not be sufficient to block or limit access to such content, especially for younger users.
Reporting mechanisms for illegal content are also part of the DSA child protection investigation. The Commission is concerned that these systems may not be easily accessible or user-friendly, and that certain design practices may make reporting less straightforward than it should be.
There are also concerns that users may not be properly informed about complaint procedures or available redress options within the platform.
Next Steps In DSA Child Protection Investigation
The European Commission will now conduct an in-depth investigation by gathering further evidence, including requesting information from Snapchat and conducting interviews or inspections.
The opening of formal proceedings allows the Commission to take further enforcement actions, including adopting interim measures or issuing a non-compliance decision. It can also accept commitments from Snapchat to address the issues identified during the investigation.
The action against Snapchat builds on broader regulatory efforts under the Digital Services Act to strengthen online child protection across platforms.
The Commission has used its 2025 DSA Guidelines on the protection of minors as a benchmark for evaluating compliance, emphasizing that self-declaration alone should not be considered a reliable age assurance method and that default settings should offer the highest level of protection for minors.
“From grooming and exposure to illegal products to account settings that undermine minors’ safety, Snapchat appears to have overlooked that the Digital Services Act demands high safety standards for all users. With this investigation, we will closely look into their compliance with our legislation,” said Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy.
Age Verification In Question
In a related development, the European Commission has also taken preliminary action against adult content platforms including Pornhub, Stripchat, XNXX, and XVideos under the Digital Services Act. The Commission found that these platforms may have failed to adequately protect minors from accessing pornographic content.
It noted that their risk assessments did not sufficiently identify or evaluate risks to children, and in some cases, placed more emphasis on business considerations than on child safety.
“In the EU, online platforms have a responsibility. Children are accessing adult content at increasingly younger ages and these platforms must put in place robust, privacy-preserving and effective measures to keep minors off their services. Today, we are taking another action to enforce the DSA – ensuring that children are properly protected online, as they have the right to be,” said Virkkunen.
The findings also indicate that these platforms rely heavily on self-declaration for age verification, which the Commission considers ineffective. Additional measures such as content warnings, page blurring, or “restricted to adults” labels were also deemed insufficient to prevent minors from accessing harmful material. The Commission has suggested that more robust, privacy-preserving age verification methods are required to address these risks.
As part of ongoing proceedings, these platforms will have the opportunity to respond to the Commission’s findings and take corrective measures.
If the breaches are confirmed, the Commission may issue a non-compliance decision, which could result in significant financial penalties or enforcement actions to ensure compliance.
The broader enforcement push reflects a clear regulatory direction under the Digital Services Act, with authorities focusing on ensuring that platforms, regardless of size, take stronger responsibility for protecting minors online.