Roblox Launches AI Age Checks Amid Teen Safety Concerns
Roblox has announced a new suite of safety and privacy updates aimed at teenagers, including an AI-driven age estimation system, enhanced parental insight tools, and a new “Trusted Connections” feature.
Chief among the updates is a video-based age estimation requirement to unlock Trusted Connections. Users aged 13 and above must submit a video selfie to confirm their age, a process powered by the third-party verification provider Persona.
These updates come at a time of mounting regulatory scrutiny and legal challenges concerning how digital platforms safeguard children and adolescents online.
Age Estimation Technology Raises Privacy and Policy Questions
While Roblox emphasizes that it does not retain raw data and Persona deletes it within 30 days, privacy advocates may question the broader implications of collecting biometric data from minors. Despite assurances of data protection, such systems inevitably introduce a layer of surveillance and a dependency on machine learning algorithms, a point that has become contentious in ongoing conversations around digital identity and data governance.
Roblox’s rollout comes shortly after the U.S. Supreme Court allowed age verification laws to stand, and multiple U.S. states followed with similar legislative moves. While such laws are intended to prevent minors from accessing adult content, platforms are now expected to extend these standards to broader areas, including social interaction, as is the case with Roblox.
A Measured Attempt to Contain Off-Platform Risk
The “Trusted Connections” feature allows teens to connect more easily with peers they know in real life. The logic is straightforward: by encouraging users to stay within the platform, Roblox reduces the chances of teens moving conversations to less regulated apps, such as Discord or WhatsApp, where moderation is weaker.
However, critics might argue that such mechanisms can only go so far in mitigating risk. Grooming, manipulation, and other online harms often occur even within “trusted” circles, and while age checks are a step in the right direction, they are far from a silver bullet.
It also raises a fundamental question: Should the burden of identifying “safe” interactions fall on AI tools and account-linking features, or is a broader rethinking of digital design needed?
Teen Controls and Parental Visibility
Roblox is also expanding its parental tools and teen privacy settings. New additions include:
- Do Not Disturb mode
- Customizable online status
- Screen time insights and controls
These updates aim to strike a balance between teen autonomy and parental awareness. While the ability to view time spent, friend lists, and experiences may offer parents peace of mind, it also risks pushing some teens toward creating secondary or hidden accounts, a well-documented behavior among youth who feel overly surveilled.
From a policy perspective, this raises questions about how much oversight is appropriate, especially when the child is over 13, which many consider a transitional stage in online independence.
Roblox Faces Legal and Regulatory Pressure
These safety updates arrive amid intensifying pressure. In recent months, Roblox has faced lawsuits from families alleging negligence in addressing grooming and exploitation. A recent lawsuit, for instance, describes a harrowing case of an underage user nearly being assaulted after meeting an adult through the platform.
In response, several states, including Florida, have initiated inquiries into Roblox’s content moderation systems and age verification policies. Notably, Florida Attorney General James Uthmeier issued a subpoena to the company earlier this year, seeking detailed records of its safety practices and communication policies.
The company has consistently maintained that it takes child safety seriously and has invested in moderation, machine learning tools, and advisory councils. Still, these cases reflect broader concerns about whether platform governance is evolving fast enough to keep up with real-world threats.
Industry-Wide Shift or Strategic Optics?
Roblox is not alone in reevaluating its approach to child and teen online safety. Reddit and Google have introduced or adjusted age-verification systems in response to similar pressures. In the UK, the Online Safety Act has begun to influence platform behavior globally.
However, there’s a difference between reactive compliance and proactive safety design. The integration of features like age estimation may help tick regulatory boxes, but the effectiveness of these tools will ultimately be judged by how well they prevent harm, not just how well they document intent.
It remains to be seen whether these updates will meaningfully reduce risk or simply serve as technical safeguards that shift responsibility from platform to user.