Social Media Platforms Face New Age Restrictions from December 2025

The eSafety Commissioner has released updated guidance to help the online industry and the public prepare for Australia’s new Social Media Minimum Age obligation, which comes into effect on 10 December 2025. The update outlines which platforms are expected to be classified as “age-restricted social media platforms” and therefore required to take reasonable steps to prevent Australians under the age of 16 from maintaining accounts.

In its announcement, eSafety confirmed that the following services are likely to be considered age-restricted: Facebook, Instagram, Kick, Reddit, Snapchat, Threads, TikTok, X (formerly Twitter) and YouTube.

From 10 December, these platforms will be required to implement systems and measures that effectively prevent Australian children under 16 from signing up or maintaining accounts. Failure to comply may result in enforcement action, including civil penalties of up to $49.5 million.

In contrast, eSafety stated that a number of popular online services do not currently meet the criteria for an age-restricted social media platform. These include Discord, GitHub, Google Classroom, LEGO Play, Messenger, Roblox, Steam and Steam Chat, WhatsApp, and YouTube Kids. eSafety considers these services to fall outside the regulatory definition either because their primary purpose is not social networking for the general public, or because they are designed for use under parental or educational supervision.

eSafety emphasised that the landscape of online services changes rapidly and that platform classifications will not be fixed. The regulator will continue to assess emerging services and re-evaluate existing ones as their functions evolve, publishing updated guidance on its website to reflect these assessments and any enforcement activity.

The Social Media Minimum Age obligation forms part of broader government efforts to strengthen online safety for children and young people. eSafety has highlighted that no online platform is entirely risk-free, even those not classified as age-restricted. Children can still be exposed to harmful conduct such as cyberbullying, grooming, requests for intimate images, and exposure to age-inappropriate content, including pornography and violent material.

To support parents, carers and young users, eSafety continues to maintain the eSafety Guide — an online resource providing practical information about social media platforms, apps, games and websites. The guide helps Australians understand platform features, privacy settings, and potential safety risks so they can make informed decisions about their online activity.

As the December 2025 deadline approaches, online platforms operating in Australia are expected to demonstrate clear compliance plans and age-assurance measures. eSafety’s proactive approach signals a significant shift in accountability for global technology companies, aiming to create a safer digital environment for Australian children.
