Australia’s Social Media Ban for Under-16s Faces Compliance Challenges from Major Platforms

Ryan Patel, Tech Industry Reporter
5 Min Read

Australia’s law banning social media use by individuals under the age of 16 is encountering significant enforcement hurdles, according to the country’s internet regulator. Although the law took effect late last year, major platforms such as Facebook, Instagram, TikTok, and Snapchat have drawn scrutiny for lacking robust compliance measures to protect younger users from potentially harmful content.

Regulatory Oversight and Initial Findings

The Australian eSafety Commissioner, Julie Inman Grant, has expressed “significant concerns” regarding the adherence of these platforms to the new regulations. The law, which restricts under-16s’ access to ten major social media services, was enacted to shield children from damaging online environments. However, the regulator’s first report since the law became effective on December 10 reveals troubling practices among the platforms.

Key issues identified include:

– Allowing users who previously claimed to be under 16 to later verify their age and regain access.

– Permitting under-16s to repeatedly attempt age verification processes, undermining the intent of the ban.

– Inadequate measures to prevent new under-16 users from creating accounts.

– Limited avenues for parents and guardians to report underage accounts that remain active.

During the initial month of enforcement, the eSafety Commissioner reported that 4.7 million accounts were restricted or removed. However, the Commissioner is adamant that this number does not reflect adequate compliance, urging a more thorough approach from the platforms in preventing underage access.

Platforms’ Responses and Challenges

Social media giants have acknowledged the challenges associated with age verification, with Meta (the parent company of Facebook and Instagram) asserting its commitment to comply with Australian regulations. A spokesperson emphasised that the broader industry grapples with the complexities of accurately determining users’ ages. They advocate for stronger age verification methods and parental consent processes at the app store level as essential mechanisms to safeguard young users.

Snap Inc., the parent company of Snapchat, reported that it has locked approximately 450,000 accounts of users suspected to be underage, with ongoing efforts to identify and restrict additional accounts. Despite these claims, the effectiveness of such measures remains in question.

Cultural Implications and Parental Support

Despite the government’s efforts to enforce this ban, reports from schools indicate that many under-16s are still accessing social media platforms. Some students have reported evading age checks entirely, suggesting that the enforcement mechanisms are not achieving their intended outcomes.

Parents have largely supported the initiative, as it provides them with a framework to counter their children’s demands for social media access. Many see the government’s backing as crucial support in navigating the complexities of their children’s digital lives.

However, critics argue that outright bans may not be the optimal solution. Some technology experts and child well-being advocates suggest that education on the risks of social media could be a more effective approach. They also raise concerns about the potential exclusion of vulnerable groups, such as rural youths and LGBTQ+ teenagers, who often find community and support online.

Looking Ahead: Enforcement and Cultural Change

The eSafety Commissioner has indicated that the agency will intensify its enforcement efforts, shifting from monitoring compliance to actively gathering evidence of non-compliance. Inman Grant stated, “Durable, generational change takes time – but these platforms have the capability to comply today.” She underscored that the responsibility lies with these platforms to implement proper systems to prevent underage users from establishing accounts.

The implications of this legislation extend beyond Australia, with countries such as the UK closely observing its impact. As the regulatory landscape evolves, the balance between safeguarding children and encouraging responsible digital engagement will be a crucial area of focus.

Why it Matters

The enforcement of Australia’s social media ban for under-16s represents a significant shift in how governments are approaching online safety for children. As the eSafety Commissioner emphasises the need for rigorous compliance and cultural change within the tech industry, the outcome of this initiative could set a precedent for global standards in social media regulation. The conversation around protecting young users is critical, but it must also consider inclusive strategies that educate and empower rather than merely impose restrictions. The ongoing developments in this area will undoubtedly influence the future of online engagement for younger audiences worldwide.

Ryan Patel reports on the technology industry with a focus on startups, venture capital, and tech business models. A former tech entrepreneur himself, he brings unique insights into the challenges facing digital companies. His coverage of tech layoffs, company culture, and industry trends has made him a trusted voice in the UK tech community.

© 2026 The Update Desk. All rights reserved.