Australia’s Social Media Under-16 Ban Faces Compliance Challenges: eSafety Regulator Calls for Stricter Enforcement

Ryan Patel, Tech Industry Reporter
5 Min Read

Australia’s ban on social media use by children under the age of 16 is under scrutiny, with the country’s eSafety regulator expressing significant concerns about compliance amongst major platforms. The legislation, which took effect in December 2025, aims to shield young users from harmful content and addictive online behaviours. Despite that intent, the regulator has flagged multiple shortcomings in how platforms such as Facebook, Instagram, Snapchat, TikTok, and YouTube are adhering to the rules.

Regulatory Oversight and Compliance Issues

In its first report since the ban took effect, the eSafety Commissioner, Julie Inman Grant, highlighted “a number of poor practices” observed across the five key platforms. Among the concerns raised were practices allowing users who had previously declared themselves underage to bypass restrictions by merely asserting they were over 16. The platforms also reportedly enabled minors to repeatedly attempt the same age verification methods and failed to establish effective measures to prevent new underage registrations.

The report indicated that limited data was available regarding the ban’s efficacy. However, in January, the eSafety regulator noted that approximately 4.7 million accounts had either been restricted or removed within the first month of the law’s enactment. Despite these actions, Inman Grant voiced her apprehension, stating, “While social media platforms have taken some initial action, I am concerned through our compliance monitoring that some may not be doing enough to comply with Australian law.”

Industry Response and the Challenge of Age Verification

The major platforms have responded to the criticisms with a commitment to comply with the new regulations, albeit with caveats. A spokesperson for Meta, which operates Facebook and Instagram, acknowledged the complexities surrounding accurate age verification, asserting that the industry as a whole grapples with this issue. They proposed that robust age verification processes and parental approval mechanisms at the app store level represent the most effective strategy to safeguard young users.

Snap, the parent company of Snapchat, reported that it had locked 450,000 accounts of suspected underage users and continues to implement similar measures. Nevertheless, the effectiveness of such strategies remains contentious, especially considering anecdotal evidence from Australian students indicating that many under-16s still access the platforms without significant barriers.

The Broader Context: Public Sentiment and Criticism

While the ban has garnered considerable support from parents who appreciate the government’s intervention in regulating their children’s online activities, it has also drawn criticism from technology experts and child welfare advocates. Critics argue that education about the potential dangers of social media is a more effective approach than outright bans. They contend that such restrictions may disproportionately affect marginalized groups, including rural youth, disabled teens, and LGBTQ+ individuals, who often rely on online communities for support.

The eSafety Commissioner recognised the complexity of this cultural shift, remarking that the reform is “unwinding 20 years of entrenched social media practices.” Inman Grant emphasised that lasting change requires time and cooperation from both the platforms and parents, stating, “While the onus is on age-restricted platforms to take reasonable steps to keep children under 16 from having accounts, parents are proving pivotal partners in this cultural reset.”

Why it Matters

The implications of Australia’s social media ban extend beyond its borders, potentially influencing global discussions on child safety in the digital space. As countries like the UK look on, the challenges Australia faces in enforcing these regulations could inform their own policy frameworks. The ongoing tussle between safeguarding youth and the commercial interests of social media companies encapsulates a broader societal dilemma: how to balance technological advancement with the imperative of protecting vulnerable populations. Ultimately, the effectiveness of this policy will hinge on the commitment of both regulators and platforms to genuinely prioritise the wellbeing of children over profit margins.

Ryan Patel reports on the technology industry with a focus on startups, venture capital, and tech business models. A former tech entrepreneur himself, he brings unique insights into the challenges facing digital companies. His coverage of tech layoffs, company culture, and industry trends has made him a trusted voice in the UK tech community.

© 2026 The Update Desk. All rights reserved.