Australia’s eSafety Commissioner Raises Alarm Over Social Media Compliance with Under-16 Ban

Ryan Patel, Tech Industry Reporter
5 Min Read

Australia’s eSafety Commissioner, Julie Inman Grant, has issued a stern warning to major social media platforms over significant shortcomings in enforcing the recently implemented ban on under-16 users. Nearly four months after the law took effect, there are concerns that platforms such as Meta, YouTube, and TikTok are not adhering to the stringent regulations designed to protect younger users from online harm.

The Legislation and Its Implications

The under-16 ban represents one of the most robust digital safety measures globally, requiring ten of the largest social media networks—including TikTok, Instagram, Snapchat, and Facebook—to effectively prevent minors from accessing their services. Non-compliance could result in hefty fines of up to A$49.5 million (£26.5 million). Despite the initial blocking of approximately five million accounts in response to the legislation, the eSafety Commissioner’s recent report highlights that “major gaps” persist in these companies’ enforcement efforts.

The legislation’s intent is clear: to safeguard children from potential online threats. However, Inman Grant’s statement reflects a troubling reality. “While social media platforms have taken some initial action, I am concerned through our compliance monitoring that some may not be doing enough to comply with Australian law,” she remarked, indicating a lack of confidence in the measures currently in place.

Survey Results Reveal Parental Concerns

A survey conducted between January 19 and February 2 this year found that nearly half of the parents and guardians questioned said their children had held accounts on at least one social media platform before the ban. Following the law’s implementation, that figure dropped to around 31 per cent. The report suggests that while some platforms have attempted to comply, they have not established effective mechanisms for parents to report age-restricted accounts, raising questions about the robustness of their age verification processes.

Moreover, the report highlighted a troubling loophole wherein under-16 users are able to repeatedly attempt age verification in a bid to circumvent restrictions. This undermines the very purpose of the legislation and poses additional risks to younger users.

Ongoing Investigations and Future Consequences

The eSafety Commissioner is currently investigating potential non-compliance by several platforms, including Facebook and TikTok. However, establishing that these companies have not taken adequate steps to prevent under-16s from maintaining accounts is a complex process. “The evidence must establish that the platform has not taken reasonable steps to prevent children aged under 16 from having an account,” Inman Grant explained.

Failure to comply could lead to escalating repercussions, both legally and reputationally. Inman Grant cautioned that social media companies could face significant reputational damage with consumers and governments worldwide if they do not align their practices with Australian safety laws.

In response to the report, a spokesperson for Meta acknowledged the industry’s challenges in accurately determining users’ ages, stating, “The most effective, privacy-protective and consistent approach is to require robust age verification and parental approval at the app store.” The company has committed to investing in enforcement measures to detect and remove accounts belonging to under-16 users.

The Path Forward for Social Media Platforms

As scrutiny of social media platforms intensifies, companies must reassess their compliance strategies. The eSafety Commissioner’s proactive stance is a clear signal to the industry that the Australian government is serious about protecting its young citizens.

With the escalation of digital interactions among younger users, platforms must innovate and implement more effective age verification methods to ensure compliance. This could include more rigorous identification processes that respect user privacy while ensuring that minors cannot exploit existing loopholes.

Why it Matters

The implications of these developments extend beyond Australia, casting a spotlight on the global responsibility of social media platforms to safeguard young users. As digital landscapes evolve, the pressure on companies to implement robust safety measures will only increase. The actions—or inactions—of these platforms will likely influence regulatory frameworks in other countries, setting a precedent for how digital safety is approached on an international scale. The stakes are high, not just for the companies involved but also for the millions of children who navigate these online spaces daily.

Ryan Patel reports on the technology industry with a focus on startups, venture capital, and tech business models. A former tech entrepreneur himself, he brings unique insights into the challenges facing digital companies. His coverage of tech layoffs, company culture, and industry trends has made him a trusted voice in the UK tech community.
© 2026 The Update Desk. All rights reserved.