Australia Enforces Social Media Age Restrictions as Snapchat Locks Over 400,000 Accounts

Lisa Chang, Asia Pacific Correspondent
5 Min Read

In a significant move to enhance online safety for young users, Australia has begun a sweeping crackdown on social media accounts belonging to people under 16. As part of this effort, Snapchat has locked or disabled more than 415,000 accounts in response to the Social Media Minimum Age (SMMA) law, which took effect in December. The legislation is aimed at curbing minors' access to social media platforms, a growing concern amid rising awareness of digital safety issues.

The Legislative Landscape

The introduction of the SMMA law marks a pivotal shift in Australia's approach to online regulation. By mandating that social media platforms prevent users under 16 from creating accounts, the legislation takes a proactive stance against the potential harms associated with young people's engagement in digital spaces. Snapchat, among the first ten platforms required to comply, reported disabling these accounts by the end of January and said it would continue to lock additional accounts as necessary.

Prime Minister Anthony Albanese has underscored the urgency of this initiative, announcing the removal of 4.7 million accounts across various platforms shortly after the law’s enforcement. This figure has raised eyebrows, as it encompasses not only active underage accounts but also historical and inactive profiles.

Challenges in Age Verification

Despite complying with the new regulations, Snapchat has raised concerns about the efficacy of current age verification methods. The company said that existing tools, which rely on facial recognition technology, often prove inaccurate, estimating ages only within a two-to-three-year margin of error. That limitation means some users under 16 may elude detection, while older users might unjustly lose access to their accounts.

Furthermore, Snapchat highlighted the uneven application of the ban across the digital landscape. With younger users potentially migrating to less regulated messaging services, the effectiveness of the SMMA law may be undermined. Julie Inman Grant, Australia’s eSafety Commissioner, acknowledged the phased nature of enforcement, with the initial focus on the top ten platforms while smaller companies remain under scrutiny.

Implications for Digital Safety

Under the SMMA, platforms that fail to implement “reasonable steps” to enforce the age restrictions face hefty fines of up to A$49.5 million (£24.5 million). However, the regulator has not disclosed a detailed breakdown of the accounts removed by each platform, leading to questions about transparency and accountability. This lack of clarity complicates the public’s understanding of the law’s impact and effectiveness.

Snapchat, while supportive of the overarching goal to enhance online safety, has voiced opposition to a blanket ban on users under 16. The company argues that it should be recognised primarily as a messaging app, used for connecting with friends and family, rather than a conventional social media platform. They contend that simply severing these connections does not necessarily contribute to the well-being of young individuals.

Global Repercussions

Australia’s legislative actions are drawing international attention, with other countries, including the UK, considering similar measures. Recently, an amendment supporting a ban on under-16s was backed by the House of Lords. As nations grapple with the complexities of digital safety and the responsibilities of tech giants, Australia’s approach could serve as a model—or a cautionary tale—for others navigating this contentious terrain.

Why it Matters

The implications of Australia’s social media age restrictions extend far beyond its borders. As governments worldwide seek to protect minors in an increasingly digital world, the effectiveness and fairness of such regulations will be scrutinised. The balance between safeguarding young users and ensuring their right to communicate and connect is delicate, and Australia’s experiment may well influence the global dialogue on digital safety policies. As this issue evolves, the need for robust, accurate, and fair age verification systems becomes ever more pressing, shaping the future of online interactions for generations to come.

Lisa Chang is an Asia Pacific correspondent based in London, covering the region's political and economic developments with particular focus on China, Japan, and Southeast Asia. Fluent in Mandarin and Cantonese, she previously spent five years reporting from Hong Kong for the South China Morning Post. She holds a Master's in Asian Studies from SOAS.

© 2026 The Update Desk. All rights reserved.