In response to Australia’s recent legislation prohibiting social media access for individuals under the age of 16, Snapchat has locked or disabled more than 415,000 accounts. The platform, which began enforcing the rules in December, has acknowledged the limits of current age verification technology and warned of loopholes that could undermine the ban’s effectiveness.
Mass Account Disabling
As of late January 2026, Snapchat reported that it had disabled or locked over 415,000 accounts belonging to users who either self-identified as under 16 or were flagged by the platform’s age detection systems. The company said in a blog update that it continues to identify and lock additional accounts daily. This initiative is part of a broader compliance effort involving ten major social media platforms mandated by the Australian government to restrict access for younger users.
Prime Minister Anthony Albanese hailed the ban’s early success, noting that approximately 4.7 million accounts had been disabled across various platforms shortly after the enforcement began. However, despite these initial figures, concerns have emerged regarding the reliability of Snapchat’s age estimation technology, which has reportedly been easily circumvented by teenagers.
Challenges in Age Verification
Snapchat has openly recognised the inherent limitations of its age verification systems, stating there are “significant gaps” in the implementation of the ban. The platform explained that its facial age estimation technology is typically accurate within a margin of two to three years, which means that some individuals under 16 may successfully bypass the restrictions while older users might find themselves erroneously locked out.
Furthermore, Snapchat raised alarms about the possibility of young users migrating to less regulated messaging applications, which could expose them to greater risks. The company emphasised the need for policymakers to consider these dynamics as they assess the effectiveness of the legislation.
Regulatory Focus and Future Considerations
While the initial focus of the eSafety Commission has been on the ten identified platforms, all social media services with Australian users are expected to evaluate their compliance with the new regulations. Julie Inman Grant, the eSafety Commissioner, indicated that the regulatory team would prioritise platforms with substantial user bases, adding that many smaller companies with fewer than 100,000 users will also be scrutinised in due course.
Inman Grant noted that age assurance technology still needs to improve, pointing out that Snapchat’s facial age estimation lacks a crucial “liveness test” to verify that the image comes from a live person rather than a static photo or recording. This gap may contribute to erroneous results, complicating the enforcement of age restrictions.
The Broader Context
The total number of account deactivations reported since the ban’s implementation includes a mixture of accounts identified as underage, as well as inactive, historical, and duplicate profiles. Despite the significant figures released, other platforms have yet to disclose how many accounts they have deactivated, adding to the ambiguity surrounding the overall impact of the ban.
Why it Matters
The implications of this social media age ban extend beyond account statistics to the broader question of how to keep younger users safe online. As Snapchat and other platforms wrestle with the limits of their age verification methods, the risk that young people will shift to unregulated communication channels raises pressing questions about digital protection. Policymakers will need to critically evaluate the effectiveness of these measures and ensure that age assurance technology matures alongside the legislation.