Australia’s eSafety Commissioner has raised serious concerns about the enforcement of age restrictions on popular social media platforms. Nearly four months after a ban on under-16s accessing major networks took effect, companies including Meta, TikTok, and YouTube have been warned of significant compliance gaps and told to rectify the situation or face substantial penalties.
Compliance Under Scrutiny
In a statement released on Tuesday, Commissioner Julie Inman Grant said that while social media companies have taken initial measures, they are falling short of the stringent expectations laid out in Australian law. Under the new legislation, among the strictest globally, the largest social media networks—including TikTok, Instagram, Snapchat, YouTube, Facebook, and X—are required to prevent users under the age of 16 from creating accounts. Non-compliance could lead to fines of up to A$49.5 million (£26.5 million).
As of early March, a report from the eSafety Commission indicated that around 5 million accounts had been blocked due to age restrictions. However, the commissioner pointed out that “major gaps” still exist, particularly in how the platforms manage age verification.
Age Verification Loopholes
The investigation revealed how easily children can circumvent age restrictions. The report noted that platforms were allowing under-16 users to attempt age verification repeatedly, enabling them to eventually bypass the restrictions. This loophole calls into question the effectiveness of the measures in place and highlights the need for more robust verification systems.
A survey of approximately 900 parents and guardians found that, before the law took effect, nearly 50 per cent reported their child had access to at least one social media platform. Following implementation, this figure dropped to around 31 per cent, suggesting some initial compliance. However, several platforms have been slow to create efficient channels for parents to report age-inappropriate accounts, further complicating enforcement of the law.
Ongoing Investigations and Industry Response
Commissioner Inman Grant confirmed that her office is actively investigating potential non-compliance by the major platforms, though proving that they have not taken adequate steps to prevent underage access will require time and thorough evidence-gathering. “We certainly expect companies operating in Australia to comply with our safety laws,” she said, emphasising that the companies must either adapt their practices or face escalating repercussions, including reputational damage on a global scale.
In response to the report, a spokesperson for Meta acknowledged the difficulty of accurately determining a user’s age, arguing that a more effective approach would be stringent age verification and parental approval at the app store level. The company said it would continue to invest in measures to detect and remove accounts belonging to users under 16.
Why it Matters
This situation underscores a critical intersection between technology and child safety, and the responsibility of social media companies to safeguard young users. The ongoing challenges in age verification highlight the need for more effective regulatory frameworks and industry-wide collaboration to ensure compliance. As Australia sets a precedent with its stringent regulations, the world will be watching to see how these platforms adapt to meet legal standards while protecting the privacy and safety of their youngest users. The outcome could serve as a catalyst for broader discussions on digital safety, potentially shaping policy in other jurisdictions.