In a significant setback for child protection on digital platforms, the European Parliament has opted not to renew a crucial law allowing major technology companies to monitor for child sexual exploitation. This decision, which has raised alarms among child safety advocates and tech giants alike, creates a legal void that experts warn could lead to a marked increase in unreported cases of abuse.
The Expiry of Essential Safeguards
Originally enacted in 2021 as a temporary derogation from the EU's ePrivacy rules, the law gave companies a legal basis to run automated detection systems capable of identifying harmful content, including child sexual abuse material (CSAM) and instances of grooming. However, the legislation lapsed on 3 April, and despite the urgency expressed by various stakeholders, the Parliament has chosen not to pursue an extension, citing concerns over privacy rights.
This lapse introduces a precarious situation for tech firms such as Google, Meta, Snap, and Microsoft, which now face conflicting legal obligations. Without the derogation they may no longer be permitted to scan private communications for harmful content, yet they remain responsible for removing illegal material under the Digital Services Act. In a joint statement on a Google blog, these companies expressed their disappointment, describing the lapse as an “irresponsible failure” that jeopardises ongoing efforts to safeguard children online.
Child Safety Advocates Sound the Alarm
Experts in child protection have voiced serious concerns about the implications of this legislative gap. Historical data suggests that similar lapses have resulted in catastrophic declines in reported cases of child exploitation. For instance, during a comparable legal gap in 2021, reports from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) plummeted by 58% within just 18 weeks.
John Shehan, vice-president at NCMEC, highlighted the direct consequences of disrupted detection tools, stating, “When detection goes dark, the abuse doesn’t stop.” In 2025 alone, NCMEC documented 21.3 million reports, comprising over 61.8 million files suspected of being linked to child abuse, the majority of which came from outside the United States.
A Global Ripple Effect
The ramifications of the EU’s decision are expected to extend well beyond its borders. Child exploitation often transcends national boundaries, with offenders able to target vulnerable minors across the globe. Shehan noted that without robust safeguards in place, those engaging in “sextortion” and other predatory behaviours could exploit this legislative ambiguity to further their illicit activities.
“There’s a real concern that predators can now operate with less scrutiny,” Shehan warned, emphasising the urgent need for international cooperation in combating child exploitation online.
Legislative Stalemate and Ongoing Negotiations
The debate surrounding child sexual abuse regulation has been contentious for years, with opponents arguing that such measures amount to intrusive surveillance. Privacy advocates have likened the scanning protocol to “chat control,” which they contend infringes on fundamental rights.
Hannah Swirsky, head of policy and public affairs at the Internet Watch Foundation, expressed frustration at the ongoing stalemate. “Blocking CSAM is not an invasion of privacy,” she argued. “If the EU is serious about protecting children online, then it needs to agree on a permanent legislative framework for safeguarding children and for enabling detection.”
Although the technology is designed to detect known images of abuse without storing personal data, privacy concerns continue to dominate the debate. Emily Slifer, director of policy at Thorn, a non-profit dedicated to combating child exploitation, explained that newer detection systems rely on machine learning classifiers trained to distinguish abusive material from consensual adult content.
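The known-image detection described above works by comparing fingerprints rather than content: a service hashes an uploaded file and checks the hash against a list of fingerprints of previously identified abuse imagery, so the file itself need not be retained or read by a human at that stage. The sketch below is illustrative only; function names are invented here, and production systems (such as Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding, whereas this example uses a cryptographic hash for simplicity.

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Return a fingerprint of a file's raw bytes.

    Real deployments use perceptual hashing (e.g. PhotoDNA) so that
    cropped or re-encoded copies still match; SHA-256 stands in here
    purely as an illustration of fingerprint comparison.
    """
    return hashlib.sha256(data).hexdigest()


def matches_known_list(data: bytes, known_hashes: set[str]) -> bool:
    # Only fingerprints are compared; neither the user's file nor the
    # original illegal images need to be stored by the matching step.
    return fingerprint(data) in known_hashes


# Illustrative usage with placeholder bytes standing in for image files.
known = {fingerprint(b"previously-identified-file")}
flagged = matches_known_list(b"previously-identified-file", known)  # True
clean = matches_known_list(b"unrelated-upload", known)              # False
```

Because matching is exact-lookup against a fixed list, this approach can only catch material that has already been identified and hashed; detecting new imagery or grooming behaviour is where the machine-learning classifiers Slifer describes come in.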
Why it Matters
The failure to extend this critical law not only undermines child protection efforts within Europe but also sets a worrying precedent globally. As the digital landscape continues to evolve, the absence of robust legal frameworks raises profound questions about the responsibilities of tech companies in safeguarding vulnerable populations. Without prompt legislative action, the EU risks creating an environment in which child predators operate with impunity, and the consequences could be dire. The time for decisive action is now, as the safety of children in digital spaces hangs in the balance.