Starmer Urges Tech Giants to Prioritise Online Safety for Children in Urgent Meeting

Hannah Clarke, Social Affairs Correspondent
5 Min Read


In a significant meeting at Downing Street, Prime Minister Sir Keir Starmer expressed urgent concerns about the safety of children online, stating, “things cannot go on like this.” Executives from leading social media companies, including Meta, Snap, Google, TikTok, and X, gathered to discuss how to enhance protections for young users amid increasing scrutiny over the impact of digital platforms on child wellbeing.

Urgent Call for Action

During the meeting, which also involved Technology Secretary Liz Kendall, Starmer emphasised the need for immediate change in how social media companies operate. He acknowledged that while social media has potential benefits, the safety of British children must take precedence. “Curbing access is preferable to a world where harm is the price of participation,” Starmer declared, highlighting the government’s ongoing consultations regarding a potential ban on social media for children under 16, following similar actions taken in Australia.

The gathering included key figures such as Kate Alessi, managing director of Google UK, and Markus Reinisch, public policy chief at Meta. Starmer pointed out that some companies have already begun implementing safety measures, like disabling auto-play for children and enhancing parental controls over screen time. However, he insisted that more substantial action is necessary.

Parental Concerns and Expert Opinions

Starmer’s comments resonate with growing concerns from parents and experts regarding the detrimental effects of social media on children’s concentration, sleep, and relationships. “The evidence is mounting, and the status quo simply cannot be allowed to stand,” he asserted. Parents are not merely seeking minor adjustments; they want a fundamental reassessment of a system that they feel is failing their children.

Prof Gina Neff from the Minderoo Centre for Technology and Democracy observed that the meeting represents a proactive stance from the government amid pressures to adopt a more lenient approach towards US tech firms. This sentiment was echoed by others who called for decisive action rather than half-measures.

Political Divide and Legislative Challenges

Despite the urgency of the situation, the UK Parliament recently rejected, for a second time, a proposal to ban social media for users under 16, with ministers arguing that such a ban was premature given the government’s plans for new regulations. Critics, including Conservative shadow education secretary Laura Trott, accused Labour MPs of failing to protect children by voting against the ban, while Liberal Democrat education spokeswoman Munira Wilson urged immediate restrictions on harmful platforms for younger users.

The discussion comes in the wake of alarming research from the Molly Rose Foundation, which found that over 60% of underage Australians continue to access social media despite a ban introduced in December 2025. This foundation was established in memory of Molly Russell, a 14-year-old who tragically took her own life after being exposed to harmful content online.

A Demand for Accountability

Andy Burrows, CEO of the Molly Rose Foundation, welcomed the meeting but cautioned against accepting empty promises from tech companies. He urged Starmer to turn his rhetoric into tangible action by committing to a new Online Safety Act that prioritises the wellbeing of children over profit. Prof Amy Orben, a digital mental health expert, echoed this sentiment, stressing the importance of holding companies accountable for the algorithms that shape what young people see and keep them engaged online.

As the government consults on potential age restrictions for various online services, including gaming and AI chatbots, more than 45,000 responses have already been collected from the public, alongside input from around 80 organisations.

Why it Matters

The dialogue surrounding children’s safety online is not merely a regulatory issue; it is a matter of safeguarding the future generation. As the digital landscape continues to evolve, the need for robust protections and accountability from tech companies becomes increasingly critical. The actions taken—or not taken—by leaders and social media giants will resonate far beyond policy discussions, affecting the lives, mental health, and development of millions of children. In a world where digital interaction is ubiquitous, prioritising the safety of the most vulnerable must be a collective responsibility, with a call for immediate, meaningful change.

Hannah Clarke is a social affairs correspondent focusing on housing, poverty, welfare policy, and inequality. She has spent six years investigating the human impact of policy decisions on vulnerable communities. Her compassionate yet rigorous reporting has won multiple awards, including the Orwell Prize for Exposing Britain's Social Evils.