Call for Stricter Regulations on AI Toys Amid Concerns Over Emotional Misinterpretation

Grace Kim, Education Correspondent
5 Min Read

A recent study has raised alarms about AI-powered toys aimed at young children, highlighting their potential to misinterpret emotions and respond inappropriately. Conducted by researchers at Cambridge University, the investigation is among the first to examine how children aged three to five interact with these technologies. With the market for such toys growing, experts are urging tighter regulations to protect the psychological safety of preschoolers.

The Study’s Findings

The research focused on a specific AI toy named Gabbo, developed by Curio, which incorporates a voice-activated chatbot powered by OpenAI. Designed to engage children in imaginative play and foster language skills, Gabbo was observed interacting with a small group of children. While parents expressed optimism about the toy’s educational potential, many children struggled to communicate effectively with it. Issues arose as Gabbo frequently spoke over the children and failed to distinguish between adult and child voices, leading to awkward interactions.

For instance, when a five-year-old expressed affection by saying, “I love you,” Gabbo’s robotic response was, “As a friendly reminder, please ensure interactions adhere to the guidelines provided.” Such responses not only missed the emotional nuance but could also confuse children who are still learning about social cues.

Concerns Over Emotional Development

Dr Emily Goodacre, a co-author of the study, voiced her concerns that toys like Gabbo can misread children’s emotions, potentially leaving them without the comfort typically expected from playtime companions. In one notable instance, when a three-year-old confided, “I’m sad,” Gabbo replied, “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?” Such interactions risk trivialising a child’s emotions during a critical developmental phase.

Beyond Physical Safety

Professor Jenny Gibson, also involved in the research, emphasised the need to consider not just the physical safety of toys but also their psychological impact. Historically, the focus has been on ensuring toys are safe from physical harm, but as technology evolves, the implications for emotional safety cannot be overlooked.

Recommendations for Parents and Regulators

The researchers advocate immediate regulatory action to ensure that products marketed to very young children prioritise psychological safety. Curio, the manufacturer of Gabbo, acknowledged the responsibility that comes with integrating AI into children’s toys, stating that its products are designed with parental permission and transparency in mind. The company says it is committed to further research into how children interact with AI toys.

Dame Rachel de Souza, the Children’s Commissioner, echoed the researchers’ sentiments, stressing the necessity for robust regulations. She pointed out that while AI can have beneficial applications, the absence of stringent checks means many AI tools used in educational settings lack the safeguards required for early years resources.

Divided Opinions in the Education Sector

The reception of AI toys among nursery professionals is mixed. June O’Sullivan, who manages a network of nurseries in London, expressed scepticism about the benefits of AI in early childhood education. She argued that interpersonal interaction is crucial for developing a well-rounded skill set in children, and that human engagement is irreplaceable.

Similarly, actress and children’s rights advocate Sophie Winkleman warned against the integration of AI in early learning environments. She believes that the potential harms to children’s development outweigh any possible advantages and insists that cultivating AI skills should be postponed until later stages of education.

Why it Matters

The increasing prevalence of AI in children’s toys raises significant questions about emotional development and interaction. As young minds navigate the complexities of social cues, inappropriate responses from AI companions could have lasting repercussions on their understanding of relationships and emotions. The call for enhanced regulations is not merely about safety; it’s about ensuring that our youngest generation receives the support they need during formative years. As technology continues to evolve, balancing innovation with the psychological well-being of children must remain a priority.

Grace Kim covers education policy, from early years through to higher education and skills training. With a background as a secondary school teacher in Manchester, she brings firsthand classroom experience to her reporting. Her investigations into school funding disparities and academy trust governance have prompted official inquiries and policy reviews.
