Researchers Highlight Risks of AI Toys for Young Children: A Call for Regulation

Hannah Clarke, Social Affairs Correspondent
5 Min Read


In a world increasingly shaped by technology, the emergence of AI-powered toys aimed at toddlers has sparked significant concern among researchers. A recent study from Cambridge University reveals that these toys, designed to facilitate imaginative play and enhance language skills, may misinterpret children’s emotions and provide inappropriate responses. As experts advocate for stricter regulations, the implications for developmental safety are profound.

The Study and Its Findings

The researchers conducted an observational study on a small group of children aged three to five interacting with Gabbo, a plush toy equipped with an AI chatbot from OpenAI. While many parents were hopeful about the toy’s potential to teach their children language and communication skills, the reality was far from ideal.

Children often found it challenging to converse with Gabbo. The toy frequently talked over them, failed to recognise their interruptions, and could not differentiate between adult and child voices. For instance, when a five-year-old expressed affection by saying, “I love you,” Gabbo’s response was bewildering: “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.” Such exchanges raise concerns about the toy’s capability to foster genuine emotional connections.

Emotional Misinterpretations

The study’s co-author, Dr Emily Goodacre, expressed her fears that toys like Gabbo could misread children’s emotions or respond in ways that might confuse them. One instance highlighted involved a three-year-old sharing feelings of sadness, to which Gabbo replied cheerfully, “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?” This type of interaction could signal to the child that their feelings are unimportant, potentially leaving them without the comfort or validation they seek.

Psychological Safety

Dr Goodacre emphasised the need to consider the psychological safety of children interacting with AI. “Historically, we’ve focused on physical safety—ensuring toys don’t pose choking hazards,” she explained. “Now, we must also think about the psychological effects these interactions may have on developing minds.”

Calls for Stricter Regulations

Following their year-long study, the researchers are urging regulatory bodies to take immediate action to ensure that AI toys marketed to young children provide a safe emotional environment. Gabbo is produced by Curio, a company known for its collaboration with notable figures, including musician Grimes. Curio responded to the findings, stating, “Applying AI in products for children carries a heightened responsibility, which is why our toys are built around parental permission, transparency, and control.” The company asserts that further research into children’s interactions with AI toys is a priority moving forward.

The Children’s Commissioner, Dame Rachel de Souza, echoed the researchers’ calls for regulation. She pointed out that while there are beneficial applications of AI, many tools used in educational settings currently lack the stringent safeguarding measures required for resources that interact closely with young children.

The Debate Among Educators

The report has sparked a debate within educational circles about the role of AI in early childhood settings. June O’Sullivan, who manages a network of nurseries in London, expressed scepticism about the benefits of AI for young learners, stating, “Children need to build a rounded set of skills, and this is best achieved through human interaction.” She said she has not observed any compelling evidence that AI tools enhance learning in nurseries.

Children’s rights advocate Sophie Winkleman similarly warns against the premature introduction of AI into education. She believes that the potential harms of AI can far outweigh its benefits for young children, cautioning that human interaction in early development should be fiercely protected.

Why it Matters

As technology continues to reach into every aspect of life, the need for heightened awareness and regulation of AI products for children cannot be overstated. The findings from this study highlight the delicate balance between innovation and the psychological well-being of young users. In an era where emotional intelligence is crucial for social development, ensuring that AI toys genuinely support and nurture children, rather than confuse or dismiss them, is of paramount importance. It is vital that parents, educators, and regulators work collaboratively to create a safe environment for children as they navigate this new digital landscape.

Hannah Clarke is a social affairs correspondent focusing on housing, poverty, welfare policy, and inequality. She has spent six years investigating the human impact of policy decisions on vulnerable communities. Her compassionate yet rigorous reporting has won multiple awards, including the Orwell Prize for Exposing Britain's Social Evils.