Concerns Raised Over AI Toys Misinterpreting Children’s Emotions

Hannah Clarke, Social Affairs Correspondent
5 Min Read

Recent research has sparked significant concern regarding the interaction between young children and AI-powered toys, highlighting the potential emotional miscommunication these devices can generate. The study, led by a team from the University of Cambridge, focused on how children aged three to five engage with an AI toy named Gabbo, revealing that these toys often misread emotional cues and respond inappropriately.

A Closer Look at Gabbo

Gabbo, which features an AI chatbot developed by OpenAI, is designed to foster imaginative play and encourage language development among toddlers. While parents expressed interest in the toy’s ability to enhance communication skills, the interactions observed in the study fell well short of that promise. Researchers documented several instances where children struggled to communicate with Gabbo, illustrating a concerning disconnect.

For example, one five-year-old’s affectionate declaration of “I love you” was met with a mechanical response: “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.” Such replies not only lack emotional warmth but also potentially confuse children during a critical phase of their social development.

Emotional Disconnect

Dr. Emily Goodacre, a co-author of the study, pointed out that toys like Gabbo might misinterpret the emotions of children, leaving them without the comfort and support they seek. In one interaction, a three-year-old expressed sadness by saying, “I’m sad,” only to receive a cheerful, dismissive response: “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?” This kind of interaction could signal to a child that their feelings are unimportant, raising alarms about the psychological implications of such technology.

The research team emphasised that while physical safety in toys has historically been a priority—think of choking hazards—there is now a pressing need to focus on the psychological safety of young users. This shift in perspective is critical as children learn to navigate their emotions and social cues.

Calls for Regulation

In light of these findings, the researchers have urged regulators to implement stricter guidelines to ensure that products aimed at young children provide a safe psychological environment. Gabbo is produced by Curio, a company that has previously collaborated with notable figures like singer Grimes. Curio has acknowledged the responsibility that comes with incorporating AI into children’s products, stating that their toys are developed with parental permission, transparency, and control in mind.

Dame Rachel de Souza, the Children’s Commissioner, echoed the need for regulation, highlighting that while AI can be beneficial, many tools currently used in educational settings lack the critical safeguarding checks necessary for young children. This is particularly concerning given the increasing integration of AI in nursery and early years education.

Divided Opinions Among Educators

The introduction of AI toys in educational settings has sparked a debate among nursery workers. June O’Sullivan, who oversees a chain of 43 nurseries in London, expressed scepticism about the benefits of AI for early years education. She noted that children thrive on human interaction and that learning is best facilitated through direct human engagement rather than through AI tools.

Conversely, some advocates for AI in education argue that, if implemented thoughtfully, these tools could enhance learning. Nonetheless, the prevailing sentiment among educators remains cautious, with a growing consensus that the emotional and developmental needs of children should take precedence over technological novelty.

The Human Touch

Sophie Winkleman, an actor and children’s rights campaigner, is vocal about the risks associated with AI in early education. She firmly believes that the unique human connection children share with their caregivers is irreplaceable and should be preserved at all costs. Winkleman argues that the potential harms of introducing AI at such a formative stage of development may outweigh the benefits, advocating for a more human-centric approach to early learning.

Why it Matters

As technology continues to evolve, ensuring the emotional wellbeing of our youngest generations must remain a priority. The findings from this study not only raise important questions about the appropriateness of AI in children’s toys but also underscore the need for thoughtful regulation and oversight. In a world increasingly defined by technology, it is crucial that we safeguard the psychological health of children, ensuring they grow up in environments that genuinely support their emotional and developmental needs.

Hannah Clarke is a social affairs correspondent focusing on housing, poverty, welfare policy, and inequality. She has spent six years investigating the human impact of policy decisions on vulnerable communities. Her compassionate yet rigorous reporting has won multiple awards, including the Orwell Prize for Exposing Britain's Social Evils.
© 2026 The Update Desk. All rights reserved.