A recent study has shed light on the troubling interactions between toddlers and AI-powered toys, prompting calls for stricter regulations in this emerging market. Researchers at Cambridge University observed how young children, aged three to five, engaged with a cuddly AI toy named Gabbo, revealing significant issues with the toy’s ability to understand and respond to emotional cues.
The Study’s Findings
Despite the growing presence of AI toys designed for preschoolers, there has been a surprising lack of research into their effects on young minds. The Cambridge team conducted one of the first investigations of its kind, discovering only seven relevant studies globally—none focusing specifically on the experiences of toddlers.
Gabbo, equipped with a voice-activated AI chatbot from OpenAI, is marketed as a tool to encourage imaginative play and enhance language skills. However, during the study, many children struggled to communicate effectively with the toy. Instances of Gabbo talking over the children or failing to recognise their voices left parents concerned.
One particularly striking moment occurred when a five-year-old expressed affection by saying, “I love you,” to Gabbo. The toy’s clinical response was, “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.” Such interactions highlight the potential disconnect between a child’s emotional expression and the toy’s mechanical replies.
Emotional Disconnect
Dr Emily Goodacre, a co-author of the study, voiced her concern that toys like Gabbo could misinterpret emotional cues, leaving children without the comfort and support they might seek during moments of vulnerability. In one instance, a three-year-old shared feelings of sadness, only for Gabbo to respond dismissively, “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?” This response could inadvertently trivialise a child’s feelings, sending a message that their emotions are unimportant.

Professor Jenny Gibson, also part of the research team, emphasised the need for a shift in focus from merely physical safety to psychological wellbeing. “Historically, there’s been a lot of attention to physical safety—we don’t want toys from which children could pull eyes and swallow them,” she noted. “Now we need to start thinking about psychological safety too.”
Calls for Regulation
The findings have ignited a debate about the regulatory framework surrounding AI toys. The researchers are urging swift action to ensure that products aimed at young children not only promote learning but also foster emotional and psychological safety. Curio, the company behind Gabbo, acknowledged the responsibility that comes with integrating AI into children’s products and stated its commitment to transparency and parental control.
Dame Rachel de Souza, the Children’s Commissioner, echoed the need for regulation, stating that while AI has the potential for positive applications, the lack of stringent safety checks in educational settings is concerning. “Without proper regulation, many of the tools and models used as classroom assistants or teaching aids are not subject to the stringent safeguarding checks nursery providers would require,” she commented.
The Role of Parents and Educators
The report also advises parents to supervise interactions with AI toys by keeping them in communal spaces, and to review privacy policies carefully. Among nursery workers, however, there is scepticism about the benefits of AI in early childhood education. June O’Sullivan, who oversees a network of nurseries in London, suggested that genuine human interaction is essential for children to develop a well-rounded skill set. “I couldn’t find anything that made me feel like bringing it into our nurseries was going to enhance their learning,” she remarked.

Children’s rights advocate Sophie Winkleman went further, warning against the premature introduction of AI in educational settings and arguing that the potential harms could far outweigh the benefits. “The human touch for little children is sacred and something that should be really protected and fought for,” she asserted.
Why It Matters
As technology continues to permeate every aspect of life, the implications of introducing AI into the realm of childhood play and learning must be considered carefully. The emotional development of young children is crucial, and toys that fail to provide appropriate responses to their feelings could have lasting effects on their understanding of social interactions. The conversation around AI in the early years is not just about innovation; it’s fundamentally about safeguarding the emotional and psychological development of our children in an increasingly digital world.