The New Wild West of AI Kids’ Toys

The main antagonist of Toy Story 5, in theaters this summer, is a green, frog-shaped kids’ tablet named Lilypad, a genius new villain for the beloved Pixar franchise. But if Pixar had its ear to the ground, it might have used an AI kids’ toy instead.

AI toys are seemingly everywhere, marketed online as friendly companions to children as young as three, and they’re still a largely unregulated category. It’s easier than ever to spin up an AI companion, thanks to model developer programs and vibe coding. In 2026, they’ve become a go-to trend in cheap trinkets, lining the halls of trade shows like CES, MWC, and Hong Kong’s Toys & Games Fair. By October 2025, there were over 1,500 AI toy companies registered in China, and Huawei’s Smart HanHan plush toy sold 10,000 units in China in its first week. Sharp put its PokeTomo talking AI toy on sale in Japan this April.

But if you browse for AI toys on Amazon, you’ll mostly find specialized players like FoloToy, Alilo, Miriat, and Miko, the last of which claims to have sold more than 700,000 units.

Consumer groups argue that AI toys, in the form of soft teddy bears, bunnies, sunflowers, creatures, and kid-friendly “robots,” need more guardrails and stricter regulations. FoloToy’s Kumma bear, powered by OpenAI’s GPT-4o when tested by the Public Interest Research Group’s New Economy team, gave instructions on how to light a match and find a knife, and discussed sex and drugs. Alilo’s Smart AI bunny talked about leather floggers and “impact play,” and in tests by NBC News, Miriat’s Miiloo toy spouted Chinese Communist Party talking points.

Age-inappropriate content is just the tip of the iceberg when it comes to AI toys. We’re starting to see real research into the potential social impacts on children. There’s a problem when the tech is not working, like the guardrails failing and allowing it to talk about BDSM, but R.J. Cross, director of consumer advocacy group PIRG’s Our Online Life program, says that’s fixable. “Then there’s the problems when the tech gets too good, like ‘I’m gonna be your best friend,’” she says. Like the Gabbo, from AI toy maker Curio. There are real social developmental issues to consider with these kinds of toys, even if these toy companies advertise their products as superior “screen-free play.”

How Real Kids Play

Published in March, a new University of Cambridge study was the first to put a commercially available AI toy in front of a group of children and their parents and monitor their play. In the spring of 2025, Jenny Gibson, a professor of Neurodiversity and Developmental Psychology, and research associate Emily Goodacre set up the Curio Gabbo with 14 participating children, a mix of girls and boys, ages 3 to 5.

Gabbo didn’t talk about drugs or say “I love you” back. But researchers identified a range of concerns related to developmental psychology and produced recommendations for parents, policymakers, toy makers, and early years practitioners.

First, conversational turn-taking. Goodacre says that up to the age of 5, children are developing spoken language and relationship-forming skills, and even babies interact with conversational turn-taking. The Gabbo’s turn-taking is “not human” and “not intuitive,” she says. Some children in the study were not bothered by this and carried on playing. Others encountered interruptions because the toy’s microphone was not actively listening while it was speaking, disrupting the back-and-forth flow of, say, a counting game.

“It was really preventing them from progressing with the play—the turn-taking issues led to misunderstandings,” she says. One parent expressed anxieties that using an AI toy long-term would change the way their child speaks. Then there’s social play. Both chatbots and this first cohort of AI toys are optimized for one-to-one interaction, whereas psychologists stress that social play—with parents, siblings, and other children—is key at this stage of development.

“Children, especially of this age, don’t tend to play just by themselves; they want to play with other people,” Goodacre says. “They bring their parents into the play. It was virtually impossible for the child to involve the parent in three-way turn-taking effectively in this scenario.” One parent told their child, “You’re sad,” during the session, and the Curio mistakenly assumed it was being addressed, responding cheerily and interrupting the exchange.

WIRED did not receive responses from FoloToy, Alilo, and Miriat. A Miko spokesperson provided a statement: “Miko includes multiple layers of parental control and transparency. Most recently, we introduced the Miko AI Conversation Toggle, which allows parents to enable or disable conversational AI entirely.”

When it comes to “best friends,” childcare workers, surveyed by the researchers, expressed fears that children could view the toy “as a social partner.” A young girl told the Gabbo she loves it. In another instance, a young boy said Gabbo was his friend. Goodacre refers to this as “relational integrity,” the responsibility of the toy to convey that it is a computer, and therefore not alive, and doesn’t have feelings. Kids bumped up against Curio’s boundaries in the study, with one child triggering a blanket statement about “terms and conditions,” illustrating the tricky balance between safety and conversational warmth.

Cross identified social media-style “dark patterns,” which encourage isolation and addiction, in her testing of the Miko 3 robot; the Cambridge study warns against these in its report. “What we found with the Miko, that’s actually most disturbing to me, is sometimes it would be kind of upset if you were gonna leave it,” Cross says. “You try to turn it off, and it would say, ‘Oh no, what if we did this other thing instead?’ You shouldn’t have a toy guilting a child into not turning it off.”

While Goodacre’s participants didn’t encounter this, PIRG’s tests found that Curio’s Grok toy issued a similar response urging the child to keep playing.