Artificial intelligence (AI) tools are moving from browser chat windows into physical children’s toys. Plush toys, which used to be a source of symbolic play, have become conversational partners that can respond to children’s emotional needs and answer their questions. But what risks do these modern toys bring into children’s bedrooms?
AI toys are physical objects equipped with built-in hardware that allows them to access AI tools via the internet. Interaction takes place through a microphone and speaker (sometimes also a screen), and their responses are generated through an online connection. Toys that converse with children use methods designed to create an emotional bond. Examples of how AI toys achieve this include:
- recognizing the child’s voice and speech and using their name,
- being available whenever the child wants to communicate,
- analyzing the child’s responses and adjusting their replies accordingly,
- learning from past interactions and giving the impression that they know the child,
- adapting content to the child’s age or level of knowledge,
- communicating in a more natural, “conversational” way.
This type of interaction can lead to emotional reliance, where children meet their needs primarily through the toy rather than through their social environment. Because AI robots are designed to agree with users and to be available at all times, human interactions may begin to seem needlessly difficult by comparison. Interaction with AI tools can therefore give children unrealistic expectations about relationships with people. There are several risks associated with the use of AI toys.
Distinguishing Between People and AI Robots
Children under the age of five typically cannot distinguish between interacting with a person and interacting with an AI robot. As a result, they may perceive the AI robot as a living being with emotions. Learning interaction and communication skills is crucial at this developmental stage, and the inclusion of an AI robot may cause confusion in forming an internal understanding of relationships: how they are built and what is real.
AI Cannot Replace Relationships
Relationships are a complex combination of verbal and nonverbal communication, as well as the context and emotions that develop within them. To build quality relationships, we need eye contact, hugs, scent, facial expressions, touch, gestures, and more. Communication that consists only of talking to an inanimate object does not teach a child the complexity of navigating relationships with others. Real relationships also involve conflicts, disagreements, and discomfort, all of which are essential for developing problem-solving skills, critical thinking, perseverance, and emotional regulation.

Safety and Content Risks
AI toys are connected to the internet and record children’s responses. To provide answers the child will like most, they also build a profile of the child in the background: their age, preferences, place of residence, family structure, and more. Some toys even have cameras. Companies store this data, making it vulnerable to breaches. One major AI toy company has already experienced such a breach, in which an attacker gained access to all conversations with children, to personal data such as names, birthdays, and locations, and even to the ability to insert their own responses to children’s questions.
Companies also provide weak content safeguards. With minimal effort, children can quickly access sexual content, or even content related to self-harm, through AI toys. The filters and protective mechanisms intended to prevent such conversations often fail.
Technical Aspects and Payment
AI toys are also subject to technical malfunctions. A toy that has known a child’s name and interests for months may suddenly “forget” everything due to a software bug. A child who is emotionally attached to such a toy may experience this glitch as the loss of a dear friend.
Additionally, some AI toys operate on a monthly subscription model. Once a child becomes attached to the toy, the family may feel compelled to keep paying the subscription; otherwise, the child loses their “friend.” Research by the website Common Sense Media has shown that 9% of teenagers already use AI robots as friends or best friends. Another 8% use them as romantic partners or for flirting.
Introducing AI toys into today’s unregulated market of AI conversational robots, with still unknown long-term consequences, carries significant risk. We view introducing them the same way as introducing a smartphone or any other digital device: not recommended for children under 13 years of age, and even after that requiring parental involvement and ongoing conversations about their use. AI conversational robots are embedded not only in toys but also in social media and everyday applications.
Warning Signs That the Use of an AI Toy Has Become Risky
- The AI toy is perceived as the child’s best friend.
- The child has abandoned previous interests and toys in favor of the AI toy.
- The child ignores limits that were set regarding the use of the AI toy.
- The child uses the AI toy as their primary source of comfort.
- The child expects the same level of agreement from others as from the toy.
- The child shows significant distress when separated from the toy.
Sources
https://josephthacker.com/hacking/2026/01/29/bondu-smart-toy-vulnerability.html
https://www.commonsensemedia.org/ai-ratings/ai-toys
https://www.commonsensemedia.org/sites/default/files/research/report/talk-trust-and-trade-offs_2025_web.pdf
