While smart toys can be useful educational tools for children, they also present privacy risks and can intrude on what is traditionally considered a private space. The thought of your child's toy listening in on your family around the clock is unsettling. So how do we balance these risks against the benefits?

Smart toys built with artificial intelligence (AI) capabilities can collect different forms of data from children. For example, an AI-enabled toy may collect data on how fast your child constructs a shape on the device so that it can personalize lessons, or a doll may learn your child's favorite color or song so that it can "converse" with your child during playtime.

The concerns raised by AI toys vary with the type of toy and its data-collection capabilities. Generally, these AI-enabled toys learn from children and provide adaptive, responsive play. Within this category there are two subcategories: smart companions (toys that "learn" from their interactions with the child) and programmable toys (toys designed with machine learning to support educational learning by moving and performing tasks). The Children's Online Privacy Protection Act (COPPA) protects children's privacy and data collected from minors on the internet and through mobile applications by requiring prior express written consent from a parent or guardian. According to the Federal Trade Commission (FTC), however, new smart devices hitting the market are not necessarily complying with COPPA.

Alan Butler of the Electronic Privacy Information Center (EPIC) said, "For any new device coming onto the market, if it's not complying with COPPA, then it's breaking the law. There's a lot of toys on the market [using AI] and there's a need to ensure that they're all complying with COPPA." One of the problems is that there is no pre-clearance review of toys before they are sold to consumers. While the FTC continues to enforce COPPA, as it historically has, it is difficult for the agency to stay ahead of privacy issues when the toys are manufactured outside of the U.S. With a pre-clearance process in place, issues like invasion of privacy and the collection of data from children without consent could be addressed before a toy ends up in a child's playroom.

Whether we like it or not, smart toys and their AI capabilities will only continue to grow. AI can indeed be helpful and effective in aiding children's learning and experiences. However, we may need to examine this trend now, along with the legislation governing these smart toys, to stay ahead of the significant issues that could arise if this space is not adequately regulated and monitored.

Kathryn Rattigan


Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy+ Cybersecurity Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advising clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches while providing insight into federal regulations and requirements for notification and an assessment under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.