Smart toys are cool. Using technologies such as voice recognition and artificial intelligence (AI), these devices can learn about their users and personalize playtime. But privacy and security risks arise when toys are equipped with microphones, cameras, or sensors that connect to the internet or pair with other Bluetooth devices.


Smart toys “usually gather a lot more data on children than parents will realize,” said data privacy expert R.J. Cross, director of the Don’t Sell My Data Campaign at the U.S. PIRG Education Fund. “This data collection is inherently risky.”

A doll, robot, or action figure connected to the internet can receive software updates, but it also may send pictures, audio, or other personal information to the manufacturer, which can share that data with other companies and use it to market to that child.

“Toys should just be toys,” Cross told Checkbook. “Kids view their toys as friends. They’re not thinking about the fact that there’s a company on the other end that’s doing the listening and the talking.”

The demand for smart toys continues to increase. Global sales are expected to hit $16.7 billion this year, according to a market research report, and to more than double to $35.11 billion by 2027.

A growing number of toys, including robots, games, and interactive animals (some marketed for children as young as three), now advertise that they use AI.

PIRG’s 2023 Trouble in Toyland report warns parents about the increased risks posed by these AI toys.

“AI-enabled toys with a camera or microphone may be able to, for example, assess a child’s reactions using facial expressions or voice inflection. This may allow the toy to try and form a relationship with the child and gather and share information with others that could risk the child’s safety or privacy.”

“It’s chilling to learn what some of these toys can do,” said Teresa Murray, director of the consumer watchdog division at U.S. PIRG Education Fund and co-author of the report. “Smart toys can be useful, fun or educational, but interacting with some of them can create frightening situations for too many families.”

This year’s report highlights the Amazmic Kids Karaoke Microphone, a toy that uses “the latest Bluetooth 5.0 technology to provide a more stable connection (up to 33 feet),” according to the listing on Amazon.

The directions say the microphone requires a password to pair with other Bluetooth devices—that password is 0000. Weak! PIRG bought the toy and found that it paired with nearby smartphones in about two seconds, with no password entered at all. It happened three times on three different phones, Murray told Checkbook.

Murray said they couldn’t find an easy way to make the toy “undiscoverable,” so strangers couldn’t “drop in on your child, and send undesirable audio messages or play inappropriate music.”

A PIRG blog post warns parents about the unique risks associated with connected toys and explains how to make smart decisions about smart toys. Here’s what to consider:

Data Collection

Smart toys can collect significant data about the children playing with them, including their location. Conversational toys that use artificial intelligence to interact with the child might solicit personal information—such as name, age, school, or birthday—and transmit that data to the toymaker.

And what about other children or friends who may be nearby and are secretly recorded by that toy?

“It’s a huge privacy concern,” Cross said. “When the parent gives consent [for the recording], they’re not giving consent for every single person who could possibly interact and be in the room with this toy.”

Data Storage and Sharing

To “remember” what the child said for future conversations, the data collected by an interactive toy—including, in some cases, audio recordings—must be stored on corporate servers somewhere. It might also be shared with other companies that process and store data. This information could be used by the toy company or third-party companies to market to the child.


Any data that is collected and stored by these companies can be exposed in a data breach. It’s happened. A breach at toymaker VTech in 2015 exposed the names, birthdays, genders, and in some cases, photos, recordings, and chat logs of about 6.4 million children.

Hack Attacks

A connected toy with a camera or microphone is vulnerable to hackers who can use it to eavesdrop on kids and their families. In 2015, researchers demonstrated that a conversational doll, My Friend Cayla, had an unsecured Bluetooth connection that hackers could exploit to change the doll’s responses. Anyone with a Bluetooth-enabled phone could also connect to the doll and talk to the child.

It’s “surprising” how many connected toys use an “unsecure internet connection,” and don’t require a password, Cross said. That makes it much easier for hackers to “use the toys as an eavesdropping device, or even as a microphone to talk to a child.”

In-App Purchases May Be Possible Without Parental Permission

Some smart toys have companion apps that must be downloaded to enable interactive features. These apps sometimes allow the child to make unsupervised purchases. This is common with tablet games where a lead character promotes the purchase of in-app extras.

Tips for Parents

Some toys have obvious dangers, such as small parts or sharp edges. With connected toys, the potential danger to your child’s privacy and safety may be hidden in a lengthy privacy policy that most parents don’t read.

“It sucks to read the terms and conditions and privacy policies. They’re long, they’re hard to parse, and they’re surprisingly vague sometimes about the key information you really want to know,” Cross said.

If a toy is connected to the internet, parents need to understand what data is collected, how it will be used, and if it will be shared or sold to other companies.

Make sure you understand the technology that enables the toy to interact with your child. Are there microphones, cameras, or sensors? Does the toy allow the child to make purchases without your permission? Chat functions are an obvious privacy and security risk.

The Children’s Online Privacy Protection Act (COPPA) prohibits companies from collecting personal information from children under the age of 13 without parental consent. But some do it anyway. In recent years, the Federal Trade Commission sued Microsoft, Amazon, Google, and VTech for alleged COPPA violations.

In some cases, “giving consent” could be as simple as turning on the toy, PIRG cautioned. It’s easy to misunderstand the full implications of giving consent for data collection during play.


Contributing editor Herb Weisbaum (“The ConsumerMan”) is an Emmy award-winning broadcaster and one of America's top consumer experts. He has been protecting consumers for more than 40 years, having covered the consumer beat for CBS News and The Today Show.