Smart toys are cool. Using technologies such as voice recognition and artificial intelligence (AI), these devices can learn about their users and personalize play. But toys equipped with microphones, cameras, or sensors create all sorts of privacy and security issues.

Smart toys “usually gather a lot more data on children than parents will realize,” said R.J. Cross, director of the Don’t Sell My Data Campaign at the U.S. PIRG Education Fund. “This data collection is inherently risky.”

If a doll, robot, or action figure is connected to the internet, it may collect information that can be used by the manufacturer—or shared with other companies—to market to that child.

“Toys should just be toys,” Cross told Checkbook. “Kids view their toys as friends. They’re not thinking about the fact that there’s a company on the other end that’s doing the listening and the talking.”

In a new PIRG report, “Smart Decisions About Smart Toys,” Cross cautions parents about some of the risks that come with connected toys:

Data Collection

Smart toys can collect significant data about the children playing with them, including their location. Conversational toys that use artificial intelligence to interact with the child may solicit personal information, such as name, age, school, or birthday.  

One toy cited by the report is Fuzzible Friends, a line of internet-enabled plush animals. Fluff the bunny, Sparkles the unicorn, and Cubby the fox interact with children when connected via Bluetooth to Amazon’s Alexa smart speaker. The company that created the software that brings these furry characters to life states in its privacy policy that it receives “transcripts” of the child’s interactions and collects “any personal data” disclosed during conversations with the toy. And it gives this example: If the child says their age, that will appear in the transcript.

And what about other children or friends who may be nearby and are secretly recorded by that toy?

“It’s a huge privacy concern,” Cross said. “When the parent gives consent [for the recording], they’re not giving consent for every single person who could possibly interact and be in the room with this toy.”

Data Storage and Sharing

To “remember” what the child said for future conversations, the data collected by an interactive toy—including audio recordings, in some cases—must be stored on corporate servers somewhere. It might also be shared with other companies that process and store data. This information could be used by the toy company or third-party companies to market to the child.

Breaches

Any data that is collected and stored by these companies can be exposed in a data breach. It's already happened. A breach at toymaker VTech in 2015 exposed the names, birthdays, genders, and in some cases, photos, recordings, and chat logs of about 6.4 million children.

Hack Attacks

A connected toy with a camera or microphone is vulnerable to hackers who can use it to eavesdrop on kids and their families. In 2015, researchers demonstrated that a conversational doll, My Friend Cayla, had an unsecured Bluetooth connection that could be hacked to change the doll's responses. Someone with a Bluetooth-enabled phone could also connect to the doll and talk to the child.

It’s “surprising” how many connected toys use an “unsecure internet connection” and don’t require a password, Cross said. That makes it much easier for hackers to “use the toys as an eavesdropping device, or even as a microphone to talk to a child.”

In-App Purchases May Be Possible Without Parental Permission

Some smart toys have companion apps that must be downloaded to enable interactive features. These apps sometimes allow the child to make unsupervised purchases. This is common with tablet games where a lead character promotes the purchase of in-app extras.

Tips for Parents

Some toys have obvious dangers, such as small parts or sharp edges. With connected toys, the potential danger to your child’s privacy and safety may be hidden in a lengthy privacy policy that most parents don’t read.

“It sucks to read the terms and conditions and privacy policies. They’re long, they’re hard to parse, and they’re surprisingly vague sometimes about the key information you really want to know,” Cross said.

If a toy is connected to the internet, it’s important for parents to understand what data is collected, how it will be used, and if it will be shared or sold to other companies.

Make sure you understand the technology that enables the toy to interact with your child. Are there microphones, cameras, or sensors? Does the toy allow the child to make purchases without your permission? Chat functions are an obvious privacy and security risk.

The Children’s Online Privacy Protection Act (COPPA) prohibits companies from collecting personal information from children under the age of 13 without parental consent. But it’s possible, PIRG points out, that “giving consent” could be as simple as turning on the toy. It’s easy to misunderstand the full implications of giving consent for data collection during play, the report noted.

More info: PIRG has a tip sheet to help parents understand how to read a smart toy’s privacy policy.

Contributing editor Herb Weisbaum (“The ConsumerMan”) is an Emmy award-winning broadcaster and one of America's top consumer experts. He is also the consumer reporter for NW Newsradio in Seattle. You can also find him on Facebook, Twitter, and at ConsumerMan.com.