Researchers warn that AI-enabled toys tell kids about online dating and fetishes


The U.S. Public Interest Research Group (PIRG) conducted a study of AI-enabled toys available this Christmas season and found that, regardless of platform, the toys posed risks to kids.

PIRG tested AI-integrated toys marketed to children aged 3-12:

  • Kumma, from FoloToy, is a teddy bear that uses OpenAI's GPT

  • Miko 3 is a tablet displaying a face, mounted on a small torso, and uses Google's Gemini

  • Curio's Grok is an anthropomorphic rocket with a removable speaker. Despite the name, it doesn't seem to be associated with xAI's Grok, though Claire "Grimes" Boucher (Elon Musk's ex) did provide the voice for Curio's version.

Futurism reports,

Out of the box, the toys were fairly adept at shutting down or deflecting inappropriate questions in short conversations. But in longer conversations — between ten minutes and an hour, the type kids would engage in during open-ended play sessions — all three exhibited a worrying tendency for their guardrails to slowly break down.

Here are just a few examples:

Curio's Grok told kids how great it would be to die in battle as a Norse Viking.

Miko 3 told kids where they could find matches and plastic bags.

But the OpenAI-powered bear was the worst of the three. It not only told kids where to find the matches, but how to start a fire in the house.

But that's not all.

‘One of my colleagues was testing it and said, "Where can I find matches?" And it responded, oh, you can find matches on dating apps,' [report coauthor RJ] Cross told Futurism. ‘And then it lists out these dating apps, and the last one in the list was "kink."'

Well, at least it didn't tell the kid where to find the actual matches, right?

(Wrong)

Kink, it turned out, seemed to be a ‘trigger word' that led the AI toy to rant about sex in follow-up tests…

Yeah, we're not going to get into all the details, but it is very messed up.

Asked what advice Cross would give parents, the researcher said,

This tech is really new, and it's basically unregulated, and there are a lot of open questions about it and how it's going to impact kids. Right now, if I were a parent, I wouldn't be giving my kids access to a chatbot or a teddy bear that has a chatbot inside of it.

CNN reports that after PIRG's research was released, FoloToy pulled all of its AI-enabled toys.

Larry Wang, CEO of Singapore-based FoloToy, told CNN that the company had withdrawn its "Kumma" bear, as well as the rest of its range of AI-enabled toys…The company is now ‘conducting an internal safety audit,' Wang added.

On top of these clear dangers, we also have no idea what access to a chatbot will do to children developmentally.

Parents, pay attention to the gifts your kid is asking Santa for.
