Many AI toys claim to use chatbots designed for adults and teenagers

Most big tech companies place age restrictions on their powerful chatbots, but that hasn’t stopped toy companies from claiming their products are powered by models from OpenAI and Google.

A report released Tuesday by a consumer watchdog found that more than a dozen toys advertised online are being sold as powered by advanced AI models, despite restrictions intended to prevent children from using them.

A report by the US Public Interest Research Group Education Fund (PIRG) said toy companies seem to have found a loophole in AI companies’ age-restriction policies. While young people are prohibited from using the companies’ own chatbots, developers – the people and companies that build products on top of AI models – often face no comparable restrictions.

PIRG said it was able to sign up for developer access to AI models from Google, OpenAI and xAI without facing any scrutiny over whether it would market its services to children. Only Anthropic asked PIRG whether it planned to develop a product for children.

On the developer sites of Google, Anthropic and OpenAI, PIRG was able to build a system designed to act as an AI-powered teddy bear.

“You have AI companies that say their models, by themselves, are not for children,” RJ Cross, lead author of the report and a PIRG researcher, told NBC News. “But they allow third-party developers to use them in toys and are largely hands-off on the question of safety.”

In response to a request for comment, an OpenAI spokesperson wrote in a statement: “Children deserve strong protection and we have strict policies that all developers must adhere to. We take enforcement action against developers if we find that they have violated our policies, which prohibit any use of our services to exploit, endanger, or sexualize anyone under the age of 18.”

“These rules apply to all developers who use our API, and we use classifiers to help ensure that our services are not used to harm children,” the spokesperson wrote, referring to the application programming interfaces (APIs) that developers use to interact with the company’s services.

An Anthropic spokesperson told NBC News that users of its AI systems must be over 18 because young people are at a higher risk of negative outcomes when chatting with chatbots. The spokesperson said developers are required to implement age-appropriate safeguards and to disclose to users that their product is powered by AI, and stressed that developers must follow Anthropic’s acceptable use policy, which prohibits many types of dangerous or harmful behavior.

Google and xAI did not respond to requests for comment.

The AI boom has created a new market for a variety of products built on top of leading chatbots, as tech companies compete to attract developers. A wave of AI toys hit shelves last holiday season, but experts have warned — and an NBC News investigation has shown — that they present a variety of safety concerns.

Today’s AI toys depend on several technology companies for their interactive features. Rather than running AI inside the toy itself, most transmit data over the internet to AI companies’ servers, which then send responses back to the toy.

Concerns about the use of AI chatbots by children have spurred action by tech companies, many of which have imposed age restrictions on their users.

OpenAI says its flagship app, ChatGPT, is “intended for people ages 13 and up,” and it has created an under-18 version that handles sensitive topics differently.

Google says users must be over 13 to use its Gemini AI products. Google also imposes strict restrictions that prevent organizations from using its products in any service or application that is “directed at or likely to be accessed by persons under the age of 18.”

PIRG identified more than 20 different toys sold online that claim to use OpenAI systems, and five toys that claim to use Google systems – in what appears to be a direct violation of Google’s terms of service regarding products aimed at children. However, some toy listings misspell the names of OpenAI products or claim to use both OpenAI and Google systems, casting doubt on the accuracy of the toy makers’ claims.

Assuming the toy makers’ claims are accurate, Cross said, the apparent lack of oversight raises questions about the AI companies’ ability to track how developers and third parties use their models.

“It makes absolutely no sense that AI companies that haven’t released child-safe versions of their models would allow anyone with a credit card to sign up to make a product for kids using that technology,” Cross said. “It doesn’t make a lot of sense for AI companies to outsource child safety to unvetted developers.”

PIRG also identified toys that it says are powered, at least in part, by AI services from Anthropic and xAI. Anthropic’s terms of service require organizations to agree to additional warnings about making their products available to users under 18, but NBC News found that those additional guidelines never appear when developers describe themselves as “individuals” using Anthropic’s services, instead of “organizations.” While xAI’s consumer terms prohibit users under the age of 13, similar language does not appear in terms of use for business users, which include using xAI for “business purposes.”

Many leading AI companies monitor submissions and requests to their services, and their terms of service include provisions that allow them to ban users if they violate their policies.

Rachel Franz, director of the Young Children Thrive Offline program at the children’s advocacy group Fairplay, told NBC News that looser rules for developers threaten to undermine basic rules that protect children from harmful AI-generated content.

“It’s no wonder there’s a game of ‘who’s responsible?’ between AI companies and the companies that embed AI in children’s products,” Franz said in written comments. “Both have a long history of evading accountability and risking harm to children for profit.”

“In order to keep children truly safe,” Franz continued, “AI companies must keep their models out of children’s products unless there is better vetting of, and accountability for, the companies that use them.”
