
FoloToy has resumed sales of its AI toy bear Kumma after it was found to give disturbing and sexual advice. (PHOTO: Screengrab from Folotoy.com)
Children’s toymaker FoloToy, a Singapore-based company, has resumed sales of its AI-powered teddy bear “Kumma” after pulling the product amid widespread concerns over the unsafe and inappropriate responses it could generate. A safety group had found that the toy could provide potentially dangerous information, such as how to find and light matches and where to locate pills, and that it even discussed various sexual fetishes.
In a statement posted on Monday, the company said: “After a full week of rigorous review, testing, and reinforcement of our safety modules, we have begun gradually restoring product sales.” FoloToy added that with rising global scrutiny over AI toy safety, “transparency, responsibility, and continuous improvement are essential”, emphasising its commitment to building “safe, age-appropriate AI companions for children and families worldwide”.
The controversy began earlier this month when researchers from the Public Interest Research Group (PIRG) Education Fund released a report evaluating three AI-powered toys, including Kumma. Their findings showed that the toys produced a variety of concerning replies, such as discussing religion, glorifying death in Norse mythology, and telling children where to access hazardous items like plastic bags and matches.
Of all the toys tested, Kumma was deemed the most troubling. When using the Mistral AI model, it reportedly explained how to procure knives, pills, and matches, even offering step-by-step instructions on how to light a match.