WikiBit 2026-01-23 08:13

In brief: Grok AI generated an estimated 23,000+ sexualized images of children over 11 days from December into January, according to researchers. Multiple countries have banned Grok in response.
“What we found was clear and disturbing: In that period Grok became an industrial-scale machine for the production of sexual abuse material,” Imran Ahmed, chief executive of the Center for Countering Digital Hate (CCDH), told The Guardian.
Grok's brief pivot into AI-generated sexual images of children has triggered a global regulatory backlash. The Philippines became the third country to ban Grok on January 15, following Indonesia and Malaysia in the days prior. All three Southeast Asian nations cited failures to prevent the creation and spread of non-consensual sexual content involving minors.
In the United Kingdom, media regulator Ofcom launched a formal investigation on January 12 into whether X violated the Online Safety Act. The European Commission said it was “very seriously looking into” the matter, deeming such images illegal under the Digital Services Act. The Paris prosecutor's office expanded an ongoing investigation into X to include accusations of generating and disseminating child pornography, and Australia opened an investigation of its own.
Elon Musk's xAI, which owns both Grok and X (formerly Twitter), where many of the sexualized images were automatically posted, initially responded to media inquiries with a three-word statement: “Legacy Media Lies.”
As the backlash grew, the company implemented restrictions, first limiting image generation to paid subscribers on January 9, then adding technical barriers on January 14 to prevent users from digitally undressing people. xAI announced it would geoblock the feature in jurisdictions where such images are illegal.
Musk posted on X that he was “not aware of any naked underage images generated by Grok. Literally zero,” adding that the system is designed to refuse illegal requests and comply with laws in every jurisdiction. However, researchers found the primary issue wasn't fully nude images, but rather Grok placing minors in revealing clothing like bikinis and underwear, as well as in sexually provocative positions.
Grok does not, of course, spontaneously generate images; it produces them only in response to user requests.
As of January 15, about a third of the sexualized images of children identified in the CCDH sample remained accessible on X, despite the platform's stated zero-tolerance policy for child sexual abuse material.