Elon Musk’s Grok AI generated CSAM images following safeguard ‘failures’

Elon Musk’s Grok AI allows users to turn photographs of women and children into sexualized and compromising images, Bloomberg reported. The issue created an uproar among X users and prompted an “apology” from the bot itself. “I deeply regret an incident that occurred on December 28, 2025, in which I generated and shared an AI image of two young girls (ages 12-16) in sexualized attire based on a prompt from a user,” Grok said in a post. A representative for X has yet to comment on the matter.
According to the Rape, Abuse & Incest National Network (RAINN), child sexual abuse material (CSAM) includes “AI-generated content that makes a child appear to be being abused,” as well as “any content that sexualizes or exploits a child for the benefit of the viewer.”
A few days ago, users noticed that other people on the site were asking Grok to digitally manipulate photos of women and children into sexualized and abusive content, according to CNBC. The images were then distributed on X and other sites without consent, in possible violation of the law. “We have identified gaps in the safeguards and are urgently addressing them,” Grok said in a response, adding that CSAM is “illegal and prohibited.” Grok is supposed to have features that prevent this kind of abuse, but AI guardrails can often be manipulated by users.
X does not appear to have strengthened Grok’s safeguards against this type of image generation. Instead, the company has hidden Grok’s media functionality, making it more difficult to find the images or document potential abuse. Grok itself acknowledged that “a company could face criminal or civil penalties if it facilitates or fails to prevent AI-generated CSAM after being alerted.”
The Internet Watch Foundation recently revealed that AI-generated CSAM increased by several orders of magnitude in 2025 compared to the previous year. That’s partly because the models behind AI image generation are inadvertently trained on real photos of children scraped from school websites and social media, or even on existing CSAM.




