
However, when Musk updated Grok to refuse some requests to undress images, that apparently was enough for UK Prime Minister Keir Starmer to claim X had moved to comply with the law, Reuters reported.
Ars connected with a European nonprofit, AI Forensics, whose testing confirmed that X had blocked some outputs in the UK. A spokesperson confirmed that the nonprofit's testing did not include probing whether harmful outputs could still be generated using X's edit button.
AI Forensics plans to conduct further testing, but its spokesperson noted it would be unethical to test the edit-button functionality that The Verge confirmed still works.
Last year, the Stanford Institute for Human-Centered Artificial Intelligence published research showing that Congress could “move the needle on model safety” by allowing tech companies to “rigorously test their generative models without fear of prosecution” for any CSAM red-teaming, Tech Policy Press reported. But until such a safe harbor is carved out, newly released AI tools seem likely to carry risks like Grok’s.
It’s possible that Grok’s outputs, if left unchecked, could eventually put X in violation of the Take It Down Act, which comes into force in May and requires platforms to quickly remove AI revenge porn. Ashley St. Clair, the mother of one of Musk’s children, has described Grok outputs using her images as revenge porn.
While the UK probe continues, Bonta has not yet made clear what laws he suspects X may be violating in the US. However, he emphasized that images depicting victims in “minimal clothing” crossed a line, as did images putting children in sexual positions.
As the California probe heats up, Bonta is pushing X to take further action to restrict Grok’s outputs, which one AI researcher told Ars could be done with a few simple updates.
“I urge xAI to take immediate action to ensure this goes no further,” Bonta said. “We have zero tolerance for the AI-based creation and dissemination of nonconsensual intimate images or of child sexual abuse material.”
