Teens sue Elon Musk’s xAI over Grok’s AI-generated CSAM

The plaintiffs include two minors and an adult who was underage when the events in the lawsuit took place. One of the victims, identified as “Jane Doe 1,” alleges that last December, she learned that explicit, AI-generated images of herself and at least 18 other minors were available on Discord. “At least five of these files, one video and four images, depicted her actual face and body in settings with which she was familiar, but morphed into sexually explicit poses,” the lawsuit claims.

The perpetrator, who has since been arrested, allegedly used Jane Doe 1’s AI-generated CSAM “as a bartering tool in Telegram group chats with hundreds of other users, trading her CSAM files for sexually explicit content of other minors.” The lawsuit claims the perpetrator generated the explicit images of Jane Doe 1 and the two other victims using Grok. It also alleges that xAI “failed to test the safety of the features it developed” and that Grok is “defective in design.”

Though X has tried making it harder for users to edit images with Grok, The Verge has found that it’s still possible to manipulate images uploaded to the platform. X has maintained that “anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.” X didn’t immediately respond to The Verge’s request for comment.

“These are children whose school photographs and family pictures were turned into child sexual abuse material by a billion-dollar company’s AI tool and then traded among predators,” one of the victims’ lawyers, Annika K. Martin of Lieff Cabraser, said in a statement. “We intend to hold xAI accountable for every child they harmed in this way.”

The lawsuit seeks damages for victims impacted by Grok’s “illegal images.” It also asks the court to bar xAI from generating and spreading AI-generated CSAM.
