Elon Musk's xAI faces child porn lawsuit from minors Grok allegedly undressed | TechCrunch


Elon Musk’s company xAI should be held accountable for allowing its AI models to produce abusive sexual images of identifiable minors, three anonymous plaintiffs argued in a lawsuit filed Monday in California federal court.

The three plaintiffs want to bring a class action suit representing anyone who had real images of them as minors altered into sexual content by Grok. They allege that xAI did not take basic precautions used by other frontier labs to prevent their image models from producing pornography depicting real people and minors.

The case, Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor v. X.AI Corp. and X.AI LLC, was filed in the U.S. District Court for the Northern District of California.

Other deep-learning image generators employ various techniques to prevent the creation of child pornography from normal photographs. The lawsuit alleges that these standards were not adopted by xAI.

Notably, if a model allows the generation of nude or erotic content from real images, it is virtually impossible to prevent it from generating sexual content featuring children. Musk’s public promotion of Grok’s ability to produce sexual imagery and depict real people in skimpy outfits features heavily in the suit.

The company did not respond to a request for comment from TechCrunch.

One plaintiff, Jane Doe 1, had pictures from her high school homecoming and yearbook altered by Grok to depict her unclothed. An anonymous tipster who contacted her on Instagram told her that the photos were circulating online, and sent her a link to a Discord server featuring sexualized images of her and other minors she recognized from school.

A second plaintiff, Jane Doe 2, was informed by criminal investigators about altered, sexualized images of her created by a third-party mobile app that relies on Grok models. A third, Jane Doe 3, was also notified by criminal investigators who discovered an altered, pornographic image of her on the phone of a subject they had apprehended. Attorneys for the plaintiffs say that because third-party usage still requires xAI code and servers, the company should be held responsible.

All three plaintiffs, two of whom are still minors, say they are experiencing extreme distress over the circulation of these images and what it could mean for their reputations and social lives. They are seeking civil penalties under an array of laws intended to protect exploited children and prevent corporate negligence.
