Elon Musk’s company xAI must be held accountable for allowing its AI models to produce abusive sexual images of identifiable minors, three anonymous plaintiffs argued in a lawsuit filed Monday in California federal court.
The three plaintiffs seek to bring a class action suit representing anyone who had real images of themselves as minors altered into sexual content by Grok. They allege that xAI failed to take basic precautions used by other frontier labs to prevent their image models from generating pornography depicting real people and minors.
The case, Jane Doe 1, Jane Doe 2, a minor, and Jane Doe 3, a minor v. x.AI Corp. and x.AI LLC, was filed in the U.S. District Court for the Northern District of California.
Other deep-learning image generators employ various techniques to prevent the creation of child pornography from ordinary photographs. The lawsuit alleges that xAI did not adopt these standards.
Notably, if a model permits the generation of nude or erotic content from real images, it is virtually impossible to prevent it from producing sexual content involving children. Musk’s public promotion of Grok’s ability to produce sexual imagery and depict real people in skimpy outfits features heavily in the suit.
The company did not respond to a request for comment from TechCrunch.
One plaintiff, Jane Doe 1, had images from her high school homecoming and yearbook altered by Grok to depict her unclothed. An anonymous tipster who contacted her on Instagram told her the images were circulating online, and sent her a link to a Discord server featuring sexualized images of her and other minors she recognized from school.
A second plaintiff, Jane Doe 2, was informed by criminal investigators about altered, sexualized images of her created by a third-party mobile app that relies on Grok models. A third, Jane Doe 3, was also notified by criminal investigators, who discovered an altered, pornographic image of her on the phone of a suspect they had apprehended. Attorneys for the plaintiffs say that because third-party usage still requires xAI code and servers, the company should be held accountable.
All three plaintiffs, two of whom are still minors, say they are experiencing extreme distress over the circulation of these images and what it could mean for their reputations and social lives. They are seeking civil penalties under an array of laws intended to protect exploited children and prevent corporate negligence.