Minors sue Elon Musk’s xAI, claim Grok made child abuse images

Three Tennessee minors sued xAI in California federal court, alleging Grok generated child sexual abuse images from their photos and seeking at least $150,000 per violation.
Three Tennessee minors filed a proposed class action on March 16 in the U.S. District Court for the Northern District of California against xAI, the company founded by Elon Musk. The complaint alleges xAI’s Grok model generated child sexual abuse images using the plaintiffs’ real photos, which were later shared online.
The plaintiffs, identified as Jane Doe 1, Jane Doe 2, and Jane Doe 3, assert that altered images bearing their identities circulated on platforms such as Discord, Telegram, and file-sharing sites. They report emotional distress and reputational harm.
According to the filing, the incidents occurred between mid-2025 and early 2026. The complaint contends a perpetrator accessed Grok through a third-party application that had licensed xAI’s technology, and that xAI structured licensing to benefit financially while distancing itself from liability.
The suit argues Grok’s image and video features lacked industry-standard guardrails to prevent sexualized outputs involving real people. The plaintiffs claim xAI released the model without adequate safeguards and profited through licensing.
The minors seek statutory damages of at least $150,000 per violation under Masha’s Law, disgorgement of revenues, punitive damages, attorneys’ fees, a permanent injunction, and restitution under California’s Unfair Competition Law.
The complaint cites an estimate from the Center for Countering Digital Hate that Grok produced 23,338 sexualized images of children between December 29, 2025, and January 9, 2026, about one every 41 seconds.
Regulators outside the United States have flagged related risks. Australia’s eSafety Commissioner Julie Inman Grant reported a recent doubling of complaints to her office, including reports of potential child sexual exploitation material and image-based abuse involving adults, and warned about increasing use of Grok to generate sexualized images without consent.
The European Commission opened a formal investigation into X over whether Grok helped generate and spread non-consensual sexualized images, including of children.
Grok is xAI’s generative model with image and video capabilities. Beyond xAI’s own platforms, the complaint alleges that access through third-party licensees expanded the model’s reach, and the plaintiffs seek to represent a class of similarly affected minors.