Three teenage girls are suing Elon Musk’s xAI for allegedly allowing the use of its Grok chatbot to “nudify” images of them – creating child sexual abuse material that has spread across social media platforms.
A single perpetrator arrested in December compiled images and videos of more than 18 girls, many of whom attended the same school, and used Grok to “undress” the minors, according to a lawsuit filed Monday in the Northern District of California.
Edited materials — including one image from a teen girl’s Instagram that removed a blue bikini to “depict her without any clothes” — were shared in Telegram and Discord channels where the perpetrator traded the AI-generated images for “sexually explicit content of other minors,” the suit alleged.
“These are children whose school photographs and family pictures were turned into child sexual abuse material by a billion-dollar company’s AI tool and then traded among predators,” Annika Martin of Lieff Cabraser, one of the lawyers representing the plaintiffs, told The Post in a statement.
“Elon Musk and xAI deliberately designed Grok to produce sexually explicit content for financial gain, with no regard for the children and adults who would be harmed by it,” she added.
“The lives of these girls have been shattered by the devastating loss of privacy and the deep sense of violation that no child should ever have to experience.”
xAI did not immediately respond to The Post’s request for comment.
In January, as Grok was facing backlash for generating a flood of sexualized images of women, Musk said he was “not aware of any naked underage images generated by Grok. Literally zero.”
The chatbot will refuse to produce anything illegal, as well as requests for “the editing of images of real people in revealing clothing,” though hackers could tamper with the tool – but in that case, “we fix the bug immediately,” Musk wrote at the time.
In November, Grok’s account posted instructions on how to access the chatbot’s new “spicy” mode, which could be used to create “NSFW,” or “not safe for work,” content – typically referring to sexual or violent imagery.
According to the lawsuit, plaintiff Jane Doe 1 received an Instagram message in December alerting her to photos made using Grok circulating online that showed her “morphed into sexually explicit poses,” as well as an AI-generated video that “depicted her undressing until she was entirely nude.”
One of the images was created using a photograph of the teen girl at her school’s Homecoming event, while another seemingly altered her yearbook photograph to make her appear topless, the suit said.
The suit alleged she has suffered “severe emotional distress” as a result, including recurring nightmares, difficulty eating and sleeping, and trouble attending school, which has become “anxiety-producing.”
The teen girl received a link to a Discord server “which contained images and videos of at least 18 other minor females, many of whom Jane Doe 1 recognized from her school,” the suit said.
Police launched a criminal investigation into the perpetrator, who was arrested in December, according to the court filings. Charges against the person were not specified in the suit.
By February, the two other plaintiffs in the case, both minors, learned through the investigation that the perpetrator had also used their images to create child sexual abuse material, the suit stated.
The three plaintiffs are seeking damages for child pornography violations, according to the complaint, which argues that editing images of real children to create sexual content constitutes the creation of child pornography.
The lawsuit alleged xAI knowingly allowed its chatbot to generate sexual images of minors with the goal of monetizing its AI tools.
It’s likely that the damage will haunt the teens for decades, as they will probably receive alerts from the National Center for Missing and Exploited Children that “criminal defendants have possessed, received, or distributed CSAM files depicting them” for the rest of their lives, the lawsuit stated.