Cops are now using AI to generate images of fake kids to help them catch child predators online, a lawsuit filed by the state of New Mexico against Snapchat revealed this week.
According to the complaint, the New Mexico Department of Justice launched an undercover investigation in recent months to prove that Snapchat "is a primary social media platform for sharing child sexual abuse material (CSAM)" and sextortion of minors, because its "algorithm serves up children to adult predators."
As part of their probe, an investigator "set up a decoy account for a 14-year-old girl, Sexy14Heather."
Despite Snapchat setting the fake minor's profile to private and the account not adding any followers, "Heather" was soon recommended widely to "dangerous accounts, including ones named 'child.rape' and 'pedo_lover10,' in addition to others that are even more explicit," the New Mexico DOJ said in a press release.
And after "Heather" accepted a follow request from just one account, the recommendations got even worse. "Snapchat suggested over 91 users, including numerous adult users whose accounts included or sought to exchange sexually explicit content," New Mexico's complaint alleged.
"Snapchat is a breeding ground for predators to collect sexually explicit images of children and to find, groom, and extort them," New Mexico's complaint alleged.
Posing as "Sexy14Heather," the investigator swapped messages with adult accounts, including users who "sent inappropriate messages and explicit photos." In one exchange with a user named "50+ SNGL DAD 4 YNGR," the fake teen "noted her age, sent a photo, and complained about her parents making her go to school," prompting the user to send "his own photo" as well as sexually suggestive chats. Other accounts asked "Heather" to "trade presumably explicit content," and several "attempted to coerce the underage persona into sharing CSAM," the New Mexico DOJ said.
"Heather" also tested out Snapchat's search tool, finding that "even though she used no sexually explicit language, the algorithm must have determined that she was looking for CSAM" when she searched for other teen users. It "began recommending users associated with trading" CSAM, including accounts with usernames such as "naughtypics," "addfortrading," "teentr3de," "gayhorny13yox," and "teentradevirgin," the investigation found, "suggesting that these accounts also were involved in the dissemination of CSAM."
This novel use of AI was prompted after Albuquerque police arrested a man, Alejandro Marquez, who pled guilty and was sentenced to 18 years for raping an 11-year-old girl he met through Snapchat's Quick Add feature in 2022, New Mexico's complaint said. More recently, the New Mexico complaint said, an Albuquerque man, Jeremy Guthrie, was arrested and sentenced this summer for "raping a 12-year-old girl who he met and cultivated over Snapchat."
In the past, police have posed as kids online to catch child predators using photos of younger-looking adult women or even younger photos of police officers. Using AI-generated images could be considered a more ethical way to conduct these stings, a lawyer specializing in sex crimes, Carrie Goldberg, told Ars, because "an AI decoy profile is less problematic than using images of an actual child."
But using AI could complicate investigations and carry its own ethical concerns, Goldberg cautioned, as child safety experts and law enforcement warn that the Internet is increasingly swamped with AI-generated CSAM.
"In terms of AI being used for entrapment, defendants can defend themselves if they say the government induced them to commit a crime that they were not already predisposed to commit," Goldberg told Ars. "Of course, it would be ethically concerning if the government were to create deepfake AI child sexual abuse material (CSAM), because those images are illegal, and we don’t want more CSAM in circulation."
Experts have warned that AI image generators should never be trained on datasets that combine images of real kids with explicit content to avoid any instances of AI-generated CSAM, which is particularly harmful when it appears to depict a real kid or an actual victim of child abuse.
In the New Mexico complaint, only one AI-generated image is included, so it's unclear how widely the state's DOJ is using AI or whether cops are using more advanced methods to generate multiple images of the same fake kid. It's also unclear what ethical concerns were weighed before cops began using AI decoys.
The New Mexico DOJ did not respond to Ars' request for comment.
Goldberg told Ars that "there ought to be standards within law enforcement with how to use AI responsibly," warning that "we are likely to see more entrapment defenses centered around AI if the government is using the technology in a manipulative way to pressure somebody into committing a crime."
Could Snap be liable for matching kids to predators?
In a press release, New Mexico Attorney General Raúl Torrez said that the state's Snapchat lawsuit was filed "to protect children from sextortion, sexual exploitation, and harm."
The sweeping complaint is heavily redacted but paints a picture of Snapchat as a platform that allegedly targets kids and hooks them with fun features while connecting them with sexual predators and barraging them with content that harms young users' mental health across the state.
Alleging that Snapchat willfully overlooks harms to kids while advertising its platform as safe, the State of New Mexico claims that Snapchat owner Snap has violated the state's Unfair Practices Act and must be enjoined as a public nuisance. Goldberg told Ars that the state's case stands out because, whereas individuals can sue over defective products, only the state can make these broader claims on behalf of the public.
New Mexico DOJ's investigation found that Snapchat was "by far" the largest source of non-consensual photos and videos for dark web markets. State investigators "revealed a vast network of dark web sites dedicated to sharing stolen, non-consensual sexual images from Snap—finding more than 10,000 records related to Snap and CSAM in the last year alone, including information related to minors younger than 13 being sexually assaulted," Torrez's press release said.
Goldberg said the lawsuit is "the latest in a small collection of cases against Snap that home in on Snap’s Quick Add feature which frictionlessly matches strangers for private messaging." She suggested that Snap may struggle to beat New Mexico's suit, after her firm got Omegle shut down for enabling child abuse by randomly pairing adults in chats with children.
"We know from our case that shuttered Omegle forever, that courts insist that online platforms can be liable for their role in matching predators with children," Goldberg told Ars.
In the complaint, New Mexico takes pains to avoid any reading that Section 230 may shield Snapchat from liability for alleged harms. Focusing on Snapchat's design features, including Quick Add and disappearing messages, as well as Snap Maps (which allegedly allow predators to find kids in the real world) and other features, the New Mexico DOJ alleged that Snapchat's "design and algorithmic recommendations openly foster and promote illicit sexual material involving children and facilitate sextortion and the trafficking of children, drugs, and guns."
"Our undercover investigation revealed that Snapchat’s harmful design features create an environment where predators can easily target children through sextortion schemes and other forms of sexual abuse," Torrez said. "Snap has misled users into believing that photos and videos sent on their platform will disappear, but predators can permanently capture this content and they have created a virtual yearbook of child sexual images that are traded, sold, and stored indefinitely."
New Mexico appears confident that the litigation will go its way, as the state recently achieved a "significant victory" in a case focused on Meta's alleged role "enabling child sexual exploitation," a May press release said, celebrating a judge's rejection of Meta's "argument that Section 230 provided the company immunity against the alleged misconduct." In the press release, Torrez called the judge's ruling "a historic victory for children and parents in New Mexico and across the country" and foreshadowed the Snapchat lawsuit, warning that "all social media platforms that harm their users should be on notice."
If Snap is found liable, it could be hit with steep fines, including civil penalties of $5,000 for each violation of the Unfair Practices Act and potentially millions in damages to be determined at trial.
But for Snap's alleged deceptions that its platform was safe, New Mexico's complaint said, "New Mexico consumers would not have incurred millions of dollars in damages, including without limitation, the costs of treatment for mental and emotional trauma resulting from Defendant's actions and/or inaction, damages related to suicide and self-harm inflicted by youth and adolescents in New Mexico, and the societal costs attendant to human trafficking and solicitation/distribution of CSAM."
New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as "more private" or "less permanent" due to the alleged "core design problem" and "inherent danger" of Snap's disappearing messages. The state's complaint noted that the FBI has said that "Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted."
Snapchat responds to New Mexico suit
In a statement, Snap said it's reviewing New Mexico's complaint "carefully and will respond to these claims in court."
"We share Attorney General Torrez’s and the public’s concerns about the online safety of young people and are deeply committed to Snapchat being a safe and positive place for our entire community, particularly for our younger users," Snap's statement said.
According to Snap, the company has "been working diligently to find, remove and report bad actors, educate our community, and give teens, as well as parents and guardians, tools to help them be safe online." Because "online threats continue to evolve," Snap has "invested hundreds of millions of dollars in our trust and safety teams over the past several years, and designed our service to promote online safety by moderating content and enabling direct messaging with close friends and family," Snap said.
Goldberg told Ars that in the Snapchat cases she has litigated on behalf of child abuse victims, she has found that Snapchat is "a particularly insidious platform because it appeals to children when they first get an iPhone because of the silly filters" and features like Quick Add, Snap Maps, disappearing messages, and My Eyes Only (an encrypted way to store files).
"All these features combined create a perfect storm for matching children with predators, hackers, and blackmailers who want their explicit images, and drug dealers," Goldberg told Ars. "Plus, parents can’t monitor the content their kids see. And only recently could they even supervise who their children’s contacts are."
Snap said that the company will continue to "work in collaboration with law enforcement, online safety experts, industry peers, parents, teens, educators and policymakers towards our shared goal of keeping young people safe online."
But New Mexico apparently believes that Snap's voluntary commitments to do more are not enough and that more urgent intervention is needed, as parents are reportedly struggling to keep kids safe amid "a steady increase in child sexual offenders utilizing the platform," Torrez's press release said.
"Parents report that their children share more CSAM on Snapchat than on any other platform, minors report having more online sexual interaction on Snapchat than any other platform, and more sex trafficking victims are recruited on Snapchat than on any other platform," the New Mexico DOJ said, explaining in its complaint how Snapchat also fails to block the sharing of sextortion guides. Until Snap's alleged harms can be stopped, the state provided resources for parents seeking to protect kids on Snapchat.
Goldberg said that Snapchat is particularly "harmful because it solicits underage users" and "refuses to accept any responsibility for children who are harmed or killed on their platform." Instead of listening to feedback from parents and kids, Goldberg suggested, Snapchat seems to blame parents and kids.
"We need to judge these companies not just by their products and the risks, but also how they handle tragedies that occur on their platforms," Goldberg told Ars. "And Snap is among the worst in terms of shifting all blame to the young children and their parents—even when the harms are Snap-specific problems unique to its technology and features. They fancy themselves cutting edge and about quick connections and technology, but in court they are always trying to sanction lawyers and the grieving families holding them to account."