Snapchat (Snap) has been deceptively marketing itself to young people despite monumental risks of sexual scams (sextortion), according to New Mexico Attorney General Raúl Torrez. The state filed a lawsuit against the platform in early September, alleging it did not do enough to warn users of online risks despite employees raising red flags.
New Mexico's suit claims that the platform weighed the cost of addressing widespread child grooming and decided it wasn't worth the administrative burden, despite warnings that the problem was becoming more common among teens. Internal communications show the company believed the task "should not be its responsibility," and safety staff documented that 90 percent of reports were ignored in favor of automatic prompts telling users to merely "block the other person."
The newly unredacted complaint points to a 2022 internal analysis showing company employees were fielding around 10,000 reports of sextortion each month. Those numbers most likely grossly underestimate the problem, the company noted internally, as victims frequently choose not to report intimidation. Executives also said they couldn't actually verify user ages, and that user reports, as well as known perpetrators, were "falling through the cracks."
"We continue to evolve our safety mechanisms and policies, from leveraging advanced technology to detect and block certain activity, to prohibiting friending from suspicious accounts, to working alongside law enforcement and government agencies, among so much more," a company spokesperson said in a comment on the filing."We know that no one person, agency, or company can advance this work alone, which is why we are working collaboratively across the industry, government, and law enforcement to exchange information and concept stronger defenses."
But the company disagreed on how to warn users "without stoking fear in Snapchatters," the suit claims, and its internal safety measures fell short. Other design features, such as Streaks and Quick Add, appear to enable abusive behavior, the state argues. Snap responded to the initial filing, saying it shared the state's and the public's concerns about online safety.
Almost since its inception, the communication-first platform has been associated with explicit messaging and NSFW content, a reputation pegged to the widely misunderstood nature of the app's "disappearing" images, even though Snapchat has told users that content can easily be saved and shared. In recent years, app-based "sexting" has again risen among younger users, many of whom are also turning to strangers online for comfort and advice. Both behaviors can expose young people to the risk of predation.
The problem isn't limited to the young: Sextortion schemes are worsening across digital spaces, with online predators and scammers threatening to release explicit imagery unless victims pay up. Two recent sextortion-based scams also hinge on access to a victim's personal (though, most likely, still publicly available) data, such as a spouse's name or photos of their home, to add legitimacy to the threats.
But, alarmingly, the often life-threatening phenomenon is growing among younger and younger populations, and within interpersonal relationships, as generative AI tools become more accessible. Experts have urged caregivers to remain vigilant and prepared when warning their children about online risks, including sex-based and technological threats.
Nationwide, state leaders and school districts continue pursuing legal action against social media platforms that they claim are putting young people at risk, failing to warn caregivers, and jeopardizing the mental wellbeing of generations.
"It is disheartening to see that Snap employees have raised many red flags that have continued to be ignored by executives," wrote attorney general Torrez. "What is even more disturbing is that unredacted information shows that the addicting features on Snapchat were blatantly acknowledged and encouraged to remain active on the platform."