Facebook and Instagram display sexual content to minors and serve as a “marketplace for predators in search of children,” according to a lawsuit brought against the platforms’ parent company, Meta, by the state of New Mexico’s attorney general.
Filed on Tuesday, the suit alleges that underage users of both platforms – who can sign up without age verification – are shown adverts linking to adult pornography sites and directed to accounts posting “images of sexual intercourse and sexualized images of minors,” even when the child has expressed no interest in this content.
“The office’s investigators found that certain child exploitative content is over 10 times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans,” Attorney General Raúl Torrez said in a statement.
Investigators from Torrez’s office created several fictitious profiles to test Meta’s enforcement policies. The team opened a pair of accounts posing as a 13-year-old girl and her mother, who they implied was “interested in trafficking her daughter.” Both accounts hit Facebook’s 5,000-friend limit within days, and the mother’s account was bombarded with “inappropriate expressions of love or interest” in her daughter, none of which were flagged by Facebook.
The daughter, named ‘Issa’ by investigators, was added to a chat group in which members shared “pornographic videos and naked photos of underage girls,” which remained active after numerous reports to Facebook’s moderators.
“Issa’s messages and chats are filled with pictures and videos of genitalia, including exposed penises, which she receives at least 3-4 times per week,” the lawsuit noted, explaining that none of the men responsible for these messages have been banned by Facebook, despite being reported.
The daughter’s profile, along with that of a second fictitious teenage girl, was later shared by an anonymous account advertising underage girls who were “selling” sex. A separate Instagram account, purportedly belonging to a 13-year-old girl, attracted followers whose profiles suggested they sold child pornography.
While the lawsuit included some redacted images of the content seen by investigators, it stated that other images had to be omitted as they were “too graphic and disturbing.”
The suit seeks $5,000 from Meta for each alleged violation of New Mexico’s Unfair Practices Act and accuses the firm of breaching public nuisance laws by placing the health and safety of “thousands” of New Mexico children at risk.
“Child exploitation is a horrific crime and online predators are determined criminals,” Meta said in response to the suit. “We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement…to help root out predators.”
The lawsuit was filed less than a week after Meta announced that it was strengthening its child safety features, following a series of Wall Street Journal reports revealing that the company had failed to curtail the activity of pedophile networks on Facebook and Instagram.