An experienced child exploitation investigator told Reuters he reported 26 accounts on the popular adults-only website OnlyFans to authorities, saying they appeared to contain sexual content featuring underage teen girls.
“What is alarming is the scope and scale of it,” said Matt W.J. Richardson, head of intelligence at The Canadian Open Source Intelligence Centre. He noted that many accounts featured more than one female who he suspected was underage.
Within a day of his Dec. 16 report to authorities, all of the accounts had been removed from OnlyFans, said Richardson, whose organization trains law enforcement and government agencies in open source intelligence to help combat crimes such as human trafficking and child exploitation.
Richardson said he reported the 26 accounts as containing suspected child sexual abuse material (CSAM) to the National Center for Missing & Exploited Children, the U.S.-based clearinghouse for CSAM-related tips.
Some of the accounts were linked to each other through promotional posts, suggesting they may have been controlled by the same person or group, Richardson said.
The images in the accounts featured females with physical attributes typical of those under the age of 18, Richardson said. Most had narrow hips and lacked physical maturation, he said. Many had narrow shoulders or appeared to be “well short of five feet tall,” he said.
Sexually explicit images of minors are banned in most countries, including the U.S., U.K., and Canada, and are against OnlyFans rules. On its website, OnlyFans says it prohibits content featuring the exploitation or abuse of anyone under 18, even if it’s adults “pretending to be” under 18.
The U.S. National Center for Missing & Exploited Children (NCMEC) told Reuters it could not comment on reports made to its tip line. Generally, NCMEC said, it tries to confirm whether reported webpages actively host child sexual abuse material or exploitive content. When warranted, it notifies the company hosting the material; it’s then up to the company whether to remove or block the content. In addition, all tips are “referred to the appropriate law enforcement agency for possible investigation,” NCMEC said.
Responding to the findings of this story, an OnlyFans spokesperson said the company has a “zero tolerance approach” to child sexual abuse material on the platform and has “strict onboarding processes to ensure all creators are over the age of 18.” OnlyFans works closely with NCMEC to “thoroughly investigate any reports they receive from others about our platform,” the spokesperson said.
She did not respond to a question about why the 26 accounts were taken down only after Richardson reported them.
OnlyFans has said it vets all content and swiftly removes and reports any suspected child sexual abuse material when detected. The company has a “pre-check team” that uses technology to detect content that’s “extremely likely to be a child,” CEO Keily Blair said in a speech in August. OnlyFans invests heavily in content moderation, Blair has said, which allows it to “focus on keeping the community safe and keeping the community for adults only.” OnlyFans also says all content is ultimately reviewed by human moderators.
The spokesperson noted that OnlyFans itself made 347 reports to NCMEC in 2023, “vastly fewer than the millions of instances that are reported by other social media platforms where users can remain anonymous and where content is unmoderated.”
OnlyFans’ 4.1 million content creators typically sell explicit images and videos for a monthly subscription fee, plus one-off payments. The company keeps a fifth of the revenue. The subscriptions effectively place paywalls around nearly every OnlyFans account, making the site difficult to scrutinize.
In a July investigation, Reuters used U.S. police and court records to document 30 complaints to law enforcement that child sexual abuse material appeared on OnlyFans between December 2019 and June 2024. The case files examined by Reuters cited more than 200 explicit videos and images of kids, including some showing adults having oral sex with toddlers.
OnlyFans told Reuters at that time that NCMEC has “full access” to the site behind its paywalls. But NCMEC said access was “limited” to OnlyFans accounts reported to its tip line or connected to a missing child case. Aside from that, NCMEC said it “does not proactively monitor, moderate, or actively seek to review content at scale” behind the paywall of OnlyFans.
Emojis of lollipops and baby bottles
In recent months, Reuters separately examined non-explicit public profile photos, bios and posts of OnlyFans creators who shared near-identical promotional language in their posts. Access to the material was open to the public without a paid OnlyFans subscription.
Female creators in 49 of those accounts had non-explicit profile photos that appeared childlike, according to three people with professional experience identifying child exploitation, including Richardson. Richardson said the 26 cases he reported to NCMEC overlapped with four of those Reuters identified.
Eric Silverman, a researcher for Culture Reframed, an organization focused on raising awareness about the harms of pornography to youth, said the 49 profile photos raised serious concern.
“Their appearance, wide-eyed, the innocent look, the pouting, the finger in the mouth, all these are visual cues that convey childhood,” he said. “Even if every single one of them is verifiably 18 or older, you still have to confront the issue that OnlyFans is profiting from the sexual portrayal of women who appear to be underage.”
The accounts dated back as far as December 2023. All 49 of them were taken down after Reuters contacted OnlyFans about its findings.
A company spokesperson said OnlyFans reviewed the user names provided by Reuters. “In every case, the Creators’ Government identity documents had been checked and verified by a third party which confirmed the documents are valid, and all individuals are over the age of 18.”
The company did not answer a question about why the 49 accounts were removed. OnlyFans also did not respond when asked to identify the “third party” that confirmed the documents.
Reuters could not verify the ages of the females featured in the profiles of the 49 accounts or whether their photos had been digitally altered. OnlyFans’ website says verified creators can post AI-generated content of themselves but must disclose it by, for example, including the hashtag #ai. The 49 accounts identified by Reuters didn’t have that hashtag or language flagging the use of AI.
The public details of those accounts often included emojis such as teddy bears, lollipops, or baby bottles. In their non-explicit profile photos, some females wore their hair in pigtails or braids. Others had braces. Many referred to themselves in diminutive terms such as “small,” “tiny,” “petite” or “mini.”
Some described themselves as “shy” or “innocent.” One profile description read, “If you came to my page, it means you love little, inexperienced girls like me!”
One profile photo showed a female dressed in a skin-tight, rainbow-coloured fishnet top. She was described on her OnlyFans page as a "petite princess" who "may look innocent" but is looking for a "well experienced man."
"She doesn't even look 15," said Lori Cohen, CEO of Protect All Children from Trafficking (PACT), a U.S. non-profit dedicated to eradicating child sexual exploitation and trafficking.
“It’s a very disturbing image,” Cohen said.
‘This isn’t small scale’
Richardson said the 26 accounts he reported suggested possible coordination by content creators.
Typically, an OnlyFans creator has a public profile with one or more photos, and subscribers can log in to view explicit content involving that creator. But in the cases examined by Richardson, the creators often posted images advertising multiple other females, some of whom appeared underage, and linked to those females’ OnlyFans accounts, he said.
“This isn’t small scale. It’s not like one or three or four,” said Richardson. “How many really are out there?”
In some of the 26 accounts Richardson reported on Dec. 16, the profile photo of the OnlyFans creator featured what looked like an adult female, he said, but the photos posted within the account included those of suspected underage females along with their OnlyFans account names.
“Some of them appear to be very young and tiny,” he said.
He said some of the content in the 26 accounts could have been "legacy CSAM": sexually explicit photos taken when the female was a girl but posted when she was 18 or over. These would still violate laws governing child sexual abuse material, three legal experts told Reuters.
Richardson has previously conducted research into OnlyFans.
In 2022, he and other researchers reported that they found creator profiles with “common indicators” of child sexual abuse material or sex trafficking, including keywords and other language. Shortly after their report was published, one U.S. lawmaker entered it into the Congressional Record, the official chronicle of congressional proceedings.
In the 49 public profiles Reuters reviewed, many of the account names included the words – or variations of the words – “little,” “baby,” “sweet” or “Lolita.” Those names, when combined with images of young females, can be used to advertise child sexual abuse material, said the specialists consulted by Reuters.
The posts sometimes referred to females in schoolgirl or other child-oriented terms. One, for example, stated: "The depraved babe @[account name redacted by Reuters] ran away from school." It continued: "Subscribe and punish her!" The account was removed by Dec. 17, the day after Reuters contacted OnlyFans about its findings.
Bios or posts in the accounts Reuters identified often included references to females being 18 or having just turned 18. “I’ve been waiting to turn 18 for a long time so I could register on Onlyfans, and here I am,” read the profile of one female, who Cohen said appeared younger.
“There’s a fullness to her face that is often associated with youth. She’s got disproportionately large eyes, and children tend to have larger eyes on their face relative to adults,” said Cohen.
Even if the females actually were 18, child exploitation experts said that’s still a problem because the images themselves suggested they might be underage. “It creates a demand for younger and younger bodies,” said Cohen. “This type of behaviour normalizes pedophilia.”
It is unlikely that a person could be successfully prosecuted for child sexual abuse material in a scenario where an adult is posing as a minor, because governing statutes generally focus on the participant’s actual age, three legal experts told Reuters.
Among the 49 accounts Reuters examined, the creators often used nearly identical language to describe themselves or particular sex acts.
“I don’t like parties or noisy clubs, I’m more of a home person, Netflix and chill is my kind of weekend,” one female wrote in her profile. Another used notably similar language: “I’m not a big fan of parties and noisy clubs. I’m more of a homebody, you know? Netflix and chill – that’s my idea of a perfect weekend.”
As of mid-December, Reuters documented more than 150 instances in which posts from different accounts used identical or nearly identical language. Many of those posts, spread across 25 of the 49 accounts Reuters reviewed, also shared the same combinations of emojis.
A half dozen of those posts, all purporting to have been written by different creators, promised for instance that “this beauty will play with her tight holes just for you.”
Multiple others promised a “gorgeous doll for your hot desires” or a girl who “is all alone right now and needs some new attention.”