A comprehensive investigation by the Wall Street Journal and the Stanford Internet Observatory reveals that Meta-owned Instagram has been home to an organized and massive network of pedophiles.
But what separates this case from most is that Instagram's own algorithms were promoting pedophilic accounts and content to one another, while the pedophiles themselves relied on coded emojis, such as a picture of a map or a slice of cheese pizza, to veil their activity.
Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography,” according to Levine of UMass. Many declare themselves “lovers of the little things in life.” -WSJ
According to the researchers, Instagram allowed pedophiles to search for content with explicit hashtags such as #pedowhore and #preteensex, which were then used to connect them to accounts that advertise child-sex material for sale from users going under names such as "little slut for you."
Sellers of child porn often convey the child's purported age, saying they are "on chapter 14," or "age 31" followed by an emoji of a reverse arrow.
BREAKING: Instagram algorithm exposed promoting pedophile networks in massive investigation, video sales, ‘preteensex’ menus, in-person meetups with underage boys and girls, using emojis such as a map and cheese pizza - WSJ— Jack Poso 🇺🇸 (@JackPosobiec) June 7, 2023
Meta claims to have taken down 27 pedophile networks over the past two years and says it plans more removals.
"That a team of three academics with limited access could find such a huge network should set off alarms at Meta," said Alex Stamos, the head of the Stanford Internet Observatory and Meta's chief security officer until 2018, adding that the company has far more effective tools to map its pedophile network than outsiders do.
"I hope the company reinvests in human investigators," he added.
Researchers investigating the network set up test accounts within it, which were immediately inundated with "suggested for you" recommendations of child-sex content, as well as accounts linking to off-platform trading sites.
Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions. -WSJ
"Instagram is an on ramp to places on the internet where there’s more explicit child sexual abuse," according to Brian Levine, director of the UMass Rescue Lab. Levine authored a 2022 report for the DOJ's National Institute of Justice on child exploitation over the internet.
What's more, Meta accounted for 85% of child pornography reports filed with the National Center for Missing & Exploited Children, according to the report. That said, Stanford found that "Meta has struggled with these efforts more than other platforms both because of weak enforcement and design features that promote content discovery of legal as well as illicit material."
Today the WSJ and Stanford Internet Observatory released important research showing pedophiles use Instagram to share material.— Andrea Stroppa 🐺 Claudius Nero's Legion 🐺 (@andst7) June 7, 2023
We're happy that Stanford and WSJ followed our steps in working on this critical issue. As an independent researcher with my team, we showed how…
"Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts," said David Thiel, chief technologist at the Stanford Internet Observatory. "You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t."
Sarah Adams, a Canadian mother of two, has built an Instagram audience discussing child exploitation and the dangers of oversharing on social media. Given her focus, Adams’ followers sometimes send her disturbing things they’ve encountered on the platform. In February, she said, one messaged her with an account branded with the term “incest toddlers.”
Adams said she accessed the account—a collection of pro-incest memes with more than 10,000 followers—for only the few seconds that it took to report to Instagram, then tried to forget about it. But over the course of the next few days, she began hearing from horrified parents. When they looked at Adams’ Instagram profile, she said they were being recommended “incest toddlers” as a result of Adams’ contact with the account.
A Meta spokesman said that “incest toddlers” violated its rules and that Instagram had erred on enforcement. The company said it plans to address such inappropriate recommendations as part of its newly formed child safety task force. -WSJ
Meta acknowledged to the Journal that it had received a flood of reports of child sexual exploitation and failed to act on them, blaming a software glitch that prevented a substantial portion of user reports from being processed.
And while Meta is allowing pedophiles to run rampant on its platforms, ZeroHedge is still banned.