Bombshell WSJ report shows that Instagram connects pedophile networks using algorithms and blatantly explicit hashtags
Jun 7, 2023 · NottheBee.com

The Wall Street Journal, along with researchers at two major universities, has documented a vast network of pedophiles on Instagram operating essentially out in the open, including blatant hashtags used to tag material for pedos.

According to this research, it goes far beyond Instagram merely hosting child pornography on its app. Instagram's algorithm also curated pedophilic content and connected people with similar interests, essentially facilitating the sharing of child sexual abuse material.

All thanks to Instagram's powerful and predictive algorithms and the site's lack of oversight of abuse.

Check out some of the hashtags that Instagram kept live and active.

[Graphic Sexual Language Below]

Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as "little slut for you."

These pedos weren't hiding their intent to share child porn. Besides posting explicit content, these accounts would sell CP as well.

The accounts would advertise all sorts of sexual abuse material AND offer to prostitute children for the right price.

Instagram accounts offering to sell illicit sex material generally don't publish it openly, instead posting "menus" of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and "imagery of the minor performing sexual acts with animals," researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person "meet ups."

All of this is happening on Instagram, one of the social media apps that most parents let their children use unsupervised. It's been a haven for predators for a long time, and Mark Zuckerberg and Meta just let it happen.

These accounts obviously broke both the law and Meta's rules, but there was no serious attempt to enforce those rules. In my opinion, Meta is guilty of, at the very least, negligent child abuse.

But Meta does claim that they are working to take down child sexual abuse material.

In response to questions from the Journal, Meta acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. "Child exploitation is a horrific crime," the company said, adding, "We're continuously investigating ways to actively defend against this behavior."

Meta claims that in the last two years it has taken down 27 separate pedophile networks and killed many of the hashtags pedos used to connect their content. It also claims to be working on its algorithm to prevent the curation of pornographic material.

It may well be true. But all that means is that it's like fighting the hydra. Every time they cut down one network of pedos, three more pop up.

Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions.

A Meta spokesman said the company actively seeks to remove such users, taking down 490,000 accounts for violating its child safety policies in January alone.

According to the researchers, Instagram is by far the biggest platform for these sexual abusers. Twitter also has child pornography on it, but it is much quicker to take action. And TikTok and Snapchat had neither the volume of abuse nor the networking capability that Instagram did.

Okay, I may be a radical, but if the problem actually is as bad as Meta says, then we just need to shut Instagram down until they figure out how to battle the pedophile networks on the app.

It would take a ton of manpower, but obviously the algorithmic system designed to prevent this is doing a terrible job.

Meta's automated screening for existing child exploitation content can't detect new images or efforts to advertise their sale. Preventing and detecting such activity requires not just reviewing user reports but tracking and disrupting pedophile networks.

It's not worth having Instagram exist at all if it's a den of iniquity for millions of pedophiles.
