Report: Twitter wanted to launch an "OnlyFans" competitor but the idea was tanked because it already has trouble stopping child porn
· Aug 30, 2022 · NottheBee.com

So Twitter wanted to launch an OnlyFans-style competitor on their website.

No joke.

Here's the deets on how Twitter apparently wanted to get into the subscription-based homemade pornography business, from The Verge:

In the spring of 2022, Twitter considered making a radical change to the platform. After years of quietly allowing adult content on the service, the company would monetize it. The proposal: give adult content creators the ability to begin selling OnlyFans-style paid subscriptions, with Twitter keeping a share of the revenue.

Had the project been approved, Twitter would have risked a massive backlash from advertisers, who generate the vast majority of the company's revenues. But the service could have generated more than enough to compensate for losses. OnlyFans, the most popular by far of the adult creator sites, is projecting $2.5 billion in revenue this year — about half of Twitter's 2021 revenue — and is already a profitable company.

Some executives thought Twitter could easily begin capturing a share of that money since the service is already the primary marketing channel for most OnlyFans creators. And so resources were pushed to a new project called ACM: Adult Content Monetization.

And why didn't they go ahead with their morally bankrupt plan?

Well, it's because Twitter is already awash in porn and they haven't been able to effectively remove child porn from their website. Launching a paid porn service without that policing ability could get them in serious trouble with every law enforcement agency, judge, and jury in the nation.

Before the final go-ahead to launch, though, Twitter convened 84 employees to form what it called a "Red Team." The goal was "to pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly," according to documents obtained by The Verge and interviews with current and former Twitter employees.

What the Red Team discovered derailed the project: Twitter could not safely allow adult creators to sell subscriptions because the company was not — and still is not — effectively policing harmful sexual content on the platform.

"Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale," the Red Team concluded in April 2022. The company also lacked tools to verify that creators and consumers of adult content were of legal age, the team found. As a result, in May — weeks after Elon Musk agreed to purchase the company for $44 billion — the company delayed the project indefinitely. If Twitter couldn't consistently remove child sexual exploitative content on the platform today, how would it even begin to monetize porn?

So this group of folks, tasked with finding a way to release pornography onto Twitter at a ginormous scale, and who saw no problem with pornography in general, recommended Twitter not proceed because the platform is currently hosting LOTS of child porn.

Introducing paid pornography to the platform would only make this problem that much worse.

This proposed pornography monetization project blows the lid off Twitter's severe problem managing the pornography already hosted on its site.

While the Red Team's work succeeded in delaying the Adult Content Monetization project, nothing the team discovered should have come as a surprise to Twitter's executives. Fifteen months earlier, researchers working on the team tasked with making Twitter more civil and safe sounded the alarm about the weak state of Twitter's tools for detecting child sexual exploitation (CSE) and implored executives to add more resources to fix it.

"While the amount of CSE online has grown exponentially, Twitter's investment in technologies to detect and manage the growth has not," begins a February 2021 report from the company's Health team. "Teams are managing the workload using legacy tools with known broken windows. In short (and outlined at length below), [content moderators] are keeping the ship afloat with limited-to-no-support from Health."

Employees we spoke to reiterated that despite executives knowing about the company's CSE problems, Twitter has not committed sufficient resources to detect, remove, and prevent harmful content from the platform.

So according to this, Twitter has known, at least for a year and a half, and likely much longer, that their website is being used to broadcast kiddie porn with no repercussions.

There are years of reported evidence against Twitter.

Instead of tracking down and shutting down these child exploitation pages, I guess they're busy banning Libs of TikTok and other conservative accounts for exposing lefties who are complicit in child exploitation.

The internet is an awful place and these huge websites are going to be abused. But in my opinion, Twitter is a particular cesspool of filth due to its unwillingness to deal with the issue.

But unlike larger peers, including Google and Facebook, Twitter has suffered from a history of mismanagement and a generally weak business that has failed to turn a profit for eight of the past 10 years. As a result, the company has invested far less in content moderation and user safety than its rivals. In 2019, Mark Zuckerberg boasted that the amount Facebook spends on safety features exceeds Twitter's entire annual revenue.

Meanwhile, the system that Twitter heavily relied on to discover CSE had begun to break...

The 2021 report found that the processes Twitter uses to identify and remove CSE are woefully inadequate — largely manual at a time when larger companies have increasingly turned to automated systems that can catch material that isn't flagged by PhotoDNA. Twitter's primary enforcement software is "a legacy, unsupported tool" called RedPanda, according to the report. "RedPanda is by far one of the most fragile, inefficient, and under-supported tools we have on offer," one engineer quoted in the report said.

Read the entire report from The Verge to understand the depths of Twitter's major issues.

It's a shame that this issue of child pornography was only brought to a head because of the website's desire to turn itself into a homemade porn site.

Twitter is a dark place... dark in ways you don't see on the surface.

