TikTok's algorithm pushes suicide content on teens within 10 minutes of opening the app, according to new research
· Mar 22, 2023 · NottheBee.com

Half of Americans are on the Chinese spyware app TikTok even though everyone knows it's dangerous. And now, to add to the national security risk, researchers have found that the app is pushing suicide content on teenagers.

From Breitbart:

The Chinese app's recommendation algorithm is so advanced that within ten minutes, it will start pushing suicide videos if the young TikTok user suggests he is sexually frustrated, according to research published Tuesday by corporate accountability group Ekō and shared with VICE News.

The researchers set up nine different new TikTok accounts and listed their age as 13 — the youngest age at which users can join the platform — then they mimicked what they referred to as "incels" or "involuntary celibates," an online community of "young men who formed a bond around their lack of sexual success with women," according to VICE News.

So, young men who sign up for TikTok and who are "sexually frustrated" (as every 13-year-old boy in history has been) will be fed videos talking about and encouraging suicide, and it only takes about 10 minutes for those videos to start showing up.

This is the design of the platform:

Feed porn to kids, tell boys to kill themselves if they aren't sexually active, then tell girls that their discomfort with puberty actually means they are a boy or "non-binary." For those who survive, give them viral "challenges" that harm them (like cooking chicken in NyQuil).

In my opinion, no good parent should allow their kid on the app.

"Ten minutes and a few clicks on TikTok is all that is needed to fall into the rabbit hole of some of the darkest and most harmful content online," Maen Hammad, Ekō campaigner and co-author of the research, told VICE News.

"The algorithm forces you into a spiral of depression, hopelessness, and self harm, and it's terribly difficult to get out of that spiral once the algorithm thinks it knows what you want to see," Hammad added. "It's extremely alarming to see how easy it is for children to fall into this spiral."

The algorithm is designed to reinforce what you linger on, what you like, and what you share. If you watch one video about suicide, ten more like it pop up.

Additionally, the majority of the commenters were in support of the suggested suicide. Other commenters lamented their loneliness, with many saying they felt "dead inside." One commenter even suggested his own suicide within the next four hours.

The same is true of softcore pornography, of transgenderism, and of deadly TikTok "challenges."

We've got to break the cycle before it's too late.

