This is one report you absolutely do not want to miss this week:
Yup.
Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in, according to tests by The Wall Street Journal and an academic researcher.
This isn't just idle speculation, mind you. The Journal and Northeastern University computer science professor Laura Edelson took turns "setting up new accounts with ages listed as 13." They then proceeded to make these fake accounts watch racy content on Instagram's "Reels" feature.
The content is nasty stuff, and there's no need to share it here, but rest assured this experiment had a very predictable effect:
When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.
Well, yeah.
It took "as little as three minutes" for pornographers to start infiltrating the feeds of the 13-year-old accounts. After 20 minutes "the test accounts' feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts."
Meta has claimed that "teens under 16 aren't supposed to be shown sexually explicit content at all."
Yeah, I don't think that's working!
So what was Meta's response?
Meta dismissed the test findings as unrepresentative of teens' overall experience.
'This was an artificial experiment that doesn't match the reality of how teens use Instagram,' spokesman Andy Stone said.
Uhh...
First of all, "artificial experiment" is redundant. All experiments are "artificial." But that doesn't matter; what matters is what they prove. And this experiment plainly proves that teenagers can access sexually explicit content on Instagram with ease, and that the site actively pushes such content at them once it notices their interest in it.
And I mean, look, you don't need me to tell you: If teenagers are given the chance to access this kind of content, many of them will. That's just the nature of temptation. It's why parents are supposed to steer their kids away from such harmful material.
The obvious solution here, of course, is maybe don't let your 13-year-old on Instagram unsupervised. There's really not much reason someone so young should be on there, anyway. After nearly two decades of social media, we've come to understand that.
Meta, meanwhile, suggests it's working hard to counteract this phenomenon:
As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months.
Yeah, I'm sure they're hard at work on that!