
Instagram’s Algorithm Delivers Toxic Video Mix to Adults Who Follow Children

The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.

Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.

The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts that followed those children had also demonstrated interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.

Neat.

Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.

Since the dawn of the Internet, people have been consuming innocuous content alongside not-at-all innocuous content, often for the same not-at-all innocuous purpose. Who could have possibly predicted that a naive correlation-matching algorithm might reflect this? Where's my fainting couch when I need it?
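To make "naive correlation-matching" concrete, here's a minimal Python sketch of item-to-item co-occurrence, the simplest form of collaborative filtering. Every name and data point below is hypothetical, and this is obviously not Meta's actual system; it only illustrates how "users who watched A also watched B" surfaces whatever genuinely co-occurs in behavior.

```python
from collections import Counter, defaultdict

# Hypothetical watch history: user id -> set of videos engaged with.
# None of this data is real; it's a toy illustration.
watch_history = {
    "user_a": {"gymnastics_1", "gymnastics_2", "video_x"},
    "user_b": {"gymnastics_1", "video_x", "video_y"},
    "user_c": {"gymnastics_2", "video_y"},
}

def build_cooccurrence(history):
    """Count how often each pair of videos is watched by the same user."""
    counts = defaultdict(Counter)
    for videos in history.values():
        for a in videos:
            for b in videos:
                if a != b:
                    counts[a][b] += 1
    return counts

def recommend(seed_video, counts, n=3):
    """Return the videos most often co-watched with seed_video.
    Pure correlation: nothing here knows what the content is."""
    return [video for video, _ in counts[seed_video].most_common(n)]

cooccurrence = build_cooccurrence(watch_history)
print(recommend("gymnastics_1", cooccurrence))
# video_x ranks first (co-watched by two users); tie order may vary.
```

The point: nothing in that loop knows or cares what the videos contain. If the people who watch one thing reliably watch another, the second thing gets recommended, full stop.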

