Ashelyn,

My boyfriend sometimes watches YouTube Shorts, mostly for the occasional good joke or cat video. He’s told me that the Shorts algorithm seemingly goes out of its way to show him Andrew Tate-type content, as well as general Daily Wire/Shapiro/conservative ‘libs owned’ clips. More or less, if he doesn’t immediately close the app or swipe to the next short when one of these videos comes up, his Shorts feed is quickly dominated by them.

I think the big thing is that these algorithms are typically trained to maximize watch time and overall app usage, and there’s something uniquely attention-catching to a lot of men and boys about the way viral manosphere content is constructed. A skit with a weak setup is likely to get swiped past, but if the next clip comes swinging out of the gate with “here’s how women are destroying the West,” there’s a certain morbid curiosity that gets some viewers to watch the whole thing (even out of amusement or incredulity), or at least to stay on the clip slightly longer than they otherwise would. If someone lingers on that content to any degree, the algorithm takes it as a sign that the user wants more of it, or rather, that serving up more of it will achieve its engagement goals.
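
As a toy sketch of that feedback loop (to be clear, this is not YouTube’s actual system; the threshold, topics, and scoring are all made up for illustration), where lingering on a clip counts as an implicit “more please” and a quick swipe counts against it:

```python
from collections import defaultdict

# Hypothetical tuning knob: views shorter than this count as a negative signal
SWIPE_THRESHOLD = 1.5  # seconds

topic_scores = defaultdict(float)

def record_view(topic: str, seconds_watched: float, clip_length: float) -> None:
    """Fold one viewing into the topic's engagement score."""
    completion = min(seconds_watched / clip_length, 1.0)
    if seconds_watched < SWIPE_THRESHOLD:
        topic_scores[topic] -= 0.5        # swiped away almost instantly
    else:
        topic_scores[topic] += completion  # lingering counts, whatever the reason

def next_topic() -> str:
    """Serve whatever currently maximizes the engagement proxy."""
    return max(topic_scores, key=topic_scores.get)

# One moment of morbid curiosity is enough to tip the ranking:
record_view("cat_videos", 12, 30)        # watched less than half
record_view("manosphere_rant", 25, 30)   # watched most of it, even in disbelief
print(next_topic())                      # -> 'manosphere_rant'
```

The point of the sketch is that the score has no concept of *why* you watched; hate-watching and enthusiastic watching look identical to it.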

Plus, it’s picking recommendation candidates from user data and clustered associations. It’s very likely to test the waters with content it knows worked for other users with similar profiles, even if it’s a bit of a reach.
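
A minimal sketch of that “test the waters” step, again with invented cluster and topic names: most of the time it exploits the user’s own history, but with some probability it probes topics that hooked similar profiles (an epsilon-greedy-style exploration step):

```python
import random

# Hypothetical precomputed map: user cluster -> topics that engaged similar profiles
cluster_hits = {
    "young_male_gaming": ["speedruns", "gym_content", "manosphere_rant"],
}

def pick_topic(user_cluster: str, own_scores: dict, explore_rate: float = 0.2) -> str:
    """Mostly serve the user's own top topic; occasionally probe cluster favorites."""
    if random.random() < explore_rate:
        # the "reach": untested on this user, but proven on similar ones
        return random.choice(cluster_hits[user_cluster])
    return max(own_scores, key=own_scores.get)

print(pick_topic("young_male_gaming", {"cat_videos": 1.2, "skits": 0.7}))
```

So even with a squeaky-clean watch history, a user can get served this stuff purely on the strength of what people who look statistically like him watched.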

Edit: minor sentence structure stuff
