Why YouTube’s Algorithms Are Spitting Out Pedophiles

Thanks to data scraping tools, knowledgeable people can now audit websites like YouTube. Analyzing the data scraped from the platform tells us a lot about how it is run and moderated.

Many parents have children who like to upload original videos to YouTube. Over time, even young children can build up a legitimate fanbase and following on the platform. But, unless their channel takes off spectacularly, they aren’t going to be pulling in tens of thousands of views. And yet, a growing number of parents have seen their child’s videos inexplicably amassing tens, even hundreds, of thousands of views.
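Those view counts are public, so a parent or researcher can check them directly. Below is a minimal sketch of that kind of audit in Python, assuming access to Google's public YouTube Data API v3; the API key, the video IDs, and the 10,000-view threshold are placeholders invented for this example, not values from any actual investigation.

import requests

API_KEY = "YOUR_API_KEY"                   # placeholder: a real Data API v3 key is required
VIDEO_IDS = ["VIDEO_ID_1", "VIDEO_ID_2"]   # placeholder IDs for the videos being audited
EXPECTED_MAX_VIEWS = 10_000                # assumed ceiling for a small, family-run channel

def fetch_view_counts(video_ids):
    """Return {video_id: view_count} using the public videos.list endpoint."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={
            "part": "statistics",
            "id": ",".join(video_ids),     # the API accepts up to 50 IDs per call
            "key": API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return {
        item["id"]: int(item["statistics"]["viewCount"])
        for item in resp.json().get("items", [])
    }

if __name__ == "__main__":
    for video_id, views in fetch_view_counts(VIDEO_IDS).items():
        note = "  <-- unexpectedly high" if views > EXPECTED_MAX_VIEWS else ""
        print(f"{video_id}: {views:,} views{note}")

Polling the same endpoint over time shows roughly when a spike began, which is how an unusual jump in a small channel's numbers can be spotted without any help from YouTube itself.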

Research and reporting from several outlets, including The New York Times, suggest that YouTube’s algorithms serve up content featuring partially clothed children and teenagers to potential pedophiles.

YouTube Recommendations

To understand what is going on here and how the researchers concluded that errant algorithms were to blame, you need to know how YouTube’s recommendations work in a general sense.

YouTube has refined its recommendation system repeatedly since the platform launched. In the early days, recommendations were based on the tags assigned to a video: users add tags to the videos they upload so that those videos can be categorized appropriately.

While this is a logical way of doing things, it isn’t very useful for surfacing new content. As machine learning became more accessible and refined, YouTube changed tack. Instead of relying on the tags users assigned to videos, it started to track its users and look for patterns in their behavior. The patterns hidden in that data let YouTube exploit connections that users themselves might not be aware of.

For example, if someone watches videos about snowboarding, there is a good chance they will also be interested in skiing videos. Users would tag both types of video with terms like “snow,” “extreme sports,” and “winter sports.”

However, it might be that most people who enjoy snowboarding videos also want to see videos about vintage cars. There is no immediate and obvious connection between these two things, yet the data might show a significant overlap in the groups that enjoy both types of video.
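To make the contrast concrete, here is a toy sketch in Python with invented data. It is not YouTube’s actual system, just an illustration of why behavior-based recommendations can surface connections that tags never would.

from collections import Counter

# Invented tags and watch histories, purely for illustration.
TAGS = {
    "snowboard_run": {"snow", "extreme sports", "winter sports"},
    "ski_slalom":    {"snow", "winter sports"},
    "vintage_car":   {"cars", "restoration"},
}

HISTORIES = [  # each entry is the set of videos one (invented) user watched
    {"snowboard_run", "vintage_car"},
    {"snowboard_run", "ski_slalom", "vintage_car"},
    {"snowboard_run", "vintage_car"},
    {"ski_slalom"},
]

def recommend_by_tags(video):
    """Rank other videos by how many tags they share with `video`."""
    shared = {v: len(TAGS[video] & t) for v, t in TAGS.items() if v != video}
    return sorted((v for v, n in shared.items() if n), key=lambda v: -shared[v])

def recommend_by_cowatch(video):
    """Rank other videos by how often the same users watched them too."""
    co = Counter()
    for history in HISTORIES:
        if video in history:
            co.update(v for v in history if v != video)
    return [v for v, _ in co.most_common()]

print(recommend_by_tags("snowboard_run"))     # ['ski_slalom']: tags only see the skiing link
print(recommend_by_cowatch("snowboard_run"))  # ['vintage_car', 'ski_slalom']: the data sees more

In this toy example, tags can only connect the snowboarding video to the skiing video, while the watch-history data also connects it to the vintage-car video, because the same invented viewers keep watching both. That is the power of the data-driven approach, and, as the rest of this article argues, also its danger.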

Unintended Consequences

The data-driven approach to recommendations that YouTube has adopted is effective. But machine learning algorithms are black boxes. Humans can’t simply open them up and see what the algorithm is doing. We don’t know exactly why YouTube goes on to recommend the videos that it does, only that the data is telling it to do so.

YouTube’s algorithms are also incapable of making any kind of value or moral judgment. If a user is continually moving from one video of partially-clothed children to the next, all the algorithm cares about is the data the user is generating. It isn’t capable of flagging the behavior as potentially dangerous.

Consequently, YouTube’s recommendation algorithms may be helping pedophiles to find content featuring children. YouTube’s algorithms have been criticized in the past for directing users towards conspiracy theories and fake news.

If a user shows an affinity for these types of videos, YouTube has no problem finding more of them. Despite the tweaks the platform has made to reduce the rate at which it recommends such videos, it is challenging to adjust the algorithms for content featuring children.

The difficulty comes from the fact that such videos do not contain any content that violates YouTube’s guidelines. What’s more, any video featuring a child could potentially appeal to a pedophile. Short of not recommending videos featuring children at all, it’s hard for YouTube to address the issue.

What Steps Has YouTube Taken?

YouTube has been aware of concerns about pedophiles on the platform for some time. Earlier this year, Wired and several other outlets published stories highlighting how some YouTube comment sections had become a meeting place for pedophiles.

In addition to some overtly sexual comments made on videos featuring children, researchers also noticed that the pedophile community had developed their own secret code to communicate with each other via the comments.

In response to these reports, YouTube disabled comments on numerous videos featuring children. However, the problematic recommendation algorithm has remained in place. As things currently stand, it seems as if YouTube is unsure how to proceed.

Crossing the Line

This algorithmic issue affects the entire platform. While YouTube does not allow pornographic content, plenty of erotic content falls within the platform’s guidelines. Someone watching legitimate erotic content featuring adults will be recommended similar content based on what other users watch.

A user who is simply following YouTube’s recommendations might not notice that the women in the videos they are watching are getting progressively younger. It’s often not until YouTube surfaces a video that is obviously of a child that people realize what road the algorithm has been taking them down.

What Can Parents Do?

First and foremost, parents should avoid uploading videos featuring their children to YouTube at all. Any video with your children in it could end up in front of someone you would never choose to show it to.

If your child has their own YouTube account and uploads their own videos, all you can do is monitor things as best you can. Unfortunately, YouTube has taken only token steps to address this ongoing issue.

The reality is that YouTube is not a safe space for children. By the platform’s own admission, there is simply too much content uploaded every day to moderate it properly. There are severe limitations to what algorithms and artificial intelligence can do in this regard. But YouTube seems unfazed by the issues that journalists and child protection agencies have raised.

Google, which owns YouTube, used to operate under the motto “Don’t be evil.” Not only has it abandoned the motto, it also seems to have abandoned the basic premise behind it. Many people consider YouTube, in its current form, to be a genuine evil.