Maria Montero

YouTube under fire for recommending videos of children with inappropriate comments

More than a year after a child safety content moderation scandal on YouTube, it still takes just a few clicks for the platform’s recommendation algorithms to redirect a search for “bikini haul” videos of adult women toward clips of scantily clad minors performing body-contorting gymnastics, taking an ice “challenge,” or licking a popsicle.

A YouTube creator named Matt Watson flagged the problem in a critical Reddit post, saying he had found dozens of videos of children beneath which YouTube users were exchanging inappropriate comments and timestamps, and denouncing the company for failing to prevent what he describes as a “soft-core pedophile ring” operating in plain sight on its platform.

He has also posted a YouTube video demonstrating how the platform’s recommendation algorithm pushes users into what he calls a pedophilia “wormhole,” accusing the company of facilitating and monetizing the sexual exploitation of children.

We were easily able to replicate the algorithm behavior Watson describes: in a history-free private browser session, after clicking on two videos of bikini-clad adult women, YouTube suggested we watch a video called “Sweet Sixteen Pool Party.”

Clicking through YouTube’s sidebar surfaces multiple videos of prepubescent girls in its “Up next” section, where the algorithm lines up related content to encourage users to keep clicking.

The videos recommended to us in this sidebar included thumbnails showing young girls demonstrating gymnastics poses, showing off their “morning routines,” or licking popsicles and ice lollies.

Watson said it was easy for him to find videos containing inappropriate/predatory comments, including sexually suggestive emoji and timestamps that appeared intended to highlight, shortcut to, and share the most compromising positions and/or moments in the videos of minors.

We also found multiple examples of inappropriate timestamps and comments on videos of children that YouTube’s algorithm recommended we view.

Some comments from other YouTube users denounced those who made sexually suggestive comments about children in the videos.

In November 2017, several major advertisers froze spending on YouTube’s platform after an investigation by the BBC and The Times discovered similarly obscene comments on videos of children.

Earlier that same month, YouTube was also criticized over low-quality content targeting children as viewers on its platform.

The company announced a series of policy changes related to child-focused videos, including a pledge to aggressively police comments on videos of children, and a commitment that videos found to have inappropriate comments about the children in them would have comments disabled altogether.

Some of the videos of young girls that YouTube recommended we view already had comments disabled, suggesting its AI had previously identified large numbers of inappropriate comments being shared (per its policy of disabling comments on clips containing children when comments are deemed “inappropriate”); however, the videos themselves were still suggested for viewing in a test search that began with the phrase “bikini haul.”

Watson also says he found advertisements being shown on some videos of children containing inappropriate comments, and claims he found links to child pornography being shared in YouTube comments as well.

We were unable to verify those findings in our brief tests.