
All about the YouTube case and the privacy of minors

YouTube took action when it was discovered that pedophiles were targeting videos of children with inappropriate comments, removing more than 150,000 videos, closing thousands of accounts, and disabling comments on more than 625,000 videos. But now, after a months-long investigation, the Google subsidiary will have to pay fines for violating a federal online privacy law by tracking children under 13 without parental consent and showing them targeted advertisements. This is everything you should know about the YouTube case and its handling of the privacy of minors.

The latest

Fine record

The Federal Trade Commission (FTC) and the New York attorney general imposed a $170 million fine on YouTube, among other penalties, for its handling of children's data. This came after the company tracked children under 13 without their parents' consent and showed them targeted advertisements.

Regulators said YouTube had illegally collected data from children, such as identification codes used to track web browsing, without parental consent. The site was also promoted to advertisers as one of the main destinations for children, earning the company millions of dollars as a result.

According to The New York Times, YouTube agreed to pay $170 million, of which $136 million will go to the Federal Trade Commission and $34 million to New York. The sum represents the largest civil penalty the FTC has ever obtained in a child privacy case, leaving the previous record fine of $5.7 million, imposed against the video-sharing social app TikTok, far behind.


Below, we walk through the most relevant developments in this situation and everything you should know about it.

Federal investigation

In June 2019, we reported that the United States government was in the final stages of an investigation into YouTube's handling of videos that include minors, following numerous complaints from consumer groups and privacy advocates.

This investigation threatens the company with potential multimillion-dollar fines and has led the tech giant to reevaluate some of its business practices. In addition, YouTube is considering major changes to its recommendation algorithm as the investigation develops.

The situation arose months after several major brands stopped buying ads on YouTube when their advertising kept appearing in children's videos whose comment sections had been infiltrated by pedophiles.

YouTube disables comments on videos with minors

In a blog post published on Thursday, February 28, YouTube said it had disabled the option to post comments on tens of millions of videos where such comments could appear. The company's CEO, Susan Wojcicki, said in a tweet that the safety of children on the platform is crucial to the company, which is why it has taken new measures.

YouTube will also update the "classifier" that moderates comments so that it works more quickly and effectively.

The companies, among them AT&T, Disney, Nestlé, and Fortnite creator Epic Games, acted after a YouTube user posted a video calling out this behavior. The video was created by Matt Watson, who accused YouTube of "facilitating sexual exploitation" of children, stating that YouTube's recommendation system (the links that appear automatically to suggest other videos that may interest you) also guided predators to similar videos of minors, many of which carried ads from major brands.

Be careful what you post!

As a wake-up call to parents who post photographs or videos of their children, or who let them do so themselves, it is worth noting that, for the most part, the flagged videos did not violate YouTube's rules. On the contrary, they were innocent or funny videos, such as girls doing gymnastics or taking part in physical activities. However, the videos were filled with nasty and even perverted comments, among them strings of sexually suggestive emojis, questions asking girls whether they were wearing underwear, or remarks about what the commenters could do if the girls were alone. Truly worrying.

This is not the first time something similar has happened. Approximately two years ago, hundreds of companies stopped investing in YouTube over concerns that their ads were appearing alongside problematic content, whether from terrorist groups, material promoting racial hatred, or videos that seemed to endanger or exploit children.

More companies withdraw their ads

The telephone company AT&T is the latest company to withdraw its YouTube ads, after reports that a large number of pedophilic comments had invaded videos of small children, mainly girls. Commenters were marking timestamps pointing to moments when the children showed some area of the body that could be sexualized, objectifying the children in YouTube's comments section. "Until Google can protect our brand from offensive content of any kind, we are removing all YouTube advertising," an AT&T spokesperson told CNBC.

The company originally withdrew its entire advertising budget from YouTube in 2017 after disclosures that its ads appeared alongside offensive content, including terrorist material, but resumed advertising in January. On Wednesday of this week, Nestlé and "Fortnite" maker Epic Games pulled their promotional videos. According to reports, Disney also stopped running its ads.

What is YouTube doing?

A spokesman for Google, YouTube's parent company, told The New York Times that the company had already deleted the accounts and channels of the people who left the disturbing comments, in addition to removing comments that violated its policies. The company also said it reports illegal activity to the authorities. "Any content, including comments, that endangers minors is abhorrent, and we have clear policies that ban this on YouTube," the spokesman said. "There is more to do, and we continue to work to improve and detect abuse more quickly."

