The short-video app TikTok has asked its moderators to delete or hide videos by users the platform considered unattractive or filmed in poor surroundings.
The statements from the company come after a leaked document revealed that TikTok instructed its moderators to delete user posts showing an "abnormal body shape", "too many wrinkles" or an "ugly facial appearance", as well as videos in which "houses in poor condition" appear. The document was leaked by The Intercept.
Now, in a statement to which Europa Press has had access, the company has asserted that "most" of these standards "are no longer used" or were never in force.
In addition, the document shows that the company asked its moderators to censor live broadcasts featuring "ideologically undesirable content," according to The Intercept.
"Most of the rules featured in The Intercept are no longer used, or in some cases seem to have never been in effect, but it is correct that for live streaming, TikTok remains especially vigilant to keep sexual content off the platform," said TikTok.
In January, TikTok first published a guide with its guiding principles of moderation so that everyone could see how content is moderated and regulated within the platform, with the aim of increasing transparency and gaining the trust of its users.
"The local teams apply the updated Community Guidelines that we published in January, which are aimed at keeping TikTok a place of open self-expression and a safe environment for both users and creators," the company added.
Furthermore, TikTok stresses that it has expanded its Trust and Safety centers in the United States, Ireland and Singapore, which "oversee the development and execution of our moderation policies and are led by industry experts with extensive experience in these areas".
The Chinese app also announced a few days ago that it would open a "Transparency Center" at its offices in the US city of Los Angeles in early May, to allow "outside experts" to examine and verify the work of its team members.
In this way, outside experts will be able to see how the TikTok team applies moderation policies to "review technology-based actions and identify possible additional violations."