Instagram has more than one billion users globally
About 37 percent are between 13 and 24 years old
This is the segment of the population the new measures aim to protect
It is indisputable that social networks have transformed the world in recent years, not only as spaces for interaction between users, but also as channels through which brands can connect with their target audiences. One that has done this well is Instagram.
It is no secret that the photo and video app is the jewel in Facebook's crown. It currently has more than 1 billion users worldwide, and its Stories already add up to 500 million daily active users. Both factors have attracted a large number of brands and advertisers; for this year alone, the platform is estimated to generate revenues of more than 14 billion dollars, according to projections from Jefferies & Company.
The importance Instagram has acquired has motivated Menlo Park to pay increasing attention to the environment and ecosystem that exists within it, particularly regarding safety and the fight against violence, harassment and bullying, something it has done for at least the last year.
A tougher policy that now includes memes
Instagram recently published a letter signed by its director, Adam Mosseri, in which he explains that the company has increased its vigilance over content that can promote violence, self-harm and suicide within the social network.
In that sense, he details that the platform will no longer allow images related to such actions, including those based on drawings, comics, films and memes.
"On Instagram, we owe it to everyone who uses our platform, especially those who may be at risk of suicide and self-harm, to do everything possible to keep them safe," Mosseri says in the published text.
In a way, what the platform seeks is to harden its content monitoring policy. Recall that since last year it had dedicated efforts to identifying, and not promoting, publications related to violence and suicide; in February of this year it took further measures on images, photographs and videos, and now these will be extended to other types of posts by users.
"We will also eliminate other images that may not show self-harm or suicide, but that include associated materials or methods," adds the Instagram director, meaning that any user could be subject to some form of concealment or removal of content.
A social and business concern
Suicide and self-harm have become more sensitive issues in recent years. In fact, data from the World Health Organization (WHO) indicate that nearly 800 thousand people take their own lives each year, and that suicide is the second leading cause of death among young people between 15 and 29 years old (in Mexico it is the second cause of death among those between 10 and 22 years old, according to Inegi).
Unfortunately, this situation is currently linked to social networks and their possible negative impact, mainly on young people. Many remember the case of a British teenager who took her own life in 2017, allegedly after seeing graphic images related to suicide on Instagram and Pinterest. That case is among the most recent, but it is not the only one.
The situation concerns not only governments and non-governmental institutions, but also the social networks themselves: Twitter, Facebook, Snapchat and even Google have taken actions to limit the reach of content related to self-harm or suicide.
In the end, each of them does this not only as a matter of social responsibility; it is also a situation that can affect their business. In the case of Instagram, for example, around 37 percent of its users globally are between 13 and 24 years old, the most susceptible age range. The presence of such content could deter more people in this age range from joining the platform, or even cause it to lose users.