Academics at Oxford and Stanford Universities think Facebook should give users greater transparency and control over the content they see on its platform.
They also believe that the social media giant should radically reform its governance structures and processes to shed more light on content decisions, even drawing on more outside experts to direct policy.
Such changes are necessary to address widespread concerns about Facebook’s impact on democracy and freedom of expression, they argue in a report released today, titled Glasnost! Nine Ways Facebook Can Become a Better Forum for Free Speech and Democracy, which sets out a series of recommendations for reforming the platform.
“There are a lot of things that a platform like Facebook can do right now to address the concerns of the general public and do more to meet its public interest responsibilities as well as international human rights standards,” writes lead author Timothy Garton Ash.
“The executive decisions made by Facebook have important political, social and cultural consequences around the world. A small change to the News Feed algorithm, or to content policy, can have a faster and broader impact than any single piece of national (or even EU-wide) legislation.”
Here is a summary of the report’s nine recommendations:
- Tighten the wording of the Community Standards on hate speech – the academics argue that Facebook’s current wording in key areas is “too broad, leading to erratic, inconsistent, and often context-insensitive” removals, and that it also generates “a high proportion of contested cases”. Clearer and more precise wording, they believe, would make consistent implementation easier.
- Hire more, and more contextually expert, content reviewers – “The problem is quality and quantity,” the report says, urging Facebook to hire more human content reviewers, to add a layer of reviewers with “relevant cultural and political experience,” and also to engage more with trusted external sources such as NGOs. “It is clear that AI will not solve the problems with the deeply context-dependent judgments that must be made to determine when, for example, hate speech becomes dangerous speech,” they write.
- Increase “decisional transparency” – Facebook does not yet offer adequate transparency around its content moderation policies and practices, they suggest, arguing that it needs to publish more detail on its procedures, including a specific call for the company to “publish and widely disseminate case studies” to give users more guidance and to provide potential grounds for appeals.
- Expand and improve the appeals process – also on appeals, the report recommends that Facebook give reviewers much more context about the pieces of content in dispute, and provide appeals statistics to analysts and users. “Under the current regime, the initial internal reviewer has very limited information about the person who posted a piece of content, despite the importance of context for adjudication of appeals,” they write. “A Holocaust image has a very different meaning when it is posted by a Holocaust survivor or a neo-Nazi.” They also suggest that Facebook should work on making the appeals process “more functional and usable for the average user,” in dialogue with users, such as with the help of a content policy advisory group.
- Provide meaningful News Feed controls for users – the report suggests Facebook users should have more meaningful controls over what they see in the News Feed, with the authors calling the current controls “totally inadequate” and advocating for far more, such as the ability to switch the algorithmic feed off entirely (without the chronological view reverting to the algorithmic default when the user reloads, as happens now for anyone who switches away from the AI-driven view). The report also suggests adding a News Feed analytics feature, to give users a breakdown of the sources they are seeing and how that compares with control groups of other users. Facebook could also offer a button that lets users take a different perspective by exposing them to content they don’t normally see, they suggest.
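To make the complaint concrete: the report's objection is that the feed-mode choice is ephemeral, silently reverting to the ranked default on reload. A minimal sketch of the alternative, in Python, with a user preference that persists across "reloads" (all names here are illustrative, not Facebook's actual code or data model):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # seconds since epoch (illustrative)
    engagement_score: float  # stand-in for the ranking model's signal

@dataclass
class FeedPreferences:
    """Per-user setting stored server-side, so it survives reloads --
    unlike the transient toggle the report criticizes."""
    chronological: bool = False

def render_feed(posts: list[Post], prefs: FeedPreferences) -> list[Post]:
    """Order posts according to the user's persisted preference."""
    if prefs.chronological:
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    # Default: algorithmically ranked by engagement.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

posts = [
    Post("alice", timestamp=100, engagement_score=0.2),
    Post("bob",   timestamp=200, engagement_score=0.9),
    Post("carol", timestamp=300, engagement_score=0.5),
]

prefs = FeedPreferences(chronological=True)
# Simulate two page loads: the preference object persists, so the
# chronological ordering is NOT reset to the algorithmic default.
for _ in range(2):
    ordered = render_feed(posts, prefs)
    assert [p.author for p in ordered] == ["carol", "bob", "alice"]
```

The design point is simply where the toggle lives: as long as the preference is part of durable user state rather than per-session view state, "turn the algorithm off" actually sticks.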
- Expand context and fact-checking facilities – the report pushes for “significant” resources to be devoted to identifying “the best, most authoritative and reliable sources” of contextual information for each country, region and culture, to help feed Facebook’s existing (but still inadequate and far from universal) fact-checking efforts.
- Establish regular auditing mechanisms – there have been some civil rights audits of Facebook’s processes (like one that suggested Facebook formalize a human rights strategy), but the report urges the company to open itself up to more of these, suggesting that the audit model be replicated and extended to other areas of public interest, including privacy, algorithmic fairness and bias, diversity, and more.
- Create an external content policy advisory group – Facebook should enlist key content stakeholders from civil society, academia, and journalism into an expert policy advisory group to provide ongoing feedback on its content standards and their implementation, as well as to review its appeals record. “Creating a body that has credibility with the extraordinarily broad geographic, cultural and political range of Facebook users would be a great challenge, but a carefully selected and formalized expert advisory group would be a first step,” they write, noting that Facebook has begun to move in this direction but adding: “These efforts must be formalized and expanded in a transparent manner.”
- Establish an external appeals body – the report also calls for ultimate “independent and external” oversight of Facebook’s content policy, via an appeals body that sits outside the mothership and includes representation from civil society and digital rights groups. The authors note that Facebook is already flirting with this idea, citing comments made by Mark Zuckerberg last November, but also caution that it must be done correctly if the body’s power is to be “meaningful.” “Facebook should strive to make this appeals body as transparent as possible… and allow it to influence broad areas of content policy… not just rule on the removal of specific pieces of content,” they warn.
In conclusion, the report notes that the content issues it focuses on are not unique to Facebook’s business but apply widely across various Internet platforms, hence the growing interest in some kind of “industry-wide self-regulatory body” – though it suggests that establishing such overarching regulation will be “a long and complex task.”
Meanwhile, the academics remain convinced that there are “a lot of things a platform like Facebook can do right now to address widespread public concerns and do more to meet its public interest responsibilities, as well as international human rights standards” – with the company front and center, given its massive size (more than 2.2 billion active users).
“We recognize that Facebook employees are making difficult, complex contextual judgments every day, balancing competing interests, and not all of those decisions will benefit from full transparency. But all would be better for a more regular and active exchange with the worlds of academic research, investigative journalism and civil society advocacy,” they add.
We have contacted Facebook for comment on the report’s recommendations.
The report was prepared by the Free Speech Debate project of the Dahrendorf Programme for the Study of Freedom, St. Antony’s College, Oxford, in association with the Reuters Institute for the Study of Journalism, University of Oxford; the Project on Democracy and the Internet, Stanford University; and the Hoover Institution, Stanford University.
Last year, we offered some ideas of our own to fix Facebook, including suggesting that the company hire orders of magnitude more expert content reviewers, as well as provide greater transparency into key decisions and processes.