Facebook’s new effort to bring outside experts into its content review process promises to be complicated and possibly contentious, if discussions this week at a meeting in Singapore are any indication.
Over the course of two days, 38 academics, non-profit officials and others from 15 Asian countries who were invited to a Facebook workshop wrestled with how a proposed “external oversight board” for content decisions might function.
The gathering, the first of a half-dozen planned for cities around the world, produced one clear recommendation: the new board must be empowered to weigh in not only on specific cases, but on the policies and processes behind them.
Facebook has long faced criticism for doing too little to block hate speech, incitements to violence, bullying and other types of content that violate its “community standards.”
In Myanmar, for example, Facebook for years took little action while the platform was used to encourage violence against the Rohingya minority.
But the company also draws fire for not doing enough to defend free speech. Activists accuse it of taking down posts and blocking accounts for political or business reasons, an allegation the company denies.
Facebook CEO Mark Zuckerberg unveiled the idea of an independent oversight board last November, and the company released a draft charter in January.
“We want to find a way to strengthen due process and procedural fairness,” Brent Harris, director of global affairs and governance at Facebook, said at the opening of the Singapore meeting.
A Reuters reporter was invited to observe the proceedings on the condition that the names of participants and some details of the discussions not be disclosed.
Facebook’s initial plan calls for a 40-person board that would function as a court of appeal on content decisions, with the power to issue binding rulings on specific cases.
But as attendees peppered Facebook officials with questions and worked through issues such as how the board would be chosen and how it would select cases, they repeatedly came back to questions of policy. Rulings on individual postings would mean little if they were not linked to the underlying content review procedures, many attendees said.
Hate speech policies were a big focus of discussion. Many attendees said they felt Facebook was often too lax and blind to local circumstances, but the company has held firm to the concept of a single set of global standards and a deliberate bias towards leaving content on the site.
More than one million Facebook posts per day are reported for violations of the standards, which set detailed rules on everything from pictures of dead bodies (usually allowed) to explicit sexual conversations (usually not allowed).
The company has been beefing up enforcement. It now has an army of 15,000 content reviewers, many of them low-paid contractors, charged with checking posts that are reported for violations and deciding what to remove. Difficult decisions, or those involving politically contentious questions, are often “escalated” to the company’s content policy team.
One of the examples discussed at the Singapore meeting involved a post that was reported more than 2,000 times and reviewed 108 separate times by different content moderators, every one of whom concluded that the post did not violate standards and should remain up.
But after it was escalated to content policy staffers who had more information about the political context, it was removed. Meeting participants appeared unanimous in agreeing that it should indeed have come down.
The room was split almost evenly on a second case, involving a phrase that some viewed as a violation of rules against hate speech but others read as a joke. In that situation, the content had remained on the service for many months before it was reported, and Facebook took it down.