Facebook’s product policy head Monika Bickert is responsible for the team that sets content standards on the world’s largest social media site, which has over 1.3 billion users. She decides what people can post and guides the enforcement of these policies. Every time a piece of content is reported to Facebook, it is her team’s job to review it and apply those content standards. Excerpts from an exclusive interview with Nandagopal Rajan.
Is it a challenge keeping Facebook clean?
Our job is to make sure that we are keeping people safe and free from abuse. That is extremely challenging given the size of our population and its global diversity. It is something we are very committed to, and we are working hard every day to get it right and get better at it.
Is there some part that is especially challenging and is it getting tougher to do your job these days?
There are two primary difficulties with the job. People sometimes have different ideas about what is okay to share and what is not. But we have to have one set of standards to apply across the globe for the entire community. The second challenge is that we get a lot of reports. We want to be able to respond to those reports efficiently and consistently. For that reason, we have to craft our policies to be very objective. This is why the policies at times have to be more blunt than we would like them to be.
How does the reporting system at Facebook work?
We want people to tell us when something is not right on the site. That is why we make it so easy to report content. In fact, you can report any piece of content on Facebook, whether you are using the network on desktop or mobile. When someone reports a post, it is sent to an employee who has the language skills and training to apply our content standards. After we review the content, we either remove it or leave it on the site. In either case, we send a message to the person who reported it.
Is there a timeframe in which the process is completed?
It really depends on what is being reported. Of course, it is important that we respond in time, but it is more important to us that we get it right. We know we will not always get it right. There are real people reviewing the reports, and they will make mistakes from time to time. But our accuracy rates are very high and we are proud of that.
But isn’t the time taken to respond critical in cases of self-harm, for instance?
I am very happy that you asked this question, as I am very proud of what we do to keep people safe from self-harm. If somebody reports a threat of self-harm, the first thing we do is provide the person who reported it with resources they can use to reach out and help the person at risk. That happens automatically. The next thing we do is prioritise the review of this content. We also have a network of safety organisations around the world, and we might reach out to them to help keep the person safe.
We have had recent instances of terror groups using social networks to reach out to people and propagate their message. Is there a way to preempt this?
We have taken a very strong stance against terrorism. If you look at our community standards, we say clearly that we do not allow violent organisations or members of those organisations to use Facebook. If we find that they are using Facebook for any reason, even to talk about something personal, we will remove them from the site and ensure that they do not come back. Also, we do not allow content that praises or supports terror organisations or their acts. We will absolutely remove that content.