Facebook today updated its “community standards” guidelines, giving users more clarity on acceptable posts relating to nudity, violence, hate speech and other contentious topics.
The world’s biggest social network said it does not allow a presence from groups advocating “terrorist activity, organized criminal activity or promoting hate.”
The new guidelines say Facebook will take down “graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.”
Nudity is also removed in many cases, though it is allowed in images of breastfeeding, art or medical conditions.
“These standards are designed to create an environment where people feel motivated and empowered to treat each other with empathy and respect,” said a blog post from Facebook global policy chief Monika Bickert and deputy general counsel Chris Sonderby.
“While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.”
The new guidelines say Facebook members should use their “authentic name,” wording that appears to address criticism from people who used stage or performance names instead of their legal names.
In October Facebook said it would ease its “real names” policy that prompted drag queen performers to quit the social network and sparked wider protests in the gay community and beyond.
Facebook’s new guidelines said it would remove content, disable accounts and work with law enforcement “when we believe that there is a genuine risk of physical harm or direct threats to public safety.”
But it also pointed out “that something that may be disagreeable or disturbing to you may not violate our community standards.”
The move comes as Facebook and other social media companies struggle to define acceptable content while protecting freedom of expression.
“It’s a challenge to maintain one set of standards that meets the needs of a diverse global community,” the blog post said.
“This is particularly challenging for issues such as hate speech. Hate speech has always been banned on Facebook, and in our new community standards, we explain our efforts to keep our community free from this kind of abusive language.”
Facebook said earlier this year it was putting warnings on “graphic content,” which would also be blocked for users under 18. In 2013, Facebook reinstated its ban on a beheading video after outrage over its decision to lift the ban.