For the first time, Facebook has revealed how prevalent bullying and harassment are on its platform. According to the company, in the third quarter, content related to bullying and harassment was seen about 14 to 15 times for every 10,000 views. Facebook's parent company, Meta, also shared the prevalence of such content on its photo-sharing platform Instagram in its quarterly content moderation report: on Instagram, bullying and harassment content was seen between 5 and 6 times per 10,000 views.
The company, which recently changed its name from Facebook to Meta, has long been under scrutiny for how it handles abuse on its services. That scrutiny intensified after former employee turned whistleblower Frances Haugen shared internal company documents earlier this year. The documents covered research and discussions on the effect of Instagram on the mental health of teenagers, as well as on whether Facebook and its sister platforms fuelled division among people.
Haugen said the documents showed that Facebook, now Meta, valued profits over public safety — a characterisation the Mark Zuckerberg-led company rejected, saying the documents were being used to paint a false picture.
Though Facebook disputed the documents and the reporting around them, the controversy prompted users to call for greater transparency from the company. Critics also questioned whether metrics such as the prevalence of harmful content give a full picture of how the company deals with the issue.
Meta has said that its bullying and harassment figures count only those instances where the company did not need additional information, such as user reports, to determine that the content broke its rules. Of the 9.2 million pieces of content it removed for violating its bullying and harassment rules, the company said it found 59.4% proactively, before users reported them.