Facebook's investigations into platform deficiencies are welcome, but it must do more to make social media safe
On Wednesday, Facebook apologised to the Sri Lankan government for its role in the 2018 riots in the country, in which three people lost their lives and 18 were injured. The communal tension, it is claimed, started on Facebook: the dominant community turned against a minority community, leading to violence. At the time, the company had only two people to review content posted by over two million local-language users.
The apology is a big step for Facebook, which, until a few years ago, had brazenly shrugged off responsibility, refusing even to acknowledge its role. While the company has been trying to make amends since 2018, when it began commissioning investigations into its role in fomenting hatred in countries where such violence has occurred, it certainly needs to do much more, given that it has become more than just a medium of communication.
More important, Facebook and other social media companies need to start spending more to monitor content. Although Facebook has hired human rights consultancies like Article One to investigate its role in illegal activities, the platform still lags in ramping up its content-moderation workforce.
The role of AI cannot be overstated. Facebook reported last year that it had removed seven million instances of hate speech online using AI, but it could do so only in societies where the AI could comprehend the language. Time reported that Facebook had built no protections for languages such as Assamese, relying instead on moderators under a system that would usually take up only content flagged by users.
As people come to rely more on social media, companies like Facebook and Twitter need to be more proactive in monitoring content. The issue of platform responsibility also needs to involve the state, which, at times, grows complacent about regulating social media companies. Until both sit together and arrive at a common framework to deal with such abuse, Facebook will not be able to achieve much on its own.