Facebook apologised on Friday to Myanmar civil society groups who took issue with Mr Mark Zuckerberg’s defence of the platform’s record on curbing hate speech roiling the country.
Facebook has been battered by allegations that its platform has helped fuel communal bloodshed in Myanmar, a mainly Buddhist country accused of waging an ethnic cleansing campaign against Rohingya Muslims.
On Thursday, six Myanmar organisations published an open letter criticising an interview Mr Zuckerberg gave to news site Vox this week. In it he cited examples of both Myanmar Buddhists and Muslims spreading “sensational” messages on Facebook Messenger that warned of imminent violence from the other community.
“That’s the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that’s going on. We stop those messages from going through,” Mr Zuckerberg was quoted as saying.
In their letter, the six local tech and human rights organisations said they were surprised to hear Mr Zuckerberg “praise the effectiveness” of Facebook’s systems in Myanmar.
“It took over four days from when the messages started circulating for the escalation to reach you,” said the groups, who had flagged the content to Facebook.
“Far from being stopped, they spread in an unprecedented way, reaching country-wide and causing widespread fear and at least three violent incidents in the process.”
When reached for a comment on Friday, a Facebook spokesman conceded the company was too slow in responding to reports about the incendiary messages.
“We should have been faster and are working hard to improve our technology and tools to detect and prevent abusive, hateful or false content,” the spokesman said.
“We are sorry that Mark did not make clearer that it was the civil society groups in Myanmar who first reported these messages.”
Facebook has also added more Myanmar-language reviewers and is rolling out the ability to report content in the Messenger service, the spokesman added.
In late January, Facebook removed the page of popular anti-Rohingya monk Wirathu. Last year, it restricted the use of the word “kalar”, which is considered derogatory against Muslims.
In their joint letter, the local groups said Facebook’s response to hate speech and vicious rumours in Myanmar has been inadequate for years, adding that their offers to help craft broader solutions have gone unanswered.
They urged the social media giant to add reporting mechanisms to the Messenger app, increase transparency, engage more with local stakeholders and draw on data and engineering teams to identify repeat offenders.
Facebook dwarfs all other social media platforms in Myanmar, where it has become the chief channel for communication among both the public and government ministries.
But it has come under fire for allegedly helping broadcast ethnic hatred in a fledgling democracy still emerging from decades of repressive junta rule.
Scrutiny has intensified in the wake of a bloody military campaign against the Rohingya people that erupted last August, driving some 700,000 of the minority into Bangladesh.
In March, the United Nations’ special rapporteur on Myanmar, Ms Yanghee Lee, said Facebook had morphed into a “beast” and had incited “a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities”.