Facebook admits it was used to ‘incite offline violence’ in Myanmar
Facebook has said it agrees with a report that found it had failed to prevent its platform being used to “incite offline violence” in Myanmar. The independent report, commissioned by Facebook, said the platform had created an “enabling environment” for the proliferation of human rights abuse. It comes after widespread violence against the Rohingya minority which the UN has said may amount to genocide. The report said Facebook would have to “get it right” before the 2020 elections.
Facebook has more than 18 million users in Myanmar. For many, the social media site is their main or only way of getting and sharing news. The network said it had made progress in tackling its problems in Myanmar but that there was “more to do”. Last year, the Myanmar military launched a violent crackdown in Rakhine state after Rohingya militants carried out deadly attacks on police posts.
Thousands of people died and more than 700,000 Rohingya fled to neighbouring Bangladesh. There are also widespread allegations of human rights abuses, including arbitrary killing, rape and burning of land.
The Rohingya are seen as illegal migrants in Myanmar (also called Burma) and have been discriminated against by the government and public for decades.
The new report was commissioned after the UN accused Facebook of being “slow and ineffective” in its response to the spread of hatred online. The 62-page independent report from non-profit organisation Business for Social Responsibility (BSR) found that the platform “has become a means for those seeking to spread hate and cause harm” in Myanmar.
“A minority of users are seeking to exploit Facebook as a platform to undermine democracy and incite offline violence,” it said. The report recommended that Facebook more strictly enforce its existing policies on hate speech, introduce a “stand-alone human rights policy” and engage more closely with authorities in Myanmar.
The report – which only briefly referenced the Rohingya specifically – also warned that the 2020 elections presented a serious risk of further human rights abuses, and urged Facebook to prepare now for “multiple eventualities”.
‘Facebook is the internet’
For many in Myanmar, Facebook is the internet.
After five decades of stale state propaganda, along came a feast of colourful, interactive news. But as UN human rights experts found, ultra-nationalist Buddhists seized on Facebook as a powerful means of inciting violence against Muslims. One frightening example came back in 2014, when a fake online story about a Muslim man who had apparently raped a Buddhist woman sparked deadly clashes in the second city of Mandalay. Facebook has since admitted it did not do enough to stem the torrent of racist posts over the years that followed.
In August this year, the same UN experts concluded that the inflammatory material Burmese people had been exposed to day in, day out had played a role in enabling the military’s purge of Rohingya Muslims from Rakhine state – an attack which the UN believes was genocide. In response, Facebook said it was looking into setting up a human rights policy and was making it easier to report and remove violent or inciting content.
Facebook’s product policy manager, Alex Warofka, said the company now employs Burmese language specialists to review potentially sensitive content. Much of Myanmar communicates online using the Zawgyi font, an encoding that does not comply with the Unicode standard, which makes harmful content harder to detect and review.
Facebook has removed Zawgyi as a language option for new users and is supporting Myanmar’s transition to Unicode – the international text encoding standard. Facebook has already banned several Myanmar military and government figures it said had “inflamed ethnic and religious tensions”.