Facebook May Have Acted As A Catalyst In The Rohingya Genocide

The U.N. report called Facebook a “useful instrument for those seeking to spread hate” and said the extent to which the site has led to real-world discrimination “must be independently and thoroughly investigated.”

A detailed analysis by BuzzFeed News of the United Nations’ newly released report on Myanmar showed Facebook played a large part in spreading hate speech against the oppressed Rohingya minority of Rakhine State.

The U.N.’s incriminating report, released on Aug. 27, detailed the shocking findings of its investigation into the Rohingya crisis and officially labeled the systematic killings, rapes and arson carried out by Myanmar’s military, which drove a mass exodus of the Rohingya people into neighboring Bangladesh, “genocide.”

U.N. investigators have called for senior Burmese military officials, including Commander-in-Chief General Min Aung Hlaing, who ordered the Aug. 25 raids in Rakhine State, to be prosecuted by the International Criminal Court.

However, much of the blame also falls on Facebook’s shoulders.

The U.N. report called Facebook a “useful instrument for those seeking to spread hate.”

A review of more than 4,000 posts by politicians from the Arakan National Party (ANP) — the most popular party representing the interests of the dominant “ethnic Rakhine” in Rakhine State, which also used to be home to around 700,000 Rohingya — found that 10 percent of the posts made since the start of 2017 contained hate speech, as defined by Facebook’s own community standards.

The posts were made before, during and for several months after the state-sanctioned violence against the Rohingya.

Some of the dehumanizing posts by Rakhine parliament members compared Rohingya to dogs, said Muslim women were too ugly to rape and falsely claimed Rohingya torched their own homes for payouts. Some even told Muslims to prepare themselves to be murdered. A report in Wired claimed Aung Hlaing’s office posted pictures of mutilated and dismembered babies and children, whom the account alleged had been attacked by Rohingya, in a bid to dismiss Britain’s concerns over atrocities against minorities.

The analysis also noted many of the anti-Rohingya posts were crafted to evade Facebook’s sloppy moderating system. Some fell short of explicitly calling for violence against the minority, instead referring to its members as “kalars” — a slur — who were “breeding” too many children, and likening the Rohingya’s presence to an “invasion.” Others compared the group to animals but used images and memes to spread the message, making the posts hard to search for. A separate Reuters report found over 1,000 posts and comments attacking Muslims and Rohingya on Facebook.

As the Rohingya crisis escalated, Facebook’s response was virtually non-existent. The United Nations said the company was “slow and ineffective” in its response against such hate. It said the “extent to which Facebook posts and messages have led to real-world discrimination must be independently and thoroughly investigated.”

Hours after the report was published, Facebook quickly did what can only be construed as damage control and removed “18 Facebook accounts, one Instagram account and 52 Facebook Pages,” which it said were followed by “almost 12 million people.” The company banned 20 organizations and individuals, including the commander-in-chief of Myanmar and the army’s television network.

However, the actions came a bit too late.

Facebook came to Myanmar in 2015, at a time when social media was already extremely popular in Asia. Locals took to the platform with lightning speed, and the country currently boasts 15 million to 20 million monthly active Facebook users.

Yet, Facebook remains woefully incompetent when it comes to regulating content in a country where hate speech against minorities is overwhelmingly widespread.

In the past, Facebook has relied almost entirely on content reported as inappropriate by civil rights groups, government entities and other users, who flagged content through direct emails or messages. The platform is still dependent on this method. Earlier this year, CEO Mark Zuckerberg insisted Facebook’s own processes were responsible for taking down violent content. However, activists who experimented on the platform said the company took action only after repeated reporting, sometimes as long as 48 hours after content was first flagged.

Facebook said it has 60 people who moderate content in Burmese and will increase that number to 100 by the end of 2018. However, it says its real issue is that Burmese users do not actively report content. It said it is trying to resolve this by investing in a digital literacy campaign and introducing a more user-friendly text format. But critics say the reporting process on Facebook is outdated and difficult to use, which is why people don’t use it.

Facebook also has not disclosed anything about its content moderators in Myanmar, including their qualifications, professional backgrounds or opinions.

The United Nations’ report calling for senior military figures to be indicted shows the widespread role of the state in promoting violence. Facebook was used to spread hate, and it’s unclear whether the company fully realizes that its failure to stop derogatory rhetoric targeting minority communities may have been a catalyst for genocide.

Campaigners in Myanmar claimed they had actively tried to warn Facebook that the targeting of the Rohingya could very quickly spiral out of control, but the social media company seemed reluctant to take any concrete measures against it.

“I can’t think of a genocide where there hasn’t been a media component,” said Alexa Koenig, executive director of the Human Rights Center at the University of California, Berkeley. “There are many social scientists and scholars who have documented the same patterns. As humans, our gut instinct isn’t to perform violence against each other. There has to be a historical trigger. But you have to have tinder in place for a spark to become a flame.”

It is very difficult to determine whether Facebook directly benefits from calls for violence, and the platform, like other social media sites, can be spared responsibility for the content it broadcasts, because that content ultimately belongs to its users. This makes it difficult to prosecute Facebook for any dehumanizing, violent messages that are posted. But that does not mean Facebook should bear no responsibility for the content it disseminates.

Banner/Thumbnail Credits: REUTERS/Soe Zeya Tun
