In the wake of a UN report highlighting genocide in Myanmar, which put Facebook at the heart of the issue, the social network has started removing accounts linked to the military regime.
Despite Aung San Suu Kyi coming to power as the international darling of democracy, her Myanmar has been the site of unspeakable horrors. A little over 12 months ago, indiscriminate persecution of the Rohingya Muslim population in the west of the country sparked an international crisis as hundreds of thousands of refugees flooded across the border into Bangladesh.
A subsequent UN report into the crisis was damning of Myanmar's military regime, and another international player also came under harsh criticism. The UN cites Facebook as the main facilitator of the spread of the “hate and misinformation” that has fueled religious persecution on an industrial scale.
We’ve highlighted Facebook’s role in the spread of fake news more than once. Shockingly, when Facebook began operations in Myanmar, the internet giant employed only two people who could speak the native language to moderate content shared across the network. With upwards of 53 million people living in Myanmar, most of them getting their news from social media, this was a hugely irresponsible move from a global internet company that rakes in billions of dollars in profit every year.
A lack of native speakers moderating content means anything can be shared and very little of the material that breaks the rules will be taken down. Facebook has since said it has remedied this situation and is actively seeking to grow its staff of native speakers around the world. But in Myanmar, this irresponsibility and the slow response to the developing crisis have come at a huge cost. According to Médecins Sans Frontières, the spread of fake news and racially discriminatory content across Facebook helped facilitate the deaths of at least 6,700 Rohingya, including 730 children under the age of five, in just the first month after the crisis erupted last August.
On Monday, Facebook announced that it would take further action against those pushing false information across its network. In a blog post, Facebook acknowledged that it was too slow to act, but said it was now attempting to help address the massive injustice it helped create. The biggest move comes in the form of removing 18 Facebook accounts, one Instagram account, and 52 Facebook pages linked to the spread of fake news in Myanmar. In total, these pages had a reach of over 12 million people. Furthermore, these accounts were linked to top officials in the Myanmar military regime. Facebook said:
“Specifically, we are banning 20 individuals and organizations from Facebook in Myanmar — including Senior General Min Aung Hlaing, commander-in-chief of the armed forces, and the military’s Myawady television network. International experts, most recently in a report by the UN Human Rights Council-authorized Fact-Finding Mission on Myanmar, have found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country. And we want to prevent them from using our service to further inflame ethnic and religious tensions.”
This action should be applauded, but that applause should be muted. Although it will slow the further spread of fake news, it is merely Facebook getting its own house in order. At home, we learn regularly that Facebook is abusing the trust we’ve given it; abroad, it sits at the heart of state-sponsored attempts at genocide that have seen hundreds of villages burned to the ground, thousands of people killed, and hundreds of thousands of refugees forced to flee their homeland to escape persecution.
Well done to Facebook for taking a stand against powerful people in Myanmar, but one question will hang over the social network’s head from now and for years to come: how did it come to this?