Facebook recently admitted that it had played a role in spreading hate, propaganda and crime in Myanmar, a Southeast Asian country torn apart by genocide. The social network commissioned Business for Social Responsibility (BSR), a nonprofit group that specializes in advising businesses on how they affect human rights, politics, and climate change. BSR released a 60-page analysis of Facebook's impact on crimes against the Rohingya people, a Muslim minority group that has been forced to flee Myanmar. Now, Facebook is trying to tackle the issue head on by identifying and eliminating hate groups present on the site. Is the move a well-informed decision to reduce the amount of hate speech on the site? Or is it an overreaction to criticisms of the company's practices?
Myanmar's hostility toward the Rohingya people goes back generations. The group's presence in Burma (later renamed Myanmar) dates back to the 1500s, yet the Rohingya still suffer persecution from the Burmese government today. Laws enacted over the last few decades have left the Rohingya stateless and unable to do things like marry or travel without specific approval from the Burmese government. Hundreds of thousands of Rohingya have fled to neighboring Bangladesh, but the country refuses to take any more refugees. With so many stranded in Myanmar, violent forces have acted against the Rohingya, deepening conflict and famine. The United Nations has called the Burmese government's conduct into question and classifies the hate crimes against the Rohingya as textbook ethnic cleansing.
In its 60-page analysis, BSR gave Facebook a comprehensive list of recommendations for addressing the problem. In the wake of the report, Facebook has been working to identify and dismantle politically charged hate speech on the site. Of Myanmar's population of 53 million people, nearly 20 million are on Facebook. The website dominates the market, making it particularly susceptible to misinformation and propaganda. BSR recommended that Facebook improve its internal decision-making and accountability on human rights issues, and that it use artificial intelligence to enforce a stricter policy against violent content. Above all, though, these tasks require hiring teams with a deep understanding of the situation in Myanmar.
Facebook has responded to the presence of hate speech and propaganda on its platform by identifying and banning any party that posts or promotes violence against the Rohingya. In a statement posted to the website's blog earlier this month, Facebook announced its plans to address the issue. The social network has removed hundreds of accounts posting information tied to the Myanmar government, a purge it began late last summer. Still, the company has drawn criticism from those working closely on the issue for not hiring specialists to handle the problem.
There may be measures Facebook can implement to slow the spread of misinformation and crime in Myanmar. However, this deeply rooted political issue isn't something that Facebook can, or should, fix on its own. BSR warned that Myanmar's 2020 election will bring a massive wave of political misinformation and potential hate speech. That election will be a critical opportunity for Facebook to put BSR's recommended strategies into practice.