Facebook has a responsibility to fight abuse on Facebook. This is especially true in countries like Myanmar where many people are using the internet for the first time and social media can be used to spread hate and fuel tension on the ground.
The ethnic violence in Myanmar is horrific, and we have been too slow to prevent misinformation and hate on Facebook. That's why, earlier this year, we created a dedicated team across product, engineering and policy to work on issues specific to Myanmar. Today we're sharing details on the investments we have made and the results they have started to yield.
Better Tools and Technology
The rate at which bad content is reported in Burmese, whether it’s hate speech or misinformation, is low. This is due to challenges with our reporting tools, technical issues with font display and a lack of familiarity with our policies. So we’re investing heavily in artificial intelligence that can proactively flag posts that break our rules.
In the second quarter of 2018, we proactively identified about 52% of the content we removed for hate speech in Myanmar. This is up from 13% in the last quarter of 2017, and is the result of the investments we've made in both detection technology and people, a combination that helps find potentially violating content and accounts and flag them for review. As recently as last week, we proactively identified posts that indicated a threat of credible violence in Myanmar. We removed the posts and flagged them to civil society groups to ensure that they were aware of the potential for violence.
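The proactive figures above can be read as a simple ratio: of everything removed for hate speech, what share did detection systems flag before any user report? A minimal sketch follows; the function name and counts are illustrative assumptions, not Facebook's internal metric code.

```python
def proactive_rate(flagged_by_systems: int, total_removed: int) -> float:
    """Share of removed content that was found before any user report."""
    if total_removed == 0:
        return 0.0  # nothing removed, so no rate to report
    return flagged_by_systems / total_removed

# Illustrative counts matching the percentages reported above:
print(round(proactive_rate(52, 100), 2))  # 0.52, i.e. 52%
print(round(proactive_rate(13, 100), 2))  # 0.13, i.e. 13%
```

The denominator matters: the metric measures the share of *removed* content found proactively, not the share of all content on the platform.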
We’re also working to make it easier for people to report content in the first place. One of the biggest problems we face is the way text is displayed in Myanmar. Unicode is the global industry standard for encoding and displaying text, including Burmese and the other languages of Myanmar. However, over 90% of phones in Myanmar use Zawgyi, a non-standard encoding that can only display Burmese. This means that someone with a Zawgyi phone can’t properly read websites, posts or Facebook Help Center instructions written in Unicode. Myanmar is switching to Unicode, and we’re helping by removing Zawgyi as an option for new Facebook users and improving font converters for existing ones. This won’t affect people’s posts, but it will standardize how they see buttons, Help Center instructions and reporting tools in the Facebook app.
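To make the Zawgyi problem concrete, here is a toy detection heuristic sketched in Python. This is not Facebook's converter; the function name and the codepoint checks are illustrative assumptions. It relies on one real distinguishing signal: standard Unicode stores the Burmese vowel sign E (U+1031) after its consonant, while Zawgyi stores it first, in visual order. Production systems (for example, Google's open-source myanmar-tools library) use trained models instead.

```python
def looks_zawgyi(text: str) -> bool:
    """Guess whether Burmese text is Zawgyi-encoded rather than Unicode.

    Toy heuristic, not a production detector:
    - In standard Unicode, the vowel sign E (U+1031) is stored *after*
      its consonant; Zawgyi stores it first, in visual order, so U+1031
      at the start of a word is a strong Zawgyi signal.
    - Zawgyi also repurposes codepoints that Unicode assigns to other
      Myanmar-script languages (an illustrative range is checked here).
    """
    prev = " "  # treat start-of-string as a word boundary
    for ch in text:
        cp = ord(ch)
        # U+1060..U+1097: assigned to Mon, Karen and Shan in Unicode,
        # but repurposed by Zawgyi for Burmese stacked-consonant glyphs.
        if 0x1060 <= cp <= 0x1097:
            return True
        if cp == 0x1031 and prev.isspace():
            return True
        prev = ch
    return False
```

A font converter would then map detected Zawgyi codepoints to their Unicode equivalents; detection is the step that lets reporting tools and Help Center text render correctly on both kinds of phones.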
Our teams are always looking for ways to make reporting easier and more intuitive. In addition to improving our Facebook reporting tools, we’ve introduced new tools on the Messenger mobile app for people to report conversations that might violate our Community Standards.
When we do receive reports of hate speech, they’re sent to our content review team, which has had Myanmar language reviewers for several years. As of this June, we had over 60 Myanmar language experts reviewing content and we will have at least 100 by the end of this year. But it’s not enough to add more reviewers because we can’t rely on reports alone to catch bad content. Engineers across the company are building artificial intelligence tools that help us identify abusive posts and experts from our policy and partnerships teams are working with civil society and building digital literacy programs for people in Myanmar.
Evolving and Enforcing our Policies
It has also become clear that in Myanmar, false news can be used to incite violence, especially when coupled with ethnic and religious tensions. We have updated our credible violence policies to account for this, removing misinformation that has the potential to contribute to imminent violence or physical harm.
We’re working with a network of independent organizations to identify these posts – and we’ve already taken down content in Myanmar that they’ve flagged. This new policy will be global, but we are initially focusing our work on countries where false news has had life-or-death consequences. These include Sri Lanka, India, Cameroon and the Central African Republic, as well as Myanmar.
While we’re adapting our approach to false news given the changing circumstances, our rules on hate speech have stayed the same: it’s not allowed. And we are getting much more proactive in designating Myanmar hate figures and organizations on Facebook, including Wirathu, Thuseitta, Parmaukkha, Ma Ba Tha and the Buddha Dhamma Prahita Foundation. These individuals and groups are now banned from Facebook — they aren’t allowed to have a presence on Facebook, and no one else is allowed to support, praise or represent them.
Partnerships and Programs on the Ground
We continue to learn from civil society, which has a strong grasp of these issues and helps us understand how our policies play out on the ground. With their help, we have held education campaigns in Myanmar for three years, including a locally translated and illustrated version of our Community Standards. More recently we introduced locally designed tips on how to spot false news, and we’re working to strengthen individual account security in a country where it is typically weak.
As part of our membership in the Global Network Initiative, we routinely conduct impact assessments of product and policy decisions across our apps. Local organizations have asked that we conduct a human rights impact assessment in Myanmar. We have hired Business for Social Responsibility, a non-profit that has expertise in human rights, to do this work and we’ll share the results once we have them.
This is some of the most important work being done at Facebook. And we know we can’t do it alone — we need help from civil society, other technology companies, journalists, schools, government and, most important of all, members of our community.
The weight of this work, and its impact on the people of Myanmar, is felt across the company.