'Too slow' in preventing hate speech in Myanmar: Facebook

Sakshi Chaturvedi
Published on: 16 Aug 2018 4:58 AM GMT

San Francisco: The ethnic violence in Myanmar is horrific and we have been "too slow" to prevent the spread of misinformation and hate speech on our platform, Facebook acknowledged on Thursday.

The admission came after a Reuters investigation on Wednesday revealed that Facebook has struggled to address hate posts about the minority Rohingya. The social media giant said the rate at which bad content is reported in Burmese, whether it is hate speech or misinformation, is low.

"This is due to challenges with our reporting tools, technical issues with font display and a lack of familiarity with our policies. We're investing heavily in Artificial Intelligence that can proactively flag posts that break our rules," Sara Su, Product Manager at Facebook, said in a statement.


According to Facebook, in the second quarter of 2018, it proactively identified about 52 per cent of the content it removed for hate speech in Myanmar.

"This is up from 13 per cent in the last quarter of 2017, and is the result of the investments we've made both in detection technology and people, the combination of which help find potentially violating content and accounts and flag them for review," said Facebook.

Facebook said it proactively identified posts as recently as last week that indicated a threat of credible violence in Myanmar.

"We removed the posts and flagged them to civil society groups to ensure that they were aware of potential violence," said the blog post.

In May, a coalition of activists from eight countries, including India and Myanmar, called on Facebook to put in place a transparent and consistent approach to moderation.

The coalition demanded civil rights and political bias audits of Facebook's role in abetting human rights abuses, spreading misinformation and manipulating democratic processes in their respective countries.

Besides India and Myanmar, the other countries that the activists represented were Bangladesh, Sri Lanka, Vietnam, the Philippines, Syria and Ethiopia.

Facebook said that as of June, it had over 60 Myanmar language experts reviewing content and will have at least 100 by the end of this year.

"But it's not enough to add more reviewers because we can't rely on reports alone to catch bad content. Engineers across the company are building AI tools that help us identify abusive posts," said the social media giant.


It is not only Myanmar: activists in Sri Lanka have argued that the lack of local moderators -- specifically moderators fluent in the Sinhalese language spoken by the country's Buddhist majority -- allowed hate speech to run wild on the platform.

Facebook said it is working with a network of independent organisations to identify hate posts.

"We are initially focusing our work on countries where false news has had life or death consequences. These include Sri Lanka, India, Cameroon, and the Central African Republic as well as Myanmar," said the company.

IANS
