Facebook Inc.’s lack of a serious response to signs of abuse on its platform in Sri Lanka may have helped stoke deadly violence in the country in 2018, according to an investigation of the social network’s operations there.
The company released a summary of the findings Tuesday, along with other independent assessments of the service’s impact on human rights in Indonesia and Cambodia.
“We deplore this misuse of our platform,” the company said in a response to the Sri Lanka report. “We recognize, and apologize for, the very real human rights impacts that resulted.” Facebook also highlighted actions it has taken to address the problems, including hiring content moderators with local language skills, implementing technology that automatically detects signs of hate speech and keeps abusive content from spreading, and trying to deepen relationships with local civil society groups.
The report on Sri Lanka details Facebook’s failure to respond to almost a decade of warnings about misuse of its platform from groups within the country. In 2018, a viral video that falsely purported to show a Muslim restaurateur admitting to mixing “sterilization pills” into the food of Sinhala-Buddhist men may have contributed to unrest and physical harm.
Facebook’s poor track record on human rights in international markets has been a black mark on the company for years. As it expanded rapidly, it staffed local operations in far-off countries with skeleton crews or not at all, leaving it unresponsive to forms of manipulation specific to those markets, according to the reports. Facebook’s decision to design algorithms that encourage more engagement also made it vulnerable to disinformation and incitement to violence. Human rights advocates have pushed the company to release assessments like the ones it shared Tuesday.
This is not the first Facebook apology. A 2018 assessment of its operations in Myanmar found the company partly to blame for violence in that country. One response was to hire activist Miranda Sissons last year as Facebook’s first director of human rights. According to Sissons, Facebook has begun to conduct more country-by-country reviews of its human rights performance and will begin releasing reports regularly. She didn’t lay out a timeline, and declined to say which countries the company is studying.
The recently released reports, which cover a similar time period as the Myanmar assessment, describe Facebook’s impact as complicated. Having access to the social network often increased freedom of speech and gave marginalized communities a new way to communicate. But governments also used Facebook to identify dissidents and spread misinformation. Groups looking to stoke communal violence found the social network to be fertile ground for recruitment and incitement.
Article One Advisors LLC, the consultant that conducted the assessments of Sri Lanka and Indonesia, found significant improvement at Facebook. “There has been a very big cultural shift — a very welcome cultural shift,” said Chloe Poynton, a co-founder and principal at the firm. Facebook has implemented many of the group’s suggestions related to content moderation and slowing the spread of abusive messaging. It is still considering others, such as Article One’s call for the company to make a member of its board of directors specifically responsible for human rights, according to Sissons.
The company’s products may still pose challenges — particularly end-to-end encryption on WhatsApp. Viral misinformation on the messaging service has already fueled violence and deaths in India, and while Facebook has started to fight this by limiting some message forwarding, the company’s inability to read encrypted messages makes it difficult to spot potentially dangerous activity.
The assessment of Cambodia was conducted by Business for Social Responsibility, a nonprofit consulting firm. It found no significant failures, but highlighted challenges Facebook faced engaging with a repressive political system. It urged Facebook to push the country’s government to pass more humane regulations for social media, and to call out issues such as surveillance.
Sissons said Facebook was grappling with its responsibilities in such situations. “These are among the most difficult questions companies face,” she said. “We are prepared to engage with this question, but we don’t have answers to signal yet.”
Source: Bloomberg