Facebook allows incitement to genocide in Ethiopia
Facebook has come under renewed scrutiny, accused of continuing to allow activists to incite genocide amid the escalating war in Ethiopia.
An analysis by the Bureau of Investigative Journalism and the Observer found that Facebook still allows users to post content that incites violence through hate speech and misinformation.
This is despite warnings that such content has directly inflamed tensions, drawing accusations of inaction and indifference against the social media giant.
The investigation spoke to relatives who linked Facebook posts to the killing of a family member, and a senior Ethiopian media figure accused the company of standing by and watching the country collapse.
The allegations come as Facebook's content moderation decisions face growing attention, after the platform was previously accused of playing a role in the ethnic persecution of the Rohingya people in Myanmar.
The analysis also comes as Facebook weighs an independent review of its work in Ethiopia, after its Oversight Board called for an investigation into how the platform has been used to spread hate speech.
The investigation also drew on interviews with a number of fact-checkers, civil society organizations and human rights activists in the country. Some described Facebook's support as very weak; others said they believed their requests for help had been ignored and that meetings never took place.
They say these failures have fueled a conflict that has killed thousands and displaced millions since fighting broke out between government forces and armed opposition groups in Tigray in November 2020. Both sides are accused of atrocities.
Facebook accused of failing to act on horrific content
Meta denied the allegations, saying it had invested in safety measures to combat hateful and inflammatory language and had taken proactive steps to curb the spread of misinformation in Ethiopia.
Four months later, the Bureau of Investigative Journalism still found posts inciting ethnic cleansing on Facebook, although Meta says it removes any content that violates its policies.
As whistleblower Frances Haugen revealed to the US Congress, Meta has known about these dangers for years.
An internal report on harmful content on the platform, from January 2019, described the situation in Ethiopia as grave. About a year later, Ethiopia topped Facebook's list of countries in need of action.
More than a year later, the company is said to have repeatedly ignored requests for help from the country's fact-checkers.
Some civil society organizations said they had had no contact with the company for 18 months. In September, the company appointed its first Ethiopian policy executive to work in East Africa.
Meta operates a third-party fact-checking program, which gives partner organizations access to internal tools and pays them to review posts.
However, the company has not yet worked with any organization in Ethiopia to counter misinformation about the conflict in the country.