Mediation Fails, Facebook Cases in African Countries Heat Up

Jakarta, CNBC Indonesia – Mediation between Meta (Facebook, Instagram, WhatsApp) and a group of Kenyan citizens has stalled. The two parties have failed to reach a mutually acceptable middle ground.

The Kenyans in question are contract workers who serve as Facebook content moderators in the country.


A total of 184 content moderators in Kenya sued Meta and two of the giant's subcontractors earlier this year. The moderators said they were arbitrarily dismissed by Meta's subcontractor, Sama.

Facebook then switched contractors to Majorel. Unfortunately, the moderators say they have been blacklisted and barred from applying for the same positions at Majorel.

Last August, the court ordered Meta and its subcontractors to pursue an amicable settlement with the Kenyan contract moderators.

However, Foxglove, a UK-based tech workers' rights organization, said Meta and Sama have been uncooperative in the mediation process, accusing them of making no serious effort to address the core issues raised by the moderators.

"The respondents [Meta and Sama] are just stalling for time and are not sincere about resolving this problem. We continue to wait for their participation. However, they keep asking for extensions and are reluctant to take responsibility," said the workers' lawyer, Mercy Mutemi, as quoted by Reuters on Tuesday (17/10/2023).

The moderators in Kenya argue that Meta and its subcontractors violated their employment contracts. They also say the working conditions they were given were inadequate to support their work.

Open Letter to Meta from Ethiopian Citizens

Earlier, citizens of Ethiopia, which borders Kenya, also sent an open letter to Meta. The letter was written by a group of activists and tech accountability organizations representing Ethiopian citizens.

They hold Facebook responsible for escalating ethnic conflict and civil war in East Africa.

An Insider report said Meta frequently ignored warnings from local residents about hateful content on Facebook, according to six Ethiopian experts interviewed by the outlet.

In 2017, Facebook was also implicated in the violence against Rohingya Muslims in Myanmar. Around that time, Meta appointed a network of trusted partners to help moderate content.

These partners are contracted by Meta to provide local and linguistic expertise. However, Meta's trusted partners in Ethiopia say Facebook was never truly serious about blocking hateful content.

They said Meta sometimes ignored or delayed responding to harmful content, even though in the internet era the spread of dangerous content can cost someone their life.

In 2021, one of Meta's trusted partners said it had asked the giant to act on hateful content targeting Meareg Amare, a chemistry professor from Tigray.

Meta, however, did not act quickly. Amare was killed outside his house five weeks after the first post spread on Facebook.

Amare's son and two activists who signed the petition ultimately filed a US$1.6 billion lawsuit against Meta in Kenya. Cori Crider, director of the UK non-profit legal firm Foxglove Legal, is supporting the suit.

"Facebook knows that professors and intellectuals from Tigray have been repeatedly threatened and attacked on its platform, but has done nothing," she said.

"If Facebook had acted after being warned, Professor Amare might still be alive today," she continued.

In response, Facebook issued only a boilerplate statement. "Input from local citizens and organizations guides our efforts to maintain safety and integrity in Ethiopia," said a Meta representative.



(fab/fab)