
Big Tech Is in Crisis in Africa

A landmark case against Meta in Kenya could reshape the way tech giants operate on the continent.

Court Cases Across Africa Could Shake Up Big Tech

The legal troubles of Meta, Facebook’s parent corporation, in Africa look set to continue after a Kenyan court ruled that Meta can be sued for unfair dismissal and blocked the sacking of 184 African tech workers hired as Facebook moderators.

The lawsuit alleges that Meta and its third-party contractor Sama fired the workers illegally in January, failing to issue them the redundancy notices required by Kenyan law. The case has potentially global implications for tech workers, including dismissed employees of Twitter's only African office who are pursuing legal action of their own.

Sama, a U.S.-based outsourcing company, exited the content moderation business in January following a lawsuit alleging worker exploitation and union-busting.

Meta argued that it does not employ Sama’s sacked staff. A judge in Nairobi disagreed. On June 2, the court said the moderators did Meta’s work, used its technology and platform, adhered to its metrics, and therefore Sama was “merely an agent … or manager.”

“There is nothing in the arrangements to absolve the first and second respondents as the primary and principal employers of the content moderators,” the 142-page ruling read.

“We fundamentally disagree with this interim ruling and we have appealed it,” a Meta spokesperson told Foreign Policy by email.

It is the latest in a series of significant blows to Facebook's parent company, which is facing three court cases in Kenya overall—the first such worker-led cases to be filed outside the United States. In each, Meta has argued that it cannot be sued in Kenya because it is not registered there.

In February, Nairobi’s labor court ruled that it had authority to hear a case filed by Daniel Motaung, a former Sama employee from South Africa working on Facebook content moderation in Kenya. Motaung alleges that content moderators in Kenya were subjected to undignified working conditions and were not provided with mental health care after being exposed on the job to graphically violent content. He alleges that he was unlawfully dismissed after attempting to unionize for better pay and working conditions.

In an emailed statement, Sama representatives said that “Sama disputes the claims made in this case. Allegations of union busting are not correct. No union was formed,” though it did acknowledge that “a collection of workers came together to threaten to strike.” Sama said it had mandated that content moderators take an hour and a half of wellness and meal breaks per day and provided onsite counseling sessions, available 24/7.

In April, the Kenyan High Court ruled that a $1.6 billion lawsuit against Meta can go ahead, brought by Kenya's Katiba Institute and two Ethiopian individuals suing the tech giant for failing to adequately moderate online hate speech in Africa.

"Content moderation everywhere is a difficult, stressful, and potentially psycho-toxic job," said Cori Crider, a director at Foxglove, a London-based nonprofit legal firm that is supporting former African moderators with their cases.

Armed conflict across parts of Africa has fueled particularly horrific content, such as during Ethiopia’s two-year civil war. “At that time, for 117 million people in Ethiopia with 87 languages, there were about 25 content moderators speaking three of those languages,” Crider said.

The petitioners allege that Facebook's algorithm amplified genocidal posts in Ethiopia, contributing to the murder of a university professor during the Tigray War, which claimed an estimated half a million lives. One person described watching a video of a person being burned alive in a cage. Former workers allege that they had to watch and evaluate a new video every 55 seconds. Meta has said it does not put a time limit on reviewing content.

Meta’s independent Oversight Board in late 2021 recommended a review of how Facebook and Instagram have been used to spread content that heightens the risk of violence in Ethiopia. Tech giants are accused of not hiring enough workers proficient in African languages and outsourcing work to firms located in countries with lax labor laws.

Moderators hired from across Africa, including Morocco, Kenya, Ghana, Ethiopia, Uganda, Somalia, and South Africa, say they review the most toxic content and earn as little as $1.50 an hour.

Meta told Foreign Policy that the “companies we work with are also contractually obliged to pay their employees who review content on Facebook and Instagram above the industry standard in the markets they operate. We respect the right of our vendor’s employees to organize and do not oppose or inhibit their right to unionize.”

According to documents leaked by Frances Haugen, a former Facebook staffer, 87 percent of Facebook’s global budget for classifying misinformation goes toward the United States, while 13 percent is set aside for the rest of the world. North American users make up just 10 percent of Facebook’s daily users.

Africa’s ready supply of highly educated but unemployed young people leaves it ripe for unfair pay. “You see these promising young kids, graduates … go into this job because it’s what’s available and emerge with their life totally derailed because they can’t sleep, they struggle to relate to people,” Crider said.

Sama’s Facebook contract has shifted to Majorel, which is headquartered in Luxembourg. Majorel is partly owned by an investment firm founded by Moulay Hafid Elalamy, a former Moroccan minister of industry and trade. Former moderators who worked for Majorel in Morocco reviewing TikTok’s African content said they experienced severe “psychological distress” while being paid $2 an hour.

In response, a spokesperson for TikTok told Foreign Policy: “As we strive to promote a caring work environment for our employees and contractors, we continue to develop ways to help moderators feel supported mentally and emotionally.”

More than 150 workers whose moderation work is vital to the AI systems of Facebook, TikTok, and ChatGPT gathered in Nairobi last month and established what they described as the first African Content Moderators Union. The union wants tech giants to increase salaries, provide access to psychiatrists, and end exploitative labor practices.

What is significant about the June 2 ruling is that it instructs Kenya's government agencies to evaluate existing laws, investigate conditions in the sector, and propose reforms to better protect tech workers. If upheld, it could reshape how tech giants operate across the continent.

Source: Foreign Policy